CN111263211B - Method for caching video data and terminal equipment - Google Patents

Method for caching video data and terminal equipment

Info

Publication number
CN111263211B
Authority
CN
China
Prior art keywords
video
target
file
cache file
time period
Prior art date
Legal status
Active
Application number
CN201811465873.4A
Other languages
Chinese (zh)
Other versions
CN111263211A (en)
Inventor
邹鹏
Current Assignee
Shenzhen Weibo Technology Co ltd
Original Assignee
Shenzhen Weibo Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Weibo Technology Co ltd
Priority to CN201811465873.4A
Publication of CN111263211A
Application granted
Publication of CN111263211B


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/433 Content storage operation, e.g. storage operation in response to a pause request, caching operations
    • H04N21/4331 Caching operations, e.g. of an advertisement for later insertion during playback
    • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/47 End-user applications
    • H04N21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47205 End-user interface for manipulating displayed content, e.g. interacting with MPEG-4 objects, editing locally
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83 Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/845 Structuring of content, e.g. decomposing content into time segments
    • H04N21/8455 Structuring of content involving pointers to the content, e.g. pointers to the I-frames of the video stream
    • H04N21/8456 Structuring of content by decomposing the content in the time domain, e.g. in time segments
    • H04N21/85 Assembly of content; Generation of multimedia applications
    • H04N21/854 Content authoring
    • H04N21/8547 Content authoring involving timestamps for synchronizing content

Abstract

The invention is applicable to the field of computer technology, and provides a method for caching video data and a terminal device. The method comprises the following steps: when it is detected that a video editing application loads a target video material to be edited, acquiring attribute information of the target video material; detecting, based on the attribute information, whether a cache file of the target video material needs to be generated; when a cache file of the target video material needs to be generated, acquiring a video time period to be cached selected by the user; and generating a cache file and an index file of the target video data corresponding to the video time period, wherein the index file is used for recording the display time stamps of the video frames contained in the target video data. By generating the cache file and the index file of the video data corresponding to the user-selected video time period to be cached, the scheme realizes segmented caching, allows the video data corresponding to an already cached cache file to be edited, and improves video editing efficiency.

Description

Method for caching video data and terminal equipment
Technical Field
The invention belongs to the field of computer technology, and in particular relates to a method for caching video data and a terminal device.
Background
Video editing software must process video materials of ever-increasing size, as video formats have progressed from Standard Definition (SD) to High Definition (HD), from HD to Full High Definition (FHD), and from FHD to 4K Ultra High Definition, which is currently the mainstream.
Faced with such large materials, video editing software finds it increasingly difficult to render real-time previews during editing. Because the terminal device cannot support real-time editing and previewing of large video materials without stuttering, a proxy cache file is generated by a caching agent during video editing to make real-time preview possible. However, the caching agent does not support segmented caching, so the current video editing method usually needs to cache the entire video file before the video file can be edited, resulting in low video editing efficiency.
Disclosure of Invention
In view of this, embodiments of the present invention provide a method for caching video data and a terminal device, so as to solve the problem in the prior art that, because the caching agent does not support segmented caching, the existing video editing method usually needs to cache the entire video file before the video file can be edited, resulting in low video editing efficiency.
A first aspect of an embodiment of the present invention provides a method for caching video data, including:
when detecting that a video editing application loads a target video material to be edited, acquiring attribute information of the target video material;
detecting whether a cache file of the target video material needs to be generated or not based on the attribute information;
when a cache file of the target video material needs to be generated, acquiring a video time period to be cached selected by a user;
generating a cache file and an index file of target video data corresponding to the video time period; the index file is used for recording display time stamps of video frames contained in the target video data.
A second aspect of an embodiment of the present invention provides a terminal device, including:
a first acquisition unit, used for acquiring attribute information of a target video material to be edited when it is detected that a video editing application loads the target video material;
the detection unit is used for detecting whether a cache file of the target video material needs to be generated or not based on the attribute information;
the second acquisition unit is used for acquiring the video time period to be cached selected by the user when the cache file of the target video material needs to be generated;
the cache unit is used for generating a cache file and an index file of the target video data corresponding to the video time period; the index file is used for recording display time stamps of video frames contained in the target video data.
A third aspect of an embodiment of the present invention provides a terminal device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the following steps when executing the computer program:
when detecting that a video editing application loads a target video material to be edited, acquiring attribute information of the target video material;
detecting whether a cache file of the target video material needs to be generated or not based on the attribute information;
when a cache file of the target video material needs to be generated, acquiring a video time period to be cached selected by a user;
generating a cache file and an index file of target video data corresponding to the video time period; the index file is used for recording display time stamps of video frames contained in the target video data.
A fourth aspect of embodiments of the present invention provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of:
when detecting that a video editing application loads a target video material to be edited, acquiring attribute information of the target video material;
detecting whether a cache file of the target video material needs to be generated or not based on the attribute information;
when a cache file of the target video material needs to be generated, acquiring a video time period to be cached selected by a user;
generating a cache file and an index file of target video data corresponding to the video time period; the index file is used for recording display time stamps of video frames contained in the target video data.
Compared with the prior art, the embodiment of the invention has the following beneficial effects:
when it is detected that a user has imported a target video material to be edited into the video editing application, and it is confirmed, based on the attribute information of the target video material, that a cache file of the target video material needs to be generated, the video time period to be cached selected by the user is acquired, and a cache file and an index file corresponding to that video time period are generated based on the target video data corresponding to the time period. Segmented caching is thereby realized, so that the terminal device can edit the video data corresponding to an already cached cache file, which improves video editing efficiency.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained based on these drawings without inventive effort.
Fig. 1 is a schematic flow chart illustrating an implementation of a method for caching video data according to an embodiment of the present invention;
fig. 2 is a flowchart illustrating an implementation of a method for caching video data according to another embodiment of the present invention;
fig. 3 is a schematic diagram of a terminal device provided in an embodiment of the present invention;
fig. 4 is a schematic diagram of a terminal device according to an embodiment of the present invention.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
In order to explain the technical means of the present invention, the following description will be given by way of specific examples.
Referring to fig. 1, fig. 1 is a schematic flow chart of an implementation of a method for caching video data according to an embodiment of the present invention. The execution subject of the method is a terminal device in which a video editing application is installed; the terminal device includes, but is not limited to, a tablet computer, a notebook computer, a desktop computer, and other terminals. The method for caching video data shown in the figure may include the following steps:
s101: when detecting that a video editing application loads a target video material to be edited, acquiring attribute information of the target video material.
When a user needs to edit a video material, the user can start an installed video editing application in the terminal device, and load a target video material to be edited through an interactive interface of the video editing application. The video material is a video file. The video editing application is a preset video editing application.
When the terminal device detects that the video editing application loads the target video material to be edited, it acquires the attribute information of the target video material. The attribute information of the target video material, which includes but is not limited to the key frame interval or the definition index, is used to determine whether a cache file of the target video material currently needs to be generated. The key frame interval refers to the number of frames from a first key frame to a second key frame, the first key frame being adjacent to the second key frame. The key frame may be an I-frame, which is an important frame in inter-frame compression coding and is a full-frame compression-coded frame. When decoding, the data of an I-frame alone is sufficient to reconstruct a complete image, and an I-frame can be generated without reference to other pictures.
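As an illustration of the key frame interval described above, the following sketch computes it from per-frame key-frame flags. The list of flags and the helper name are hypothetical: the patent does not prescribe how the attribute information is extracted, so this is only a minimal example of the definition (the number of frames from one key frame to the next adjacent key frame).

```python
from typing import Sequence

def key_frame_interval(is_keyframe: Sequence[bool]) -> int:
    """Return the number of frames from one key frame to the next adjacent key frame.

    `is_keyframe` is a hypothetical per-frame flag list (True for I-frames),
    e.g. collected while demuxing the target video material.
    """
    key_positions = [i for i, flag in enumerate(is_keyframe) if flag]
    if len(key_positions) < 2:
        return 0  # fewer than two key frames: no interval to measure
    # Interval between the first key frame and the adjacent second key frame.
    return key_positions[1] - key_positions[0]

# Example: an I-frame every 12 frames gives a key frame interval of 12.
flags = [i % 12 == 0 for i in range(48)]
print(key_frame_interval(flags))  # 12
```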
S102: and detecting whether a cache file of the target video material needs to be generated or not based on the attribute information.
Further, when the attribute information includes the key frame interval, S102 specifically is: detecting whether a cache file of the target video material needs to be generated or not based on the key frame interval; and when the key frame interval is greater than or equal to a preset interval threshold, judging that a cache file of the target video material needs to be generated.
The preset interval threshold may be 10 frames, but is not limited thereto, and may be specifically set according to the actual situation, so as to be able to quickly locate the key frame, which is not limited herein.
Further, when the attribute information includes the definition index, S102 specifically is: detecting whether a cache file of the target video material needs to be generated based on the definition index; and when the value of the definition index is greater than or equal to a preset definition threshold, judging that a cache file of the target video material needs to be generated.
The preset definition threshold may be 720 progressive scan lines (720P), 1080P, or higher than 1080P, and may be set according to actual needs, which is not limited herein.
The video format of the 720P video material is high definition, and the video format of the 1080P video material is full high definition.
In this embodiment, when the terminal device determines, based on the definition index, that the video format of the target video material is high definition (720P), full high definition (1080P), or ultra high definition (e.g., 4K or 8K), it judges that a cache file needs to be generated, so as to improve the smoothness of the video picture and reduce the occurrence of picture stuttering.
When the detection result is that the cache file of the target video material needs to be generated currently, S103 is executed; and when the detection result is that the cache file of the target video material does not need to be generated at present, ending the flow of caching the video data, and editing the source video data in the target video material based on the editing instruction when the editing instruction is detected.
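A minimal sketch of the S102 decision follows, assuming the attribute information has already been gathered into plain fields. The data structure, field names, and the example threshold values (10 frames, 720P) follow the examples given above but are otherwise illustrative, not a prescribed implementation.

```python
from dataclasses import dataclass
from typing import Optional

INTERVAL_THRESHOLD = 10     # preset interval threshold, in frames (example value)
DEFINITION_THRESHOLD = 720  # preset definition threshold (720P, example value)

@dataclass
class AttributeInfo:
    key_frame_interval: Optional[int] = None  # frames between adjacent key frames
    definition_index: Optional[int] = None    # e.g. 720, 1080, 2160 (4K)

def needs_cache_file(info: AttributeInfo) -> bool:
    """S102: decide whether a cache file of the target video material must be generated."""
    if info.key_frame_interval is not None and info.key_frame_interval >= INTERVAL_THRESHOLD:
        return True
    if info.definition_index is not None and info.definition_index >= DEFINITION_THRESHOLD:
        return True
    return False

print(needs_cache_file(AttributeInfo(key_frame_interval=12)))  # True
print(needs_cache_file(AttributeInfo(definition_index=1080)))  # True
print(needs_cache_file(AttributeInfo(key_frame_interval=5)))   # False
```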
S103: and when the cache file of the target video material needs to be generated, acquiring the video time period to be cached selected by the user.
And when the detection result is that the cache file of the target video material is required to be generated currently, detecting whether a user triggers an instruction for selecting the video time period to be cached through an interactive interface, and acquiring the video time period to be cached selected by the user when the instruction is detected. The instruction may be triggered by a touch operation input through a touch screen, or may be triggered by a "progress bar" in an interactive interface, which is not limited herein.
The video time period to be cached is used to instruct the terminal device to cache the video data of the corresponding time period. For example, if the video time period to be cached is [t1, t2], the terminal device needs to cache the video frames whose Presentation Time Stamp (PTS) falls between t1 and t2.
It will be appreciated that the user may select a number of video time periods to be cached.
S104: generating a cache file and an index file of target video data corresponding to the video time period; the index file is used for recording display time stamps of video frames contained in the target video data.
The terminal equipment acquires target video data corresponding to the video time period, starts an encoder to encode the target video data, and generates a cache file of the target video data corresponding to the video time period so as to cache the video data corresponding to the video time period; establishing an index file corresponding to the target video data, and adding the PTS of each video frame and the position of each video frame in the cache file in the index file; and then, storing the cache file and the index file corresponding to the video time period in an associated manner. The index file is used to record the display time stamp of each buffered video frame contained in the target video data. The display time stamp is used to tell the player when to display the video data for this frame.
It can be understood that the terminal device may also assign a unique index number to the index file, and establish an association relationship between the unique index number and the unique identifier of the cache file, so as to associate the index file and the cache file corresponding to the video time period. The index number of the index file and the identifier of the cache file may be their corresponding video time periods.
The frame rate of the cache file of the target video data is the same as the frame rate corresponding to the target video data. In this embodiment, the method for encoding the target video data by the encoder is the prior art, and is not described herein again.
It can be understood that the value of the definition index of the cache file may be lower than that of the target video data, so as to save the storage space occupied by the cache file.
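To make S104 concrete, the sketch below writes the frames of one selected time period into a cache file and builds the associated index file, which records each frame's display timestamp (PTS) and its position in the cache file. The frame container, file names, and the JSON layout of the index are assumptions made only for illustration; the patent does not fix any particular file format or encoder API.

```python
import json
from dataclasses import dataclass
from typing import Iterable, List, Tuple

@dataclass
class EncodedFrame:
    pts: float      # presentation (display) time stamp, in seconds
    payload: bytes  # frame data already produced by the encoder

def cache_time_period(frames: Iterable[EncodedFrame],
                      period: Tuple[float, float],
                      cache_path: str,
                      index_path: str) -> None:
    """Write frames whose PTS lies in `period` to a cache file and build its index file."""
    t_start, t_end = period
    entries: List[dict] = []
    with open(cache_path, "wb") as cache:
        for frame in frames:
            if not (t_start <= frame.pts <= t_end):
                continue  # frame is outside the user-selected video time period
            offset = cache.tell()
            cache.write(frame.payload)
            # Index entry: display timestamp -> position of this frame in the cache file.
            entries.append({"pts": frame.pts, "offset": offset, "size": len(frame.payload)})
    with open(index_path, "w", encoding="utf-8") as index_file:
        # Identify both files by their video time period so they stay associated.
        json.dump({"period": [t_start, t_end], "frames": entries}, index_file)
```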
Further, the terminal device may buffer the video frame played in the foreground, and S104 specifically includes S1041 to S1044, which are specifically as follows:
s1041: and acquiring the running state of the video editing application.
The running state comprises foreground running or background running.
S1042: and when the running state is foreground running, acquiring the decoded video frame.
A decoded video frame is a video frame that has already been decoded, and the decoded video frames include the video frame played in the foreground. The number of decoded video frames may be one, or two or more. When there are two or more decoded video frames, they may be continuous or discrete, which is not limited herein.
S1043: generating a cache file of the decoded video frame when the display timestamp of the decoded video frame belongs to the video time period.
And when the terminal equipment confirms that the display time stamp of the decoded video frame belongs to the acquired video time period, starting an encoder to encode the decoded video frame, and generating a cache file of the decoded video frame.
It can be understood that when the display timestamp of the decoded video frame does not belong to the video time period to be cached selected by the user, the decoded video frame is determined not to need to be cached, and the terminal device does not generate a cache file of the decoded video frame.
S1044: and when the index file of the video time period is not found, generating an index file corresponding to the cache file of the decoded video frame based on the display time stamp of the decoded video frame.
The terminal device searches the storage area used for storing cache files for the index file corresponding to the video time period, so as to determine whether the index file corresponding to the video time period has been created. When the index file corresponding to the video time period is not found, the terminal device judges that this index file has not yet been created. The terminal device then generates the index file corresponding to the cache file of the decoded video frames, based on the display timestamp of each decoded video frame and the position of each decoded video frame in the cache file of the decoded video frames. Specifically, the terminal device may create the index file corresponding to the cache file of the decoded video frames, and store, in association into that index file, the display timestamp of each decoded video frame and the position of each decoded video frame in the cache file of the decoded video frames.
Further, after S1043, S1045 may be further included: when the index file of the video time period is found, writing the display time stamp of the decoded video frame and the position information of the decoded video frame in the cache file of the decoded video frame into the found index file.
When the index file corresponding to the video time period is found, the terminal device judges that the index file corresponding to the video time period has already been created, and stores, in association into that index file, the display timestamp of each decoded video frame and the position of each decoded video frame in the cache file of the decoded video frames.
For example, suppose that after the user selects a video time period to be cached, while the terminal device is caching the video data of that time period and building the corresponding index file, it detects that the user drags the progress bar forward within the time period to edit the decoded video frame corresponding to the current moment. At this point the terminal device has not yet had time to cache the video data corresponding to the current moment, but it has already created the index file of the video time period; therefore, the terminal device can find the index file of the video time period.
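The foreground path (S1042 to S1045) can be pictured as follows: each decoded frame whose display timestamp falls in the selected time period is appended to that period's cache file, and its timestamp and position are written into the period's index file, which is created on first use if it is not found. The helper and the JSON index layout reuse the illustrative assumptions of the previous sketch and are not part of the patent itself.

```python
import json
import os

def cache_decoded_frame(pts: float, payload: bytes,
                        period: tuple, cache_path: str, index_path: str) -> None:
    """Append one decoded (re-encoded) frame to the segment cache and index it."""
    t_start, t_end = period
    if not (t_start <= pts <= t_end):
        return  # the frame does not belong to the video time period to be cached

    # Append the encoded frame data to the cache file of this time period.
    with open(cache_path, "ab") as cache:
        cache.seek(0, os.SEEK_END)  # make the recorded offset explicit
        offset = cache.tell()
        cache.write(payload)

    # S1045: index file found -> append to it; S1044: not found -> create it first.
    if os.path.exists(index_path):
        with open(index_path, "r", encoding="utf-8") as f:
            index = json.load(f)
    else:
        index = {"period": [t_start, t_end], "frames": []}
    index["frames"].append({"pts": pts, "offset": offset, "size": len(payload)})
    with open(index_path, "w", encoding="utf-8") as f:
        json.dump(index, f)
```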
Further, after S1041, S1046 may be further included: and when the running state is background running, generating a cache file and an index file of the target video data corresponding to the video time period.
When detecting that the running state of the video editing software is background running, the terminal equipment acquires target video data corresponding to a video time period to be cached selected by a user, starts an encoder to encode the target video data, generates a cache file of the target video data corresponding to the video time period, and caches the video data corresponding to the video time period; establishing an index file corresponding to the target video data, and adding the PTS of each video frame and the position of each video frame in the cache file in the index file; and storing the cache file and the index file corresponding to the video time period in an associated manner.
According to the above solution, when it is detected that the user has imported the target video material to be edited into the video editing application, and it is confirmed, based on the attribute information of the target video material, that a cache file of the target video material needs to be generated, the terminal device acquires the video time period to be cached selected by the user and generates the cache file and the index file corresponding to that video time period based on the corresponding target video data. Segmented caching is thereby realized, so that the terminal device can edit the video data corresponding to an already cached cache file, which improves video editing efficiency.
The terminal device can cache the decoded video frames acquired while the video editing application runs in the foreground, and can also acquire and cache the video frames to be cached while the video editing application runs in the background, thereby further improving caching efficiency.
Referring to fig. 2, fig. 2 is a schematic flow chart illustrating an implementation of a method for caching video data according to another embodiment of the present invention, where the difference between the present embodiment and the previous embodiment is that the present embodiment further includes S205 to S206. In this embodiment, S201 to S204 are the same as S101 to S104 in the previous embodiment, and please refer to the related description of S101 to S104 in the previous embodiment, which is not described herein again. S205-S206 are specifically as follows:
s205: acquiring a video editing instruction; the video editing instructions include a target display timestamp for a video frame to be edited.
The video editing instruction can be input by a user or triggered by a touch gesture input through a touch screen.
S206: and when the target display timestamp is found in the target index file, acquiring a target video frame corresponding to the target display timestamp from a target cache file corresponding to the target index file, and editing the target video frame according to the editing instruction.
Based on the target display timestamp of the video frame to be edited contained in the video editing instruction, the terminal device traverses the index files in the storage area used for storing cache files and searches for an index file that records the target display timestamp of the video frame to be edited. When the target display timestamp is found in any index file, that index file is identified as the target index file, and the target cache file corresponding to the target index file is obtained based on the association between the index file and the cache file. The target video frame corresponding to the target display timestamp is then obtained from the target cache file and edited according to the acquired video editing instruction.
The target cache file is any cache file, and may be the cache file generated in S104 in the previous embodiment, or may be the cache file generated in S1043 or S1046.
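Continuing the same illustrative index layout, the lookup of S205 and S206 can be sketched as follows: the terminal device scans the stored index files for the target display timestamp, treats the matching file as the target index file, resolves its associated target cache file, and reads the target video frame back out of it. The directory layout, file naming, and exact-match comparison are assumptions for illustration only.

```python
import glob
import json
from typing import Optional

def find_target_frame(target_pts: float, index_dir: str) -> Optional[bytes]:
    """Search all index files for `target_pts`; return the cached frame bytes if found."""
    for index_path in glob.glob(f"{index_dir}/*.index.json"):
        with open(index_path, "r", encoding="utf-8") as f:
            index = json.load(f)
        for entry in index["frames"]:
            if entry["pts"] == target_pts:  # exact PTS match assumed for simplicity
                # Target index file found; the associated target cache file is assumed
                # to share the same base name as the index file.
                cache_path = index_path.replace(".index.json", ".cache")
                with open(cache_path, "rb") as cache:
                    cache.seek(entry["offset"])
                    return cache.read(entry["size"])
    return None  # the timestamp is not cached in any segment
```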
According to the above solution, the terminal device can edit the video frames cached in the cache file corresponding to any video time period, without waiting for the video data of the entire video material to be cached. This shortens the waiting time and improves video editing efficiency.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
Referring to fig. 3, fig. 3 is a terminal device according to an embodiment of the present invention, where the terminal device includes units for executing steps in the embodiment corresponding to fig. 1 or fig. 2. Please refer to fig. 1 and fig. 2 for the corresponding embodiments. For convenience of explanation, only the portions related to the present embodiment are shown. Referring to fig. 3, the terminal device 3 includes:
a first obtaining unit 310, configured to, when it is detected that a video editing application loads a target video material to be edited, obtain attribute information of the target video material;
a detecting unit 320, configured to detect whether a cache file of the target video material needs to be generated based on the attribute information;
the second obtaining unit 330 is configured to obtain a video time period to be cached, which is selected by a user, when a cache file of the target video material needs to be generated;
a cache unit 340, configured to generate a cache file and an index file of target video data corresponding to the video time period; the index file is used for recording display time stamps of video frames contained in the target video data.
Further, the attribute information includes a key frame interval, and the detecting unit 320 is specifically configured to: detecting whether a cache file of the target video material needs to be generated or not based on the key frame interval; and when the key frame interval is greater than or equal to a preset interval threshold, judging that a cache file of the target video material needs to be generated.
Further, the attribute information includes a definition index, and the detecting unit 320 is specifically configured to: detecting whether a cache file of the target video material needs to be generated or not based on the definition index; and when the value of the definition index is greater than or equal to a preset definition threshold value, judging that a cache file of the target video material needs to be generated.
Further, the buffer unit 340 includes:
a state acquisition unit configured to acquire an operating state of the video editing application;
a video frame acquiring unit, configured to acquire a decoded video frame when the running state is foreground running;
a first generating unit, configured to generate a cache file of the decoded video frame when a display timestamp of the decoded video frame belongs to the video time period;
and the second generating unit is used for generating an index file corresponding to the cache file of the decoded video frame based on the display timestamp of the decoded video frame when the index file of the video time period is not found.
Further, the terminal device further includes:
a third generating unit, configured to, when an index file of the video time period is found, write the display timestamp of the decoded video frame and the location information of the decoded video frame in the cache file of the decoded video frame into the found index file.
Further, the terminal device further includes:
and the fourth generating unit is used for generating a cache file and an index file of the target video data corresponding to the video time period when the running state acquired by the state acquiring unit is background running.
Further, the terminal device may further include:
the instruction acquisition unit is used for acquiring a video editing instruction; the video editing instruction comprises a target display timestamp of a video frame to be edited;
and the editing unit is used for acquiring a target video frame corresponding to the target display timestamp from a target cache file corresponding to the target index file when the target display timestamp is found in the target index file, and editing the target video frame according to the editing instruction.
Fig. 4 is a schematic diagram of a terminal device according to an embodiment of the present invention. As shown in fig. 4, the terminal device 4 of this embodiment includes: a processor 40, a memory 41 and a computer program 42 stored in the memory 41 and executable on the processor 40, such as a program for caching video data. The processor 40, when executing the computer program 42, implements the steps in the above-described embodiments of the method for caching video data, such as steps S101 to S104 shown in fig. 1. Alternatively, the processor 40, when executing the computer program 42, implements the functions of the units in the device embodiments, such as the functions of the modules 310 to 340 shown in fig. 3.
Illustratively, the computer program 42 may be divided into one or more modules/units, which are stored in the memory 41 and executed by the processor 40 to carry out the invention. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution process of the computer program 42 in the terminal device 4. For example, the computer program 42 may be divided into an acquisition unit and a configuration unit, each unit having the specific functions as described above.
The terminal device 4 may include, but is not limited to, a processor 40 and a memory 41. Those skilled in the art will appreciate that fig. 4 is merely an example of terminal device 4 and does not constitute a limitation of terminal device 4 and may include more or fewer components than shown, or some components may be combined, or different components, e.g., terminal device 4 may also include input and output terminal devices, network access terminal devices, buses, etc.
The Processor 40 may be a Central Processing Unit (CPU), another general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 41 may be an internal storage unit of the terminal device 4, such as a hard disk or a memory of the terminal device 4. The memory 41 may also be an external storage terminal device of the terminal device 4, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are provided on the terminal device 4. Further, the memory 41 may also include both an internal storage unit of the terminal device 4 and an external storage terminal device. The memory 41 is used for storing the computer program and other programs and data required by the terminal device 4. The memory 41 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed terminal device and method may be implemented in other ways. For example, the above-described terminal device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical function division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow of the method according to the embodiments of the present invention may also be implemented by a computer program, which may be stored in a computer-readable storage medium, and when the computer program is executed by a processor, the steps of the method embodiments may be implemented. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, recording medium, usb disk, removable hard disk, magnetic disk, optical disk, computer Memory, Read-Only Memory (ROM), Random Access Memory (RAM), electrical carrier wave signals, telecommunications signals, software distribution medium, and the like. It should be noted that the computer readable medium may contain content that is subject to appropriate increase or decrease as required by legislation and patent practice in jurisdictions, for example, in some jurisdictions, computer readable media does not include electrical carrier signals and telecommunications signals as is required by legislation and patent practice.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present invention, and are intended to be included within the scope of the present invention.

Claims (8)

1. A method for buffering video data, comprising:
when detecting that a video editing application loads a target video material to be edited, acquiring attribute information of the target video material, wherein the attribute information comprises a key frame interval;
detecting whether a cache file of the target video material needs to be generated or not based on the key frame interval; when the key frame interval is greater than or equal to a preset interval threshold value, judging that a cache file of the target video material needs to be generated;
when a cache file of the target video material needs to be generated, acquiring a video time period to be cached selected by a user;
generating a cache file and an index file of target video data corresponding to the video time period; the index file is used for recording display time stamps of video frames contained in the target video data.
2. The method for caching video data according to claim 1, wherein the generating of the cache file and the index file of the target video data corresponding to the video time period comprises:
acquiring the running state of the video editing application;
when the running state is foreground running, acquiring a decoded video frame;
when the display time stamp of the decoded video frame belongs to the video time period, generating a cache file of the decoded video frame;
and when the index file of the video time period is not found, generating an index file corresponding to the cache file of the decoded video frame based on the display time stamp of the decoded video frame.
3. The method for buffering video data according to claim 2, further comprising:
when the index file of the video time period is found, writing the display time stamp of the decoded video frame and the position information of the decoded video frame in the cache file of the decoded video frame into the found index file.
4. The method for caching video data according to claim 2, wherein after obtaining the running state of the video editing application, further comprising:
and when the running state is background running, generating a cache file and an index file of the target video data corresponding to the video time period.
5. The method for caching video data according to any one of claims 1 to 4, wherein after the generating of the cache file and the index file of the target video data corresponding to the video time period, the method further comprises:
acquiring a video editing instruction; the video editing instruction comprises a target display timestamp of a video frame to be edited;
and when the target display timestamp is found in the target index file, acquiring a target video frame corresponding to the target display timestamp from a target cache file corresponding to the target index file, and editing the target video frame according to the editing instruction.
6. A terminal device, comprising:
a first obtaining unit, used for obtaining attribute information of a target video material to be edited when it is detected that a video editing application loads the target video material, wherein the attribute information comprises a key frame interval;
the detection unit is used for detecting whether a cache file of the target video material needs to be generated or not based on the key frame interval; when the key frame interval is greater than or equal to a preset interval threshold value, judging that a cache file of the target video material needs to be generated;
the second acquisition unit is used for acquiring the video time period to be cached selected by the user when the cache file of the target video material needs to be generated;
the cache unit is used for generating a cache file and an index file of the target video data corresponding to the video time period;
the index file is used for recording display time stamps of video frames contained in the target video data.
7. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of claims 1 to 5 when executing the computer program.
8. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 5.
CN201811465873.4A 2018-12-03 2018-12-03 Method for caching video data and terminal equipment Active CN111263211B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811465873.4A CN111263211B (en) 2018-12-03 2018-12-03 Method for caching video data and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811465873.4A CN111263211B (en) 2018-12-03 2018-12-03 Method for caching video data and terminal equipment

Publications (2)

Publication Number Publication Date
CN111263211A CN111263211A (en) 2020-06-09
CN111263211B true CN111263211B (en) 2022-02-08

Family

ID=70951982

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811465873.4A Active CN111263211B (en) 2018-12-03 2018-12-03 Method for caching video data and terminal equipment

Country Status (1)

Country Link
CN (1) CN111263211B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111901696B (en) * 2020-07-31 2022-04-15 杭州当虹科技股份有限公司 Real-time recording and strip-disassembling system based on hls technology by using preloading mode
CN112819924A (en) * 2021-01-27 2021-05-18 武汉悦学帮网络技术有限公司 Picture editing method and device, electronic equipment and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101635848A (en) * 2008-07-22 2010-01-27 北大方正集团有限公司 Method and device for editing video file
CN101946518A (en) * 2007-12-28 2011-01-12 诺基亚公司 Methods, apparatuses, and computer program products for adaptive synchronized decoding of digital video
CN102883217A (en) * 2012-09-26 2013-01-16 华为技术有限公司 Method and device for controlling video playing
CN105451084A (en) * 2015-12-08 2016-03-30 深圳市福云明网络科技有限公司 Method and apparatus for editing remotely and downloading video in camera
CN105519095A (en) * 2014-12-14 2016-04-20 深圳市大疆创新科技有限公司 Video processing processing method, apparatus and playing device
CN106791933A (en) * 2017-01-20 2017-05-31 杭州当虹科技有限公司 The method and system of the online quick editor's video based on web terminal
CN108924592A (en) * 2018-08-06 2018-11-30 青岛海信传媒网络技术有限公司 A kind of method and apparatus of video processing

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8990214B2 (en) * 2001-06-27 2015-03-24 Verizon Patent And Licensing Inc. Method and system for providing distributed editing and storage of digital media over a network
CN103503070B (en) * 2011-04-28 2016-11-16 松下知识产权经营株式会社 Record medium, transcriber, recording equipment, coded method and the coding/decoding method relevant with high image quality
US9564172B2 (en) * 2014-07-14 2017-02-07 NFL Enterprises LLC Video replay systems and methods

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101946518A (en) * 2007-12-28 2011-01-12 诺基亚公司 Methods, apparatuses, and computer program products for adaptive synchronized decoding of digital video
CN101635848A (en) * 2008-07-22 2010-01-27 北大方正集团有限公司 Method and device for editing video file
CN102883217A (en) * 2012-09-26 2013-01-16 华为技术有限公司 Method and device for controlling video playing
CN105519095A (en) * 2014-12-14 2016-04-20 深圳市大疆创新科技有限公司 Video processing processing method, apparatus and playing device
CN105451084A (en) * 2015-12-08 2016-03-30 深圳市福云明网络科技有限公司 Method and apparatus for editing remotely and downloading video in camera
CN106791933A (en) * 2017-01-20 2017-05-31 杭州当虹科技有限公司 The method and system of the online quick editor's video based on web terminal
CN108924592A (en) * 2018-08-06 2018-11-30 青岛海信传媒网络技术有限公司 A kind of method and apparatus of video processing

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Design and Implementation of Android-Based Mobile Video Surveillance Software; 陈明珍 (Chen Mingzhen); China Master's Theses Full-text Database (Information Science and Technology); 2016-02-15; full text *

Also Published As

Publication number Publication date
CN111263211A (en) 2020-06-09


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant