CN114900715A - Video data processing method, terminal and storage medium - Google Patents

Video data processing method, terminal and storage medium

Info

Publication number
CN114900715A
Authority
CN
China
Prior art keywords
frame data
client
data set
video
cloud server
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210454162.7A
Other languages
Chinese (zh)
Inventor
邓至亨
涂承杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Yuanxiang Information Technology Co ltd
Original Assignee
Shenzhen Yuanxiang Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Yuanxiang Information Technology Co ltd filed Critical Shenzhen Yuanxiang Information Technology Co ltd
Priority to CN202210454162.7A priority Critical patent/CN114900715A/en
Publication of CN114900715A publication Critical patent/CN114900715A/en
Pending legal-status Critical Current

Classifications

    • H04N21/234 — Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/2662 — Controlling the complexity of the video stream, e.g. by scaling the resolution or bitrate of the video stream based on the client capabilities
    • H04N21/44 — Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/44004 — Processing of video elementary streams involving video buffer management, e.g. video decoder buffer or video display buffer
    • H04N21/8456 — Structuring of content by decomposing the content in the time domain, e.g. in time segments

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The invention discloses a video data processing method, a terminal and a storage medium, belonging to the technical field of multimedia. The method comprises the following steps: a cloud server sends a first frame data set to a client, wherein the first frame data set is located in a first video file; the cloud server sends a storage instruction to the client, wherein the storage instruction instructs the client to store the first frame data set; and when a second frame data set in a second video file is identical to the first frame data set, the cloud server sends a calling instruction to the client, wherein the calling instruction instructs the client to call the first frame data set from its local storage, so that the client plays the second video file according to the first frame data set. This technical scheme saves the computing and bandwidth resources that would otherwise be spent encoding and transmitting the second frame data set, so that video playback is smoother under the same network conditions.

Description

Video data processing method, terminal and storage medium
Technical Field
The present application relates to the field of multimedia technologies, and in particular, to a method, a terminal, and a storage medium for processing video data.
Background
With the development of the internet, network transmission has become an important distribution path for multimedia. Network transmission modes include downloading and streaming; streaming is favored by users because playback can begin while the download is still in progress. When the network is unstable, video data may fail to arrive in time at some moment, causing repeated pauses in playback and a poor viewing experience. Video buffering mitigates this: while the network environment is good, a segment of video following the current playback point is received and stored locally, and when the network degrades and no new video data can be received, the locally pre-stored segment is played, improving the smoothness of playback under unstable network conditions.
With the development of computer video processing technology, the content and forms of multimedia have become increasingly rich and varied. Real-time interactive video is a new form of video. Unlike traditional video, a real-time interactive video is not a predetermined, complete video; it is generated on the fly in response to user operations and changes continuously as the user interacts. Consequently, its playback smoothness cannot be improved by pre-storing a segment of video beyond the playback point for use when the network environment is poor; it can only be transmitted in real time.
However, real-time transmission of real-time interactive video requires a network with larger bandwidth and higher stability, so such video occupies more network resources and its playback cost is correspondingly higher.
Disclosure of Invention
The main purpose of the embodiments of the present application is to provide a video data processing method, a terminal and a storage medium that reduce the bandwidth and other network resources occupied by transmitting and playing real-time interactive video, thereby optimizing the playback cost of real-time interactive periodic video.
In order to achieve the above object, an embodiment of the present application provides a video data processing method, comprising the following steps: a cloud server sends a first frame data set to a client, wherein the first frame data set is located in a first video file; the cloud server sends a storage instruction to the client, wherein the storage instruction instructs the client to store the first frame data set; and when a second frame data set in a second video file is identical to the first frame data set, the cloud server sends a calling instruction to the client, wherein the calling instruction instructs the client to call the first frame data set from its local storage, so that the client plays the second video file according to the first frame data set.
In order to achieve the above object, an embodiment of the present application further provides a video data processing method, comprising the following steps:
a client receives a first frame data set sent by a cloud server, wherein the first frame data set is located in a first video file; the client receives a storage instruction sent by the cloud server; the client stores the first frame data set locally according to the storage instruction; the client receives a calling instruction sent by the cloud server; the client calls the first frame data set according to the calling instruction; and the client plays a second video file according to the first frame data set.
In order to achieve the above object, an embodiment of the present application further provides a cloud server, where the cloud server includes a memory, a processor, a program stored on the memory and executable on the processor, and a data bus for implementing connection communication between the processor and the memory, and the program implements the steps of the foregoing method when executed by the processor.
In order to achieve the above object, an embodiment of the present application further provides a client, where the client includes a memory, a processor, a program stored in the memory and executable on the processor, and a data bus for implementing connection communication between the processor and the memory, and the program implements the steps of the foregoing method when executed by the processor.
In order to achieve the above object, an embodiment of the present application further provides a video data processing system, where the video data processing system includes the cloud server and the client.
To achieve the above object, an embodiment of the present application further provides a computer-readable storage medium storing one or more programs, the one or more programs being executable by one or more processors to implement the steps of the aforementioned methods.
According to the video data processing method, the terminal and the storage medium described above, a storage instruction is sent to the client to instruct it to store the reusable first frame data set. When a second frame data set identical to the first frame data set appears again, that set does not need to be encoded and transmitted a second time; instead, the client is instructed to call the pre-stored first frame data set and play the second video file according to it. This saves the computing and bandwidth resources of encoding and transmitting the second frame data set, so that video playback is smoother under the same network conditions.
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show some embodiments of the present application; those of ordinary skill in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic diagram of an architecture of a cloud video processing system according to an embodiment of the present disclosure;
fig. 2 is a schematic flowchart illustrating a step of a video data processing method according to an embodiment of the present application;
fig. 3 is a flowchart illustrating another step of a method for processing video data according to an embodiment of the present application;
fig. 4 is a flowchart illustrating another step of a method for processing video data according to an embodiment of the present application;
fig. 5 is a flowchart illustrating another step of a method for processing video data according to an embodiment of the present application;
fig. 6 is a schematic diagram illustrating encoding of a real-time interactive video according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of a cloud server according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a client according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings. The described embodiments are some, but not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without creative effort shall fall within the protection scope of the present application.
The flow diagrams depicted in the figures are merely illustrative and do not necessarily include all of the elements and operations/steps, nor do they necessarily have to be performed in the order depicted. For example, some operations/steps may be decomposed, combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
It is to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
With the development of the internet, network transmission has become an important distribution path for multimedia. Network transmission modes include downloading and streaming; streaming is favored by users because playback can begin while data is still being downloaded. When the network is unstable, video data may fail to arrive in time at some moment, causing repeated pauses in playback and a poor viewing experience. With video buffering, the client does not play immediately upon receiving a small amount of video data; it plays only after a target amount of video data has been stored locally, which reduces the number of pauses and makes playback smoother.
Like conventional video, real-time interactive video also needs buffering to improve playback smoothness when the network is unstable.
However, because a real-time interactive video changes continuously with the user's operations, video data generated at the time of reception may no longer match the data that should be generated at the time of playback; storing it and playing it later can therefore produce incorrect pictures. Real-time interactive video thus cannot improve viewing smoothness through traditional buffering, and those skilled in the art generally transmit it over networks with high bandwidth and high stability, which makes its playback cost high.
To solve the problem that real-time transmission of real-time interactive video requires a network with larger bandwidth and higher stability, occupying more network resources and raising playback cost, the present application provides a video data processing method, a terminal and a storage medium.
Some embodiments of the present application will be described in detail below with reference to the accompanying drawings. The embodiments described below and the features of the embodiments can be combined with each other without conflict.
Referring to fig. 1, fig. 1 is a schematic diagram illustrating an architecture of a cloud video processing system according to an embodiment of the present disclosure.
The cloud video processing system comprises a cloud server 101 and a client 102. The cloud server 101 is configured to acquire a video file, analyze it, encode it according to certain encoding parameters, and transmit the encoded video data.
The client 102 is configured to receive the encoded video data sent by the cloud server 101 and decode it to obtain decoded video images; it is also configured to send response messages to the cloud server 101 and to play the video after processing the video images and the encoded video data.
Based on the cloud video processing system shown in fig. 1, the embodiment of the application provides a video data processing method.
Referring to fig. 2, fig. 2 is a flowchart illustrating a video data processing method according to an embodiment of the present disclosure.
S201, the cloud server sends a first frame data set to the client, and the first frame data set is located in a first video file.
The cloud server analyzes the first video file to be played and obtains from it a first frame data set to be transmitted to the client. The first frame data set may include data of one or more frames. The form of this data differs at each processing stage (for example, raw, encoded or decoded), but the content it represents is always the same one or more frames. In this step, the data of the one or more frames in the first frame data set is encoded data obtained by encoding.
The cloud server sends a first set of frame data representing data of one or more frames in the first video file to the client.
S202, the cloud server sends a storage instruction to the client, and the storage instruction represents storage of the first frame data set.
After the cloud server sends the first frame data set of the first video file to the client, it determines, according to the result of analyzing the first video file, that a storage instruction should be sent instructing the client to store the first frame data set.
And S203, when a second frame data set in a second video file is identical to the first frame data set, the cloud server sends a calling instruction to the client, wherein the calling instruction instructs the client to call the first frame data set from its local storage, so that the client plays the second video file according to the first frame data set.
After the cloud server finishes transmitting the first video file to the client and the file finishes playing on the client, the cloud server determines a second video file to be played. When the cloud server determines that the second video file contains a second frame data set identical to the first frame data set in the first video file, it no longer encodes and transmits the second frame data set; instead, it sends a calling instruction to the client, instructing the client to call the pre-stored first frame data set from local storage, and the client plays the second video file according to the first frame data set.
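As an illustrative aside, the server-side logic of steps S201 to S203 can be sketched in Python. This is a minimal sketch under assumptions: the class names, message tuples and the use of a content hash to detect identical frame data sets are not specified in the patent and are chosen here purely for illustration.

```python
from dataclasses import dataclass, field
import hashlib


@dataclass(frozen=True)
class FrameSet:
    video_id: str
    payload: bytes  # encoded frame data (the "first frame data set")

    def digest(self) -> str:
        # Content hash used to detect that a second frame data set
        # is identical to an already-transmitted first frame data set.
        return hashlib.sha256(self.payload).hexdigest()


@dataclass
class CloudServer:
    # frame data sets the client is known to have stored, by content digest
    stored_on_client: dict = field(default_factory=dict)

    def send(self, fs: FrameSet) -> list:
        """Return the messages the server would send for this frame set."""
        key = fs.digest()
        if key in self.stored_on_client:
            # S203: an identical set is already stored client-side,
            # so only a calling instruction is sent.
            return [("CALL", key)]
        # S201 + S202: transmit the encoded data, then instruct storage.
        self.stored_on_client[key] = fs.video_id
        return [("DATA", fs.payload), ("STORE", key)]
```

Sending the same encoded payload again from a second video file then produces only a calling instruction, with no re-encoding or retransmission of the frame data itself.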
It should be noted that in step S201 the first frame data set sent by the cloud server to the client may include data in various forms: for example, it may include only key frame data, or both key frame data and non-key frame data, as shown in fig. 3 and fig. 4.
Fig. 3 is another flowchart illustrating a method for processing video data according to an embodiment of the present application, where the first frame data set only includes key frame data.
S301, the cloud server encodes the first frame in the first video file into key frame data.
The cloud server encodes a first frame in the first video file into key frame data by analyzing the first video file.
S302, the cloud server sends the key frame data to the client.
The cloud server sends the encoded key frame data of the first frame to the client.
And S303, the cloud server sends a storage instruction to the client, wherein the storage instruction represents storing the key frame data.
And the cloud server sends a storage instruction indicating to store the key frame data to the client.
And S304, when the second frame data set in the second video file is the same as the key frame data, the cloud server sends a calling instruction to the client, so that the client plays the second video file according to the key frame data.
Step S304 is similar to step S203 in the embodiment shown in fig. 2, and details are not repeated here.
Fig. 4 is another flowchart illustrating a method for processing video data according to an embodiment of the present application, where the first frame data set includes both key frame data and non-key frame data.
S401, the cloud server encodes a first frame in the first video file into key frame data, and encodes a third frame in the first video file into non-key frame data.
S402, the cloud server sends the key frame data and the non-key frame data to the client.
And S403, the cloud server sends a storage instruction to the client, wherein the storage instruction represents storage of the key frame data and the non-key frame data.
And S404, when a second frame data set in the second video file is identical to the key frame data and the non-key frame data, the cloud server sends a calling instruction to the client, so that the client plays the second video file according to the key frame data and the non-key frame data.
It should be noted that, the embodiment of the present application also provides a method for processing video data at a client side, please refer to fig. 5.
S501, a client receives a first frame data set sent by a cloud server, and the first frame data set is located in a first video file.
The client receives the first frame data set, obtained from the first video file, sent by the cloud server.
It should be noted that, as described in the embodiment shown in fig. 3 and fig. 4, the first frame data set may only include the key frame data, or may include both the key frame data and the non-key frame data, and details thereof are not repeated here.
S502, the client receives a storage instruction sent by the cloud server.
The client receives a storage instruction sent by the cloud server, wherein the storage instruction indicates that the first frame data set needs to be stored locally at the client.
And S503, the client stores the first frame data set locally according to the storage instruction.
The client stores the first frame data set locally as instructed by the storage instruction.
S504, the client receives a calling instruction sent by the cloud server.
And S505, the client side calls the first frame data set from the local according to the calling instruction.
S506, the client plays the second video file according to the first frame data set.
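The client-side flow of steps S501 to S506 can likewise be sketched as a small state machine. This is an illustrative sketch only: the class, the message handler names and the in-memory dictionary standing in for local storage are assumptions, not part of the patent.

```python
class Client:
    def __init__(self):
        self.local_store = {}   # key -> encoded frame data (S503)
        self.pending = None     # last frame data set received (S501)
        self.played = []        # frame data handed to the decoder/player

    def on_data(self, payload: bytes):
        # S501: receive a frame data set sent by the cloud server.
        self.pending = payload

    def on_store(self, key: str):
        # S502/S503: on a storage instruction, store the most recently
        # received frame data set locally under the server-specified key.
        if self.pending is not None:
            self.local_store[key] = self.pending

    def on_call(self, key: str):
        # S504/S505: on a calling instruction, call the frame data set
        # from local storage instead of receiving it over the network.
        payload = self.local_store[key]
        # S506: play the second video file according to the retrieved set.
        self.played.append(payload)
```

Driving the handlers in order (data, store, then a later call) reproduces the sequence in which the first video file seeds the local store and the second video file reuses it.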
With reference to the above description, an application scenario of the video data processing method in the embodiment of the present application is described below.
The video data processing method provided by the embodiments of the present application can be applied to real-time interactive video whose pictures appear periodically, referred to as periodic video for short. A periodic video contains many repeated picture frames, so buffering this highly reusable frame data reduces the playback cost of the periodic video. For periodic video, the encoding method shown in fig. 6 may be adopted:
fig. 6 is a schematic diagram of periodic video coding in an embodiment of the present application. The periodic video is abstracted as a circle: each video frame corresponds to a point on the circle, and the circle is divided into several equal segments. A frame shared by two adjacent segments is called a boundary frame (the frames corresponding to the points marked with short vertical lines in the figure), and the remaining frames are non-boundary frames. When encoding the video in the cloud, all boundary frames are deliberately encoded as key frames, and all non-boundary frames are encoded as non-key frames.
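The circle abstraction above can be made concrete with a short sketch: a periodic video of `period` frames divided into `segments` equal arcs, where the frames at the arc boundaries are the boundary frames encoded as key frames. The function names and parameters are assumptions for illustration, not from the patent.

```python
def boundary_frames(period: int, segments: int) -> list:
    """Frame indices at the segment boundaries of a periodic video."""
    if period % segments != 0:
        raise ValueError("period must be evenly divisible by segments")
    seg_len = period // segments
    # One boundary frame at the start of each segment; because the video
    # is periodic, frame `period` wraps around to frame 0.
    return [i * seg_len for i in range(segments)]


def frame_type(index: int, period: int, segments: int) -> str:
    """'key' for boundary frames, 'non-key' for all other frames."""
    seg_len = period // segments
    return "key" if index % seg_len == 0 else "non-key"
```

For a 120-frame period split into 4 segments, the boundary frames fall at indices 0, 30, 60 and 90, and every other frame is encoded as a non-key frame.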
The cloud starts encoding the periodic video from a target picture and transmits the video stream to the front end, which starts playing it. If the initial target picture is a non-boundary frame, encoding starts from the nearest preceding boundary frame and proceeds to the target frame; the non-target frames before the target are specially marked, and the front end, upon recognizing the mark, does not display those pictures. During encoding, each boundary frame is encoded as a key frame and marked, so that the client recognizes it and buffers it locally under a key specified by the cloud. When the client receives a boundary frame that needs buffering, it buffers the frame locally and replies to the backend with a buffering acknowledgement.
A periodic video dynamically adjusts its playback speed according to the user's real-time interactive input. Each time playback crosses from one segment to the next, it must pass through the boundary frame shared by the two segments, which the cloud encodes as an I-frame (that is, a key frame). Because playback speed is determined by user input, a boundary frame may not actually need to be displayed when the current playback speed is high. In that case, encoding again starts from the nearest boundary frame and proceeds to the target frame, the preceding non-target frames are specially marked, and the front end does not display them after recognizing the mark.
For the boundary frame shared by two segments, the cloud checks whether it has been buffered. If it has not, the boundary frame is encoded as a key frame and marked during video encoding, so that the client recognizes it and buffers it locally under a key specified by the cloud. If it has already been buffered, the cloud sends only the key corresponding to the boundary frame to the client, and the client fetches the locally buffered content.
To ensure that the cloud accurately judges whether a given boundary frame has been buffered, the cloud maintains a list of the boundary frames buffered by the client. When the client buffers a particular boundary frame, it sends a buffering acknowledgement message to the cloud. Only after receiving this acknowledgement does the cloud consider the client to have buffered the corresponding boundary frame. If an acknowledgement is lost to network packet loss, i.e. the cloud does not receive the client's buffering acknowledgement, the cloud simply sends the boundary frame again when it continues sending the video stream.
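The acknowledgement rule above is deliberately conservative: an unacknowledged boundary frame is always retransmitted, so a lost acknowledgement costs only bandwidth, never correctness. A minimal sketch of this bookkeeping, with illustrative names:

```python
class BoundaryFrameTracker:
    """Cloud-side list of boundary frames the client has confirmed buffering."""

    def __init__(self):
        self.acked = set()  # keys of boundary frames acknowledged by the client

    def on_ack(self, key: str):
        # The client's buffering acknowledgement for this boundary frame arrived.
        self.acked.add(key)

    def must_send_frame(self, key: str) -> bool:
        # Send the full boundary frame unless buffering has been confirmed.
        # A lost acknowledgement therefore causes a retransmission, never a
        # reference to content the client might not actually hold.
        return key not in self.acked
```

Before each boundary frame is streamed, the cloud consults `must_send_frame`; once the acknowledgement arrives, only the buffering key needs to be sent.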
Non-key frames can also be buffered. Regardless of how real-time interaction affects it, the pictures of a periodic video remain periodic. Buffering non-key frames, however, imposes a constraint on video generation in the cloud: playback no longer runs at arbitrary speed but at a set of fixed multiples, such as double, triple and quadruple speed, and the cloud encodes the video at one of these fixed speeds. Under this constraint the same non-key frames recur at high frequency, so they can be buffered locally just as key frames are. After the video has played for one full period, if the client has received no speed-change command, it can play directly from the locally buffered key frames and non-key frames without any further cloud encoding. This significantly saves cloud server overhead and bandwidth cost.
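The fixed-speed constraint and the reuse condition can be captured in a few lines. This is a sketch under assumptions: the allowed speed set, the function names and the "one full period with no speed change" predicate are illustrative readings of the paragraph above, not normative.

```python
ALLOWED_SPEEDS = (1, 2, 3, 4)  # illustrative fixed playback multiples


def validate_speed(speed: int) -> int:
    """Playback runs only at the fixed set of multiples, never arbitrary speed."""
    if speed not in ALLOWED_SPEEDS:
        raise ValueError(f"speed must be one of {ALLOWED_SPEEDS}")
    return speed


def can_reuse_local(frames_played: int, period: int, speed_changed: bool) -> bool:
    """True once a full period has played with no speed-change command,
    at which point the client can serve buffered key and non-key frames
    without further cloud encoding."""
    return frames_played >= period and not speed_changed
```

Only when the speed stays fixed across a whole period do the buffered non-key frames remain byte-identical to what the cloud would encode, which is what makes the reuse safe.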
The essence of the YUV buffering schemes is a trade-off between the client's memory usage and the response speed to user operations on the client. Two YUV buffering schemes are provided:
key frame only YUV buffering: the decoding of the key frame is usually very expensive, and the YUV results of the key frame can be directly buffered for faster response to the user's operation, which can reduce the speed of the large workload of key frame decoding to the real-time interactive response.
Full-picture buffering: building on the scheme that buffers key frames and non-key frames simultaneously, the YUV data corresponding to all frames is cached locally. After one period of playback, the client can control the playback of the periodic video autonomously without decoding the same content multiple times.
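The two YUV schemes differ only in which decoded frames the cache retains, which the following sketch makes explicit. The cache class, the placeholder `b"YUV:"` decode and the decode counter are all illustrative assumptions standing in for a real video decoder.

```python
class YuvCache:
    """Client-side YUV buffer: keyframes_only=True is the first scheme,
    keyframes_only=False caches the full picture (second scheme)."""

    def __init__(self, keyframes_only: bool):
        self.keyframes_only = keyframes_only
        self.cache = {}
        self.decodes = 0  # counts actual (expensive) decoder invocations

    def decode(self, frame_id: str, is_key: bool, encoded: bytes) -> bytes:
        if frame_id in self.cache:
            return self.cache[frame_id]       # cache hit: no decode cost
        self.decodes += 1
        yuv = b"YUV:" + encoded               # placeholder for real decoding
        if is_key or not self.keyframes_only:
            self.cache[frame_id] = yuv        # the scheme decides what to keep
        return yuv
```

Replaying one key frame and one non-key frame twice shows the trade-off: the key-frame-only cache re-decodes the non-key frame, while the full-picture cache decodes everything exactly once at the cost of more memory.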
It should be noted that, as will be clear to those skilled in the art, for convenience and brevity of description, the specific working processes of the apparatus and modules described above may refer to the corresponding processes in the foregoing video data processing method embodiments, and are not described again here.
Referring to fig. 7, an embodiment of the present application provides a cloud server, which includes a processor, a memory and a network interface connected by a system bus, where the memory may include a nonvolatile storage medium and an internal memory.
The non-volatile storage medium may store an operating system and a computer program. The computer program includes program instructions that, when executed, cause a processor to perform any one of the methods of processing video data.
Referring to fig. 8, an embodiment of the present application provides a client, which includes a processor, a memory and a network interface connected by a system bus, where the memory may include a nonvolatile storage medium and an internal memory.
The non-volatile storage medium may store an operating system and a computer program. The computer program includes program instructions that, when executed, cause a processor to perform any one of the methods of processing video data.
The processor provides computing and control capability and supports the operation of the entire computer device.
The internal memory provides a running environment for the computer program stored in the non-volatile storage medium; when executed by the processor, the computer program causes the processor to perform any one of the video data processing methods.
The network interface is used for network communication, such as sending assigned tasks. Those skilled in the art will appreciate that the architecture shown in fig. 8 is merely a block diagram of some of the structures associated with the disclosed aspects and does not limit the computing devices to which the disclosed aspects apply; a particular computing device may include more or fewer components than those shown, combine certain components, or have a different arrangement of components.
It should be understood that the processor may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
Illustratively, in some embodiments, the processor is configured to execute a computer program stored in the memory to perform the steps of:
sending a first frame data set to a client, wherein the first frame data set is located in a first video file;
sending a storage instruction to the client, the storage instruction representing storing the first frame data set;
and when a second frame data set in a second video file is the same as the first frame data set, sending a calling instruction to the client, wherein the calling instruction represents calling the first frame data set from the client, so that the client plays the second video file according to the first frame data set.
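The three server-side steps above can be sketched as a small message-emitting stub. This is an illustrative sketch, not the patent's implementation: the class `CloudServerStub`, the message field names, and the use of a SHA-256 digest to detect that a second frame data set is identical to the first are all assumptions made here for concreteness.

```python
import hashlib
import json

class CloudServerStub:
    """Sketch of the server flow: send a frame data set, instruct the
    client to store it, and send a calling instruction when a later
    video file contains an identical frame data set."""

    def __init__(self, send):
        self.send = send        # callable delivering a message dict to the client
        self._stored = {}       # content digest -> set_id already stored on client

    def publish(self, set_id, frames):
        digest = hashlib.sha256(json.dumps(frames).encode()).hexdigest()
        if digest in self._stored:
            # The second video file reuses an identical frame data set:
            # have the client call its locally stored copy instead of
            # retransmitting the frames.
            self.send({"cmd": "CALL", "set_id": self._stored[digest]})
            return
        # First occurrence: send the frames, then the storage instruction.
        self.send({"cmd": "FRAMES", "set_id": set_id, "frames": frames})
        self.send({"cmd": "STORE", "set_id": set_id})
        self._stored[digest] = set_id
```

Publishing the first video file emits `FRAMES` then `STORE`; publishing a second file with the same frame data set emits only `CALL`, so the repeated content never crosses the network again.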
In some embodiments, the processor is configured to execute a computer program stored in the memory to perform the steps of:
receiving a first frame data set sent by a cloud server, wherein the first frame data set is located in a first video file;
receiving a storage instruction sent by the cloud server;
storing the first frame data set locally according to the storage instruction;
receiving a calling instruction sent by the cloud server;
calling the first frame data set according to the calling instruction;
and playing the second video file according to the first frame data set.
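The client-side counterpart of these steps can be sketched the same way. Again this is an assumption-laden sketch, not the patent's code: `ClientStub`, the `played` list standing in for the actual player, and the message format are invented here to mirror the server sketch's steps (receive, store, call, play).

```python
class ClientStub:
    """Sketch of the client flow: receive a frame data set, store it on a
    storage instruction, and replay it locally on a calling instruction."""

    def __init__(self):
        self._pending = None    # last received (set_id, frames), awaiting STORE
        self._local = {}        # set_id -> frame data set stored locally
        self.played = []        # frames handed to the player, in order

    def on_message(self, msg):
        if msg["cmd"] == "FRAMES":
            # Receive the first frame data set and play the first video file.
            self._pending = (msg["set_id"], msg["frames"])
            self.played.extend(msg["frames"])
        elif msg["cmd"] == "STORE":
            # Storage instruction: keep the set locally for later reuse.
            set_id, frames = self._pending
            self._local[set_id] = frames
        elif msg["cmd"] == "CALL":
            # Calling instruction: play the second video file from the
            # locally stored set, with no retransmission from the cloud.
            self.played.extend(self._local[msg["set_id"]])
```

After the `CALL` message the second video file plays entirely from local storage, which is the bandwidth saving the method claims.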
The embodiments of the present application further provide a computer-readable storage medium storing a computer program. The computer program includes program instructions, and a processor executes the program instructions to implement any video data processing method provided in the embodiments of the present application.
The computer-readable storage medium may be an internal storage unit of the computer device described in the foregoing embodiments, for example, a hard disk or a memory of the computer device. The computer-readable storage medium may also be an external storage device of the computer device, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card provided on the computer device.
While the present application has been described with reference to specific embodiments, its scope is not limited thereto, and those skilled in the art can easily conceive various equivalent modifications or substitutions within the technical scope disclosed herein. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A method for processing video data, the method comprising:
the method comprises the steps that a cloud server sends a first frame data set to a client, wherein the first frame data set is located in a first video file;
the cloud server sends a storage instruction to the client, wherein the storage instruction represents storage of the first frame data set;
when a second frame data set in a second video file is the same as the first frame data set, the cloud server sends a calling instruction to the client, wherein the calling instruction represents calling of the first frame data set from the client, so that the client plays the second video file according to the first frame data set.
2. The method of claim 1, wherein the first set of frame data comprises key frame data, wherein the first video file comprises a first frame, and wherein before the cloud server sends the first set of frame data to the client, the method further comprises:
and the cloud server encodes the first frame into key frame data.
3. The method of claim 2, wherein the first set of frame data comprises non-key frame data, wherein the first video file comprises a third frame, and wherein the method further comprises:
and the cloud server encodes the third frame into non-key frame data, and the non-key frame data and the key frame data have an association relation.
4. A method for processing video data, the method comprising:
a client receives a first frame data set sent by a cloud server, wherein the first frame data set is located in a first video file;
the client receives a storage instruction sent by the cloud server;
the client stores the first frame data set locally according to the storage instruction;
the client receives a calling instruction sent by the cloud server;
the client calls the first frame data set according to the calling instruction;
and the client plays the second video file according to the first frame data set.
5. The method of claim 4, wherein the first set of frame data comprises key frame data.
6. The method of claim 5, wherein the first set of frame data comprises non-key frame data.
7. A cloud server, characterized in that the cloud server comprises a memory, a processor, a program stored on the memory and executable on the processor, and a data bus for enabling connection communication between the processor and the memory, the program, when executed by the processor, implementing the steps of the method for processing video data according to any one of claims 1 to 3.
8. A client, characterized in that the client comprises a memory, a processor, a program stored on the memory and executable on the processor, and a data bus for enabling a connection communication between the processor and the memory, the program, when executed by the processor, implementing the steps of the method for processing video data according to any one of claims 4 to 6.
9. A video data processing system, characterized in that the system comprises the cloud server according to claim 7 and the client according to claim 8.
10. A computer-readable storage medium, characterized in that the storage medium stores one or more programs, the one or more programs being executable by one or more processors to implement the steps of the method for processing video data according to any one of claims 1 to 6.
CN202210454162.7A 2022-04-27 2022-04-27 Video data processing method, terminal and storage medium Pending CN114900715A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210454162.7A CN114900715A (en) 2022-04-27 2022-04-27 Video data processing method, terminal and storage medium

Publications (1)

Publication Number Publication Date
CN114900715A true CN114900715A (en) 2022-08-12

Family

ID=82719367

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210454162.7A Pending CN114900715A (en) 2022-04-27 2022-04-27 Video data processing method, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN114900715A (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108307237A (en) * 2018-01-19 2018-07-20 西安万像电子科技有限公司 The transmission method of display data, device and system
CN110290402A (en) * 2019-07-31 2019-09-27 腾讯科技(深圳)有限公司 A kind of video code rate method of adjustment, device, server and storage medium
CN113786605A (en) * 2021-08-23 2021-12-14 咪咕文化科技有限公司 Video processing method, apparatus and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination