CN111918098A - Video processing method and device, electronic equipment, server and storage medium

Video processing method and device, electronic equipment, server and storage medium

Info

Publication number
CN111918098A
Authority
CN
China
Prior art keywords
video
frame
processed
frame interpolation
interpolation processing
Prior art date
Legal status
Pending
Application number
CN202010976364.9A
Other languages
Chinese (zh)
Inventor
范泽华
陈江川
郑超
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202010976364.9A
Publication of CN111918098A

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/2343Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/234381Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by altering the temporal resolution, e.g. decreasing the frame rate by frame skipping
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/236Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream
    • H04N21/23614Multiplexing of additional data and video streams
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/4402Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440281Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the temporal resolution, e.g. by frame skipping
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/478Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4788Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The application discloses a video processing method and apparatus, an electronic device, a server, and a storage medium. Applied to the electronic device, the video processing method comprises: performing frame interpolation processing on a video to be processed; adding a specified identifier to the interpolated video to obtain a target video carrying the specified identifier, the specified identifier indicating that the target video has been subjected to frame interpolation processing; and uploading the target video to a server, the server being configured to issue the target video to a target device upon receiving, from the target device, a video acquisition request for the target video. The method enables sharing of the interpolated video and marks it as such, so that other devices can play the interpolated video directly according to the identifier.

Description

Video processing method and device, electronic equipment, server and storage medium
Technical Field
The present application relates to the field of video processing technologies, and in particular, to a video processing method and apparatus, an electronic device, a server, and a storage medium.
Background
With the rapid advance of technology and living standards, electronic devices (such as smart phones and tablet computers) have become everyday consumer products. Users commonly install video playing applications on such devices, and some of these applications perform frame interpolation on the played video to improve playback fluency. In the conventional technology, however, the electronic device generally interpolates a video only once, at playback time, and cannot share the interpolated video.
Disclosure of Invention
In view of the foregoing problems, the present application provides a video processing method, an apparatus, an electronic device, a server, and a storage medium.
In a first aspect, an embodiment of the present application provides a video processing method applied to an electronic device. The method includes: performing frame interpolation processing on a video to be processed; adding a specified identifier to the interpolated video to obtain a target video carrying the specified identifier, the specified identifier indicating that the target video has been subjected to frame interpolation processing; and uploading the target video to a server, the server being configured to issue the target video to a target device upon receiving, from the target device, a video acquisition request for the target video.
In a second aspect, an embodiment of the present application provides a video processing method applied to a server. The method includes: acquiring a video to be processed; performing frame interpolation processing on the video, adding a specified identifier to the interpolated video to obtain a video carrying the specified identifier, and storing that video, the specified identifier indicating that the video has been subjected to frame interpolation processing; receiving a video acquisition request, sent by a first device, for acquiring the video; and responding to the video acquisition request by issuing the video carrying the specified identifier to the first device.
In a third aspect, an embodiment of the present application provides a video processing apparatus applied to an electronic device. The apparatus includes a frame interpolation processing module, an identifier adding module, and a video uploading module. The frame interpolation processing module is used for performing frame interpolation processing on a video to be processed; the identifier adding module is used for adding a specified identifier to the interpolated video to obtain a target video carrying the specified identifier, the specified identifier indicating that the target video has been subjected to frame interpolation processing; and the video uploading module is used for uploading the target video to a server, the server being configured to issue the target video to a target device upon receiving, from the target device, a video acquisition request for the target video.
In a fourth aspect, an embodiment of the present application provides a video processing apparatus applied to a server. The apparatus includes a video acquisition module, a video frame interpolation module, a request receiving module, and a video issuing module. The video acquisition module is used for acquiring a video to be processed; the video frame interpolation module is used for performing frame interpolation processing on the video, adding a specified identifier to the interpolated video to obtain a video carrying the specified identifier, and storing that video, the specified identifier indicating that the video has been subjected to frame interpolation processing; the request receiving module is used for receiving a video acquisition request, sent by a first device, for acquiring the video; and the video issuing module is configured to respond to the video acquisition request by issuing the video carrying the specified identifier to the first device.
In a fifth aspect, an embodiment of the present application provides an electronic device, including: a frame insertion chip; one or more processors; a memory; and one or more applications, wherein the one or more applications are stored in the memory and the frame insertion chip and configured to be executed by the processors and the frame insertion chip, the one or more programs being configured to perform the video processing method provided by the first aspect above.
In a sixth aspect, an embodiment of the present application provides a server, including: one or more processors; a memory; and one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors, the one or more programs being configured to perform the video processing method provided by the second aspect above.
In a seventh aspect, an embodiment of the present application provides a computer-readable storage medium storing program code, the program code being callable by a processor to execute the video processing method provided by the first aspect or the second aspect.
According to this scheme, frame interpolation is performed on the video to be processed; a specified identifier is added to the interpolated video to obtain a target video carrying the identifier, the identifier indicating that the target video has been subjected to frame interpolation processing; and the target video is uploaded to the server, which issues it to a target device upon receiving from that device a video acquisition request for the target video. In this way, the video interpolated by the electronic device can be shared, and because the interpolated video carries the specified identifier, other devices that obtain the target video from the server can play it directly according to the identifier.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed for describing the embodiments are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present application, and that those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 shows a schematic diagram of an application scenario provided in an embodiment of the present application.
Fig. 2 shows another schematic diagram of an application scenario provided in an embodiment of the present application.
Fig. 3 shows a flow diagram of a video processing method according to an embodiment of the application.
Fig. 4 shows a flow diagram of a video processing method according to another embodiment of the present application.
Fig. 5 is a block diagram of an electronic device according to an embodiment of the present application, configured to execute a video processing method according to an embodiment of the present application.
Fig. 6 is a schematic diagram illustrating an operation principle of a frame interpolation chip according to an embodiment of the present application.
Fig. 7 shows another schematic diagram of an operation principle of a frame interpolation chip provided in an embodiment of the present application.
Fig. 8 shows a flow diagram of a video processing method according to yet another embodiment of the present application.
Fig. 9 shows a flow diagram of a video processing method according to yet another embodiment of the present application.
Fig. 10 shows a flow diagram of a video processing method according to yet another embodiment of the present application.
Fig. 11 shows a block diagram of a video processing apparatus according to an embodiment of the present application.
Fig. 12 shows a block diagram of a video processing apparatus according to another embodiment of the present application.
Fig. 13 is a block diagram of an electronic device according to an embodiment of the present application for executing a video processing method according to an embodiment of the present application.
Fig. 14 shows a storage unit, provided by an embodiment of the present application, for storing or carrying program code that implements the video processing method of the embodiments of the present application.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application.
With the development of electronic devices, their configurations and functions have become increasingly powerful, and their video playback quality has steadily improved. In a related video playing scheme, the electronic device can perform frame interpolation on the video to improve playback fluency.
However, after long study of video frame interpolation schemes, the inventor found that in a normal playback process the electronic device interpolates the video only after acquiring it for playback and outputs the result to the display screen for display; the interpolated video is neither stored nor shared.
In view of the above problems, the inventor proposes the video processing method and apparatus, electronic device, server, and storage medium provided in the embodiments of the present application, which enable sharing of the video interpolated by the electronic device and add a specified identifier to the interpolated video, so that after other devices acquire the target video from the server they can play the interpolated video directly according to the identifier. The video processing method is described in detail in the following embodiments.
An application environment of the video processing method provided by the embodiment of the present application is described below.
Referring to fig. 1, fig. 1 shows a network structure diagram of an application scenario of the present application. An electronic device 100 is communicatively connected to a first server 210. The electronic device 100 may perform frame interpolation on a video, add a specified identifier to the interpolated video (the identifier indicating that the video has been interpolated), and upload the video to the first server 210. The first server 210 may respond to a video acquisition request from another device by issuing the video to that device; when the other device plays the video and recognizes that it carries the specified identifier, it need not perform frame interpolation again.
In some implementations, referring to fig. 2, the application scenario may further include a second server 220, which can exchange data with the first server 210 and the electronic device 100. In one scenario, the second server 220 serves as a video provider: the electronic device 100 obtains a video from the provider, interpolates it, and then uploads the interpolated video to the first server 210, thereby enabling sharing of the interpolated video.
The electronic device 100 may be a mobile phone, a tablet computer, or the like; the first server 210 and the second server 220 may be a physical or logical server, etc. In the embodiment of the present application, the device types of the first server 210, the second server 220 and the electronic device 100, and the types, protocols, and the like of the communication networks between the electronic device 100 and the first server 210, between the electronic device 100 and the second server 220, and between the first server 210 and the second server 220 are not limited.
The following describes a video processing method according to an embodiment of the present application in detail.
Referring to fig. 3, fig. 3 is a flowchart illustrating a video processing method according to an embodiment of the present application. In a specific embodiment, the video processing method is applied to the video processing apparatus 400 shown in fig. 11 and the electronic device 100 (fig. 5) equipped with the video processing apparatus 400. The following will describe a specific flow of the embodiment by taking an electronic device as an example, and it is understood that the electronic device applied in the embodiment may be a smart phone, a tablet computer, a notebook computer, and the like, which is not limited herein. As will be described in detail with respect to the flow shown in fig. 3, the video processing method may specifically include the following steps:
Step S110: performing frame interpolation processing on a video to be processed.
In the embodiment of the application, the electronic device can perform frame interpolation processing on the video to be processed. The video to be processed may be a video being played, a video being captured, a locally stored video, a video recorded during a video conference, or the like, which is not limited herein.
In some embodiments, the electronic device may perform frame interpolation on the to-be-processed video while a video playing application plays it; the video playing application may be a system application or a third-party application, which is not limited herein. In other embodiments, the electronic device may perform video shooting in response to a video shooting instruction and interpolate the captured video. In still other embodiments, the electronic device may interpolate the video to be processed when the operating state of the processor satisfies a preset state, for example interpolating it through a background process while the processor is idle.
In some embodiments, an electronic device may include a processor and a framing chip. The frame insertion chip can be connected with the processor, and the frame insertion chip can acquire video data from the processor, perform frame insertion processing on the video data and output the video data to the display screen.
The frame insertion chip inserts one or more frames between every two frames of the original picture, shortening the display interval between frames and raising the display frame rate of the electronic device. This alleviates flicker and trailing, eliminates the edge blurring of fast-moving pictures, and corrects the illusion caused by human visual persistence, thereby effectively improving picture stability.
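By way of illustration only, the following Python sketch shows the basic effect of frame interpolation: it roughly doubles a video's frame rate by inserting, between every two consecutive frames, an intermediate frame obtained by simple pixel averaging. This is a stand-in for a real motion-compensated algorithm (which the application does not disclose); the function and variable names are illustrative.

```python
import numpy as np

def interpolate_frames(frames: list[np.ndarray]) -> list[np.ndarray]:
    """Insert one averaged frame between every pair of consecutive frames,
    roughly doubling the frame rate (a naive stand-in for the chip's algorithm)."""
    if len(frames) < 2:
        return list(frames)
    output = []
    for prev, nxt in zip(frames, frames[1:]):
        output.append(prev)
        # Naive blend of the two neighbouring frames as the inserted frame.
        inserted = ((prev.astype(np.uint16) + nxt.astype(np.uint16)) // 2).astype(np.uint8)
        output.append(inserted)
    output.append(frames[-1])
    return output

# Example: 30 dummy 720p RGB frames -> 59 frames after interpolation.
video = [np.zeros((720, 1280, 3), dtype=np.uint8) for _ in range(30)]
print(len(interpolate_frames(video)))  # 59
```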
In the above embodiment, frame interpolation of the video to be processed is executed by the dedicated frame insertion chip, which effectively reduces the load on the processor: the electronic device can interpolate the video without upgrading the processor configuration, while the playback quality of the video picture is improved.
In other embodiments, the electronic device may instead have a Graphics Processing Unit (GPU) perform the frame interpolation. The GPU may be connected to the Central Processing Unit (CPU) of the electronic device; when the video to be processed is to be interpolated, the CPU transmits the obtained video data to the GPU, and the GPU returns the video data to the CPU after completing the interpolation.
Step S120: adding a specified identifier to the video to be processed after the frame interpolation processing to obtain a target video carrying the specified identifier, the specified identifier indicating that the target video has been subjected to frame interpolation processing.
In this embodiment of the application, after the electronic device interpolates the video, a specified identifier may be added to the interpolated video, the identifier indicating that the video has been subjected to frame interpolation processing. The video obtained after adding the specified identifier is taken as the target video.
In some embodiments, the electronic device may add a specific identifier to a file name of the video to be processed after the frame insertion, where the specific identifier may be a keyword such as "inserted frame", so that the server or other devices may determine that the target video has been subjected to frame insertion based on the file name of the target video; in other embodiments, the electronic device may add a sound or image tag to the video to be processed after the frame insertion to achieve the addition of the specific identifier. Of course, the specific manner of adding the specific identifier to the video to be processed after the frame interpolation processing may not be limited.
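As an illustration of the file-name approach mentioned above (the application does not prescribe a concrete tagging format), the sketch below marks an interpolated file by inserting a keyword into its file name and lets a later reader check for that keyword; the keyword "_interpolated" and the helper names are assumptions.

```python
from pathlib import Path

INTERPOLATED_TAG = "_interpolated"  # assumed keyword; the text suggests e.g. "inserted frame"

def add_specified_identifier(video_path: Path) -> Path:
    """Rename the interpolated video so its file name carries the identifier."""
    if INTERPOLATED_TAG in video_path.stem:
        return video_path  # already tagged
    tagged = video_path.with_name(video_path.stem + INTERPOLATED_TAG + video_path.suffix)
    return video_path.rename(tagged)

def carries_specified_identifier(video_path: Path) -> bool:
    """Server or receiving device can decide from the name whether to skip interpolation."""
    return INTERPOLATED_TAG in video_path.stem

# Usage: add_specified_identifier(Path("clip.mp4"))  -> clip_interpolated.mp4
```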
In some embodiments, in the case that the electronic device uses the processor to perform frame interpolation on the video, the to-be-processed video after the frame interpolation may be stored in the memory by the processor, and then a specific identifier may be added to the to-be-processed video after the frame interpolation; when the electronic device inserts the frame into the video by using the frame insertion chip, the video to be processed after the frame insertion can be returned to the processor by the frame insertion chip, and the processor can add a designated identifier to the video to be processed after the frame insertion.
Step S130: uploading the target video to a server, the server being configured to issue the target video to a target device upon receiving, from the target device, a video acquisition request for the target video.
In the embodiment of the application, after adding the specified identifier to the interpolated video to obtain the target video, the electronic device can upload the target video to the server, so that the interpolated video can be shared. The server can respond to a video acquisition request from another device by issuing the target video to that device; because the target video carries the specified identifier, the receiving device can readily recognize that the video has already been interpolated and need not interpolate it again when playing it, which improves playback efficiency.
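A minimal sketch of the upload step, assuming an HTTP endpoint on the server; the URL, form fields, and the use of the `requests` library are assumptions, since the application does not specify the interface.

```python
import requests
from pathlib import Path

UPLOAD_URL = "https://example-video-server/api/videos"  # hypothetical endpoint

def upload_target_video(video_path: Path) -> None:
    """Upload the tagged (interpolated) target video to the server."""
    with video_path.open("rb") as fh:
        response = requests.post(
            UPLOAD_URL,
            files={"video": (video_path.name, fh, "video/mp4")},
            data={"interpolated": "true"},  # redundant with the file-name tag, shown for clarity
            timeout=60,
        )
    response.raise_for_status()

# upload_target_video(Path("clip_interpolated.mp4"))
```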
According to the video processing method provided by this embodiment, frame interpolation is performed on the video to be processed; a specified identifier is added to the interpolated video to obtain a target video carrying the identifier, the identifier indicating that the target video has been subjected to frame interpolation processing; and the target video is uploaded to the server, which issues it to a target device upon receiving from that device a video acquisition request for the target video. In this way, the video interpolated by the electronic device can be shared, and because the interpolated video carries the specified identifier, other devices that obtain the target video from the server can play it directly according to the identifier.
Referring to fig. 4, fig. 4 is a flowchart illustrating a video processing method according to another embodiment of the present application. Referring to fig. 5, the electronic device 100 includes a processor 110 and a frame insertion chip 130, wherein the processor 110 may be an Application Processor (AP) such as a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), or the like. The frame insertion chip inserts one or more frames between every two frames of the original picture, shortening the display interval between frames, raising the display frame rate of the electronic device, alleviating flicker and trailing, eliminating the edge blurring of fast-moving pictures, and correcting the illusion caused by human visual persistence, thereby effectively improving picture stability.
As will be described in detail with respect to the flow shown in fig. 4, the video processing method may specifically include the following steps:
Step S210: transmitting the video to be processed to the frame interpolation chip through the application processor.
In the embodiment of the application, when the video to be processed is to be interpolated, the application processor, after obtaining that video, transmits its video data to the frame interpolation chip so that the chip can perform the frame interpolation processing.
Step S220: performing frame interpolation processing on the video to be processed through the frame interpolation chip.
Step S230: adding a specified identifier to the interpolated video through the frame interpolation chip to obtain a target video carrying the specified identifier, the specified identifier indicating that the target video has been subjected to frame interpolation processing.
Step S240: transmitting the target video to the application processor through the frame interpolation chip.
In the embodiment of the application, after interpolating the video to be processed, the frame interpolation chip can add the specified identifier to the interpolated video to obtain the target video carrying the identifier. In some embodiments, because the storage capacity of the chip's storage unit is limited and cannot hold every frame of the video, the chip adds the specified identifier to each frame as that frame is output to the display screen and then transmits each frame to the application processor; the frames carrying the identifier received by the application processor together form the complete target video.
In some embodiments, referring to fig. 6, the frame interpolation chip includes a frame interpolation processing unit 136, an output control unit 135, a storage control unit 133, and a storage unit 134. The frame interpolation processing unit 136 is configured to interpolate the video and add the specified identifier to the interpolated video to obtain the target video; the output control unit 135 controls the output of data from the frame interpolation chip, for example to a display screen; the storage unit 134 stores data while the chip processes the input video data; and the storage control unit 133 implements storage control, reading data from and writing data to the storage unit 134. The frame interpolation chip 130 also includes other units, such as the input control unit 131 and the compression processing unit 132 shown in fig. 6.
In the frame interpolation chip shown in fig. 6, only the compression processing unit 132 is connected to the storage control unit 133, so the output control unit 135 cannot retrieve the image data of the interpolated video. Therefore, referring to fig. 7, the frame interpolation chip 130 may be modified by adding a path between the output control unit 135 and the storage control unit 133, so that the frame interpolation processing unit 136 can store each frame of the interpolated video in the storage unit 134 and the output control unit 135 can read each frame from the storage unit 134 over the added path. Specifically, each frame image of the target video is transmitted by the frame interpolation processing unit 136 to the storage control unit 133; the storage control unit 133 stores each frame image in the storage unit 134; and the output control unit 135 reads each frame image from the storage unit 134 and transmits it to the processor 110. In this way, the frame interpolation chip can add the specified identifier to the interpolated video and pass it back to the application processor, which can subsequently upload the target video to the server through the corresponding network service interface.
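The following Python sketch models the modified data path of fig. 7 as a simple per-frame pipeline; the unit classes, the `tag` field, and the method names are illustrative abstractions, not the chip's real interfaces.

```python
from dataclasses import dataclass, field

@dataclass
class Frame:
    index: int
    data: bytes
    tag: str | None = None  # the "specified identifier" carried per frame

@dataclass
class StorageUnit:
    frames: list[Frame] = field(default_factory=list)

class FrameInterpolationChip:
    """Models: interpolation unit -> storage control -> storage unit -> output control -> AP."""

    def __init__(self) -> None:
        self.storage = StorageUnit()

    def interpolation_unit(self, frame: Frame) -> Frame:
        frame.tag = "interpolated"          # add the specified identifier
        self.storage_control_write(frame)   # added path: also persist the tagged frame
        return frame                        # the frame still goes to the display as before

    def storage_control_write(self, frame: Frame) -> None:
        self.storage.frames.append(frame)

    def output_control_read_all(self) -> list[Frame]:
        """Read tagged frames back from the storage unit and hand them to the AP."""
        return list(self.storage.frames)

chip = FrameInterpolationChip()
for i in range(3):
    chip.interpolation_unit(Frame(index=i, data=b"..."))
target_video_frames = chip.output_control_read_all()  # AP assembles the target video
```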
Step S250: uploading, by the application processor, the target video to a server.
According to the video processing method provided by this embodiment, when the video to be processed is interpolated by the frame insertion chip, the chip adds the identifier to the interpolated video and transmits it back to the application processor, which uploads the target video to the server through the corresponding network service interface. In this way, the video interpolated by the electronic device can be shared, and because the interpolated video carries the specified identifier, other devices that obtain the target video from the server can play it directly according to the identifier.
Referring to fig. 8, fig. 8 is a flow chart illustrating a video processing method according to another embodiment of the present application. The video processing method is applied to the electronic device, and will be described in detail with respect to the flow shown in fig. 8, and the video processing method may specifically include the following steps:
Step S310: acquiring, from the server, a video for playing as the video to be processed.
In this embodiment of the application, the frame interpolation performed by the electronic device on the video to be processed may occur in a video playback scenario. First, the electronic device obtains the video to be played; specifically, it may obtain the video from a server. The server may be the server to which target videos are uploaded and which manages them; among the videos it stores, those that have been interpolated carry the specified identifier.
Step S320: judging whether the video to be processed carries the specified identifier.
In the embodiment of the application, after acquiring the video to be processed from the server, the electronic device can judge whether it carries the specified identifier. If it does, the video has already been subjected to frame interpolation processing; if it does not, the video has not been interpolated.
Step S330: when the video to be processed does not carry the specified identifier, performing frame interpolation processing on the video while playing it.
In this embodiment of the application, when the electronic device determines that the video to be processed does not carry the specified identifier, the video has not yet been interpolated, so the electronic device may interpolate it during playback.
Step S340: when the video to be processed carries the specified identifier, acquiring the frame rate of the video.
Step S350: when the frame rate is lower than a preset frame rate, performing frame interpolation processing on the video to be processed while playing it, the frame rate of the interpolated video being greater than or equal to the preset frame rate.
In this embodiment of the application, when the electronic device determines that the video to be processed carries the specified identifier, the video has already been interpolated. In this case, the electronic device may determine whether further frame interpolation is needed to raise the frame rate of the video.
Specifically, when the video carries the specified identifier, the electronic device may obtain its frame rate and compare it with a preset frame rate. If the frame rate is greater than or equal to the preset frame rate, the video's frame rate is already high and playback is smooth, so interpolating again would only waste resources. If the frame rate is lower than the preset frame rate, the electronic device may interpolate the video further so that its frame rate reaches the preset frame rate, further improving the playback effect. The preset frame rate may be determined according to the highest frame rate that the electronic device can achieve through interpolation; specifically, it may be less than that highest frame rate.
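A sketch of this playback-side decision; the 60 fps preset frame rate is an assumption, since the application leaves the concrete value open.

```python
PRESET_FRAME_RATE = 60.0  # assumed target; the application does not fix a value

def decide_interpolation(carries_identifier: bool, frame_rate: float) -> bool:
    """Return True if the device should interpolate the video during playback."""
    if not carries_identifier:
        return True                      # never interpolated -> interpolate while playing
    if frame_rate < PRESET_FRAME_RATE:
        return True                      # interpolated before, but still below the preset rate
    return False                         # already smooth enough, play directly

assert decide_interpolation(False, 24.0) is True
assert decide_interpolation(True, 30.0) is True   # S350: further interpolation
assert decide_interpolation(True, 60.0) is False  # S380: no interpolation
```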
In some embodiments, while interpolating a video during playback, the electronic device may detect changes in the video playing interface. When it detects that the video playing interface is moving in a single direction, for example the playing area of the current video sliding away while the playing area of the next video is gradually exposed until it is displayed in the middle of the screen, the frame interpolation may be paused. This avoids the situation in which, while the interface is moving, the frame rate at which the displacement animation is rendered differs from the frame rate of the interpolated video, producing display anomalies such as edge cracks, ghosting, and stuttering. Specifically, the electronic device may detect the motion vectors of preset detection points on the edge of the video playing area. The motion vector of a preset detection point represents the displacement of that pixel: its direction and magnitude reflect the direction and speed of the movement. A preset detection point may be any pixel on the edge of the video playing area; its exact position is not limited. It can be understood that when the video playing application switches videos, the video area moves and the rest of the interface changes with it until the playing area of the next video is fully exposed and displayed at its position in the interface. When the video area moves, its edge is representative: if the playing area moves, its boundary necessarily moves in the same direction, so pixels on the edge of the playing area can serve as preset detection points. The number of preset detection points is not limited and may be at least two, for example 2 or 3. If the motion vectors of all preset detection points are the same, the video playing area is moving in a single direction, i.e. the interface-transition effect of a video switch is being displayed; at that moment the frame interpolation may be paused to avoid abnormal display.
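A sketch of the pause check, assuming the motion vectors of the preset detection points are already available as (dx, dy) pairs; how those vectors are sampled is outside the scope of this sketch.

```python
Vector = tuple[int, int]  # (dx, dy) displacement of a preset detection point

def should_pause_interpolation(edge_vectors: list[Vector]) -> bool:
    """Pause interpolation if all edge detection points move identically
    (the playing area is sliding in one direction, i.e. a video switch)."""
    if len(edge_vectors) < 2:
        return False                      # need at least two detection points
    first = edge_vectors[0]
    moving = first != (0, 0)
    return moving and all(v == first for v in edge_vectors[1:])

print(should_pause_interpolation([(0, -12), (0, -12), (0, -12)]))  # True: interface sliding up
print(should_pause_interpolation([(3, 0), (-2, 5)]))               # False: ordinary motion
```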
Step S360: adding a specified identifier to the interpolated video to obtain a target video carrying the specified identifier, the specified identifier indicating that the target video has been subjected to frame interpolation processing.
Step S370: uploading the interpolated video to a server, the server being configured to issue the target video to a target device upon receiving, from the target device, a video acquisition request for the target video.
In the embodiment of the present application, step S360 and step S370 may refer to the contents of the foregoing embodiments, and are not described herein again.
Step S380: when the frame rate is not lower than the preset frame rate, not performing frame interpolation processing on the video to be processed.
In the embodiment of the application, when it is determined that the frame rate of the to-be-processed video for playing is not lower than the preset frame rate, the electronic device may not perform frame interpolation on the to-be-processed video, and in this case, the obtained to-be-processed video may be directly played.
In some embodiments, the target video interpolated by the electronic device may also be stored locally, i.e. in the memory, so that when the electronic device plays the target video again later it can determine, from the specified identifier carried in the target video, that the video has already been interpolated and play it directly, with no further interpolation and less computation.
According to the video processing method provided by this embodiment, a video for playing is acquired from the server as the video to be processed, and it is determined whether that video carries the specified identifier. If it does not carry the identifier, or carries it but its frame rate is below the preset frame rate, the video is interpolated, the specified identifier is added to the interpolated video to obtain a target video carrying the identifier, and the target video is uploaded to the server. In this way, the video interpolated by the electronic device can be shared, and because the interpolated video carries the specified identifier, other devices that obtain the target video from the server can play it directly according to the identifier.
Referring to fig. 9, fig. 9 is a schematic flowchart illustrating a video processing method according to still another embodiment of the present application. The video processing method is applied to the server, and will be described in detail with respect to the flow shown in fig. 9, and the video processing method may specifically include the following steps:
Step S410: acquiring a video to be processed.
In the embodiment of the application, the server may obtain the video uploaded by other devices, or obtain the video from other servers to obtain the video to be processed.
Step S420: performing frame interpolation processing on the video, adding a specified identifier to the interpolated video to obtain a video carrying the specified identifier, and storing that video, the specified identifier indicating that the video has been subjected to frame interpolation processing.
In the embodiment of the application, a frame interpolation algorithm can be preset in the server, the server can perform frame interpolation processing on the videos, and a specified identifier is added to the videos subjected to frame interpolation processing. The manner in which the server adds the specific identifier to the video after the frame interpolation processing may refer to the manner in which the electronic device adds the specific identifier to the video in the foregoing embodiment, which is not described herein again.
Step S430: receiving a video acquisition request, sent by a first device, for acquiring the video.
Step S440: responding to the video acquisition request by issuing the video carrying the specified identifier to the first device.
In the embodiment of the application, after storing the video, the server may respond to a video acquisition request from another device, for example the first device, by issuing the video to it. The first device can then determine, from the specified identifier carried in the video, that it has already been interpolated and need not interpolate it during playback, reducing the processing load.
The video processing method provided by this embodiment enables the server to interpolate videos, manage the interpolated videos, and issue them to other devices, thereby realizing sharing of the interpolated video and improving user experience.
Referring to fig. 10, fig. 10 is a schematic flow chart illustrating a video processing method according to another embodiment of the present application. The video processing method is applied to the server, and will be described in detail with respect to the flow shown in fig. 10, and the video processing method may specifically include the following steps:
Step S510: receiving a video uploaded by a second device to obtain the video to be processed.
Step S520: judging whether the video carries the specified identifier.
Step S530: if the video does not carry the specified identifier, performing frame interpolation processing on the video, adding the specified identifier to the interpolated video to obtain the video carrying the identifier, and storing the interpolated video.
In this embodiment of the application, if the video received by the server does not carry the designated identifier, it indicates that the video has not been subjected to frame interpolation processing before, so that the server can perform frame interpolation processing on the received video.
Step S540: if the video carries the specified identifier, acquiring the video frame rate of the video.
Step S550: when the video frame rate is lower than a specified frame rate, performing frame interpolation processing on the video and storing the interpolated video, so that its frame rate becomes greater than or equal to the specified frame rate.
In this embodiment of the application, when the frame rate of the video received by the server is lower than the specified frame rate, the frame interpolation processing may be further performed on the video, so as to further improve the frame rate of the video, so that the frame rate reaches the specified frame rate, and further improve the playing effect of the video. The specified frame rate may be determined according to a highest frame rate that can be achieved by the server through the frame interpolation, and specifically, the specified frame rate may be smaller than the highest frame rate.
Step S560: receiving a video acquisition request, sent by the first device, for acquiring the video.
Step S570: responding to the video acquisition request by issuing an identity verification instruction to the first device.
Step S580: receiving identity information sent by the first device based on the identity verification instruction.
Step S590: when the identity information matches preset identity information, issuing the video carrying the specified identifier to the first device.
In the embodiment of the application, when another device requests the video from the server, the server may also verify that device's identity and issue the video only if the device has the required authority. The preset identity information may be the identity information of devices that have this authority, for example a device identifier, which is not limited herein. With the preset identity information configured, only devices entitled to share the interpolated video can access the server.
In some embodiments, the verification of the identity information may be key verification, password verification, face verification, and the like, which is not limited herein.
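A minimal sketch of the server-side check in steps S570–S590, assuming the identity information is a device identifier compared against a configured allow-list; the data structures and names are assumptions.

```python
AUTHORIZED_DEVICE_IDS = {"device-001", "device-042"}  # assumed preset identity information

def handle_video_request(device_id_from_verification: str, video_path: str) -> str | None:
    """Issue the tagged video only if the requesting device's identity matches
    the preset identity information; otherwise refuse."""
    if device_id_from_verification in AUTHORIZED_DEVICE_IDS:
        return video_path          # in a real server: send the file carrying the identifier
    return None                    # identity mismatch: do not issue the video

print(handle_video_request("device-001", "clip_interpolated.mp4"))  # issued
print(handle_video_request("device-999", "clip_interpolated.mp4"))  # None (refused)
```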
The video processing method provided by this embodiment enables the server to manage the interpolated video: the server can receive video interpolated by the electronic device and issue it to other devices, realizing sharing of the user's interpolated video and improving user experience. In addition, for uploaded videos whose frame rate after interpolation still does not reach the specified frame rate, the server can interpolate them further with a built-in algorithm to raise the frame rate, making the shared video smoother.
Referring to fig. 11, a block diagram of a video processing apparatus 400 according to an embodiment of the present application is shown. The video processing apparatus 400 is applied to the electronic device described above, and the video processing apparatus 400 includes: a frame interpolation processing module 410, an identifier adding module 420, and a video uploading module 430. The frame interpolation processing module 410 is configured to perform frame interpolation processing on a video to be processed; the identifier adding module 420 is configured to add a specified identifier to the interpolated video to obtain a target video carrying the specified identifier, the specified identifier indicating that the target video has been subjected to frame interpolation processing; and the video uploading module 430 is configured to upload the target video to a server, the server being configured to issue the target video to a target device upon receiving, from the target device, a video acquisition request for the target video.
In some embodiments, the electronic device includes an application processor and a frame interpolation chip. The frame interpolation processing module 410 may transmit the video to be processed to the frame interpolation chip through the application processor, and perform frame interpolation processing on the video to be processed through the frame interpolation chip.
In this embodiment, the identifier adding module 420 may add the specified identifier to the interpolated video through the frame interpolation chip, so as to obtain the target video carrying the specified identifier. The video uploading module 430 may be specifically configured to transmit the target video to the application processor via the frame interpolation chip, and to upload the target video to the server through the application processor.
Further, the frame interpolation chip comprises a frame interpolation processing unit, an output control unit, a storage control unit and a storage unit, wherein the frame interpolation processing unit is used for performing frame interpolation processing on the video to be processed and adding a specified identifier to the video to be processed after the frame interpolation processing, so as to obtain a target video. The frame insertion processing unit is used for transmitting the target video to the storage control unit; the storage control unit is used for storing the target video to the storage unit; the output control unit is used for reading the target video from the storage unit and transmitting the target video to the application program processor.
In some embodiments, the video processing apparatus 400 may further include a video acquisition module and an identifier determination module. The video acquisition module is configured to acquire a video for playing from the server as the video to be processed before the frame interpolation processing; the identifier determination module is configured to judge whether the video to be processed carries the specified identifier. The frame interpolation processing module 410 is configured to interpolate the video to be processed while playing it when the video does not carry the specified identifier.
In this embodiment, the frame interpolation processing module 410 is further configured to obtain a frame rate of the video to be processed when the video to be processed carries the specified identifier; and when the frame rate is lower than a preset frame rate, performing frame interpolation processing on the video to be processed in the process of playing the video to be processed, wherein the frame rate of the video to be processed after the frame interpolation processing is greater than or equal to the preset frame rate.
In some embodiments, the frame insertion processing module 410 may be specifically configured to: responding to a video shooting instruction, and carrying out video shooting; and performing frame interpolation processing on the video obtained by shooting.
Referring to fig. 12, a block diagram of a video processing apparatus 500 according to an embodiment of the present application is shown. The video processing apparatus 500 is applied to the server described above, and the video processing apparatus 500 includes: a video acquisition module 510, a video frame interpolation module 520, a request receiving module 530, and a video issuing module 540. The video acquisition module 510 is configured to acquire a video to be processed; the video frame interpolation module 520 is configured to perform frame interpolation processing on the video, add a specified identifier to the interpolated video to obtain a video carrying the specified identifier, and store that video, the specified identifier indicating that the video has been subjected to frame interpolation processing; the request receiving module 530 is configured to receive a video acquisition request, sent by a first device, for acquiring the video; and the video issuing module 540 is configured to respond to the video acquisition request by issuing the video carrying the specified identifier to the first device.
In some embodiments, the video obtaining module 510 may be specifically configured to receive a video uploaded by a second device, and obtain a video to be processed; the video frame insertion module 520 may be specifically configured to: judging whether the video carries the specified identification or not; and if the video does not carry the specified identification, performing frame interpolation processing on the video, adding the specified identification to the video subjected to frame interpolation processing, and obtaining the video carrying the specified identification.
In this embodiment, the video acquisition module 510 may be further configured to: if the video carries the specified identifier, acquire the video frame rate of the video; and when the video frame rate is lower than a specified frame rate, perform frame interpolation processing on the video and store the video subjected to frame interpolation processing, so that the video frame rate of the video is greater than or equal to the specified frame rate.
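On the server side, the behaviour of modules 510 and 520 amounts to an ingestion rule; the sketch below states it under the same simplified assumptions as before (a dict-based video record, an example 60 fps specified frame rate, and interpolation and storage supplied as callables).

```python
# Hypothetical server-side ingestion rule (assumed data model and 60 fps example threshold).
SPECIFIED_FRAME_RATE = 60

def ingest_uploaded_video(video, interpolate_frames, store):
    if not video.get("specified_identifier", False):
        # Untagged upload: interpolate, add the specified identifier, then store.
        video = interpolate_frames(video)
        video["specified_identifier"] = True
    elif video.get("frame_rate", 0) < SPECIFIED_FRAME_RATE:
        # Tagged but still below the specified frame rate: interpolate again before storing;
        # interpolate_frames is expected to raise the frame rate to at least that value.
        video = interpolate_frames(video)
    store(video)
    return video
```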
In some embodiments, the video delivery module 540 may be specifically configured to: issue an identity verification instruction to the first device in response to the video acquisition request; receive identity information sent by the first device based on the identity verification instruction; and deliver the video carrying the specified identifier to the first device when the identity information matches preset identity information.
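A delivery flow with this identity check might look like the sketch below; the request object, the messaging helper, and the simple equality test against preset identity information are all assumptions made for illustration.

```python
# Hypothetical delivery flow with identity verification before sending the tagged video.
def handle_video_acquisition_request(request, exchange_with_device, load_tagged_video,
                                     preset_identity_info):
    # Issue an identity verification instruction and wait for the device's reply.
    identity_info = exchange_with_device(request["device"], {"type": "identity_verification"})
    if identity_info != preset_identity_info:
        return False  # identity information does not match; do not deliver the video
    # Identity matched: deliver the video carrying the specified identifier.
    exchange_with_device(request["device"], load_tagged_video(request["video_id"]))
    return True
```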
Those skilled in the art can clearly understand that, for convenience and brevity of description, reference may be made to the corresponding processes in the foregoing method embodiments for the specific working processes of the apparatuses and modules described above, which are not repeated here.
In the several embodiments provided in the present application, the coupling between the modules may be electrical, mechanical or other type of coupling.
In addition, the functional modules in the embodiments of the present application may be integrated into one processing module, each module may exist alone physically, or two or more modules may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module.
To sum up, according to the solution provided by the present application, frame interpolation processing is performed on a video to be processed, and a specified identifier is then added to the video subjected to frame interpolation processing to obtain a target video carrying the specified identifier, the specified identifier indicating that the target video has been subjected to frame interpolation processing. The target video is uploaded to a server, and the server delivers the target video to a target device when it receives a video acquisition request sent by the target device for acquiring the target video. In this way, the electronic device can share the video after frame interpolation processing, and because the specified identifier is added to the interpolated video, other devices that acquire the target video from the server can directly play the interpolated video according to the specified identifier carried in the target video.
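The patent leaves open how the specified identifier is physically attached to the video. One possible illustration, and only an assumption, is to keep it in a small sidecar metadata file stored alongside the video file, as sketched below.

```python
# One assumed way to persist the specified identifier: a sidecar JSON file.
# The patent does not prescribe any particular storage format for the identifier.
import json
from pathlib import Path

def add_specified_identifier(video_path: str, frame_rate: float) -> Path:
    """Write a sidecar file marking the video as frame-interpolated."""
    sidecar = Path(video_path + ".frameinfo.json")
    sidecar.write_text(json.dumps({"frame_interpolated": True, "frame_rate": frame_rate}))
    return sidecar

def carries_specified_identifier(video_path: str) -> bool:
    """Check whether the sidecar marks the video as already interpolated."""
    sidecar = Path(video_path + ".frameinfo.json")
    return bool(sidecar.exists() and
                json.loads(sidecar.read_text()).get("frame_interpolated", False))
```

Embedding the flag in the video container's own metadata would serve the same purpose; the sidecar is used here only to keep the sketch self-contained.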
Referring to fig. 5 again, a block diagram of an electronic device according to an embodiment of the present application is shown. The electronic device 100 may be a smart phone, a tablet computer, a smart watch, a notebook computer, or another electronic device capable of running an application program. The electronic device 100 in the present application may include one or more of the following components: a processor 110, a memory 120, a frame interpolation chip 130, a display screen 140, and one or more applications, where the one or more applications may be stored in the memory 120 and the frame interpolation chip 130 and configured to be executed by the one or more processors 110 and the frame interpolation chip 130, the one or more applications being configured to perform the methods described in the foregoing method embodiments.
Processor 110 may include one or more processing cores. The processor 110 connects various parts within the electronic device 100 using various interfaces and lines, and performs various functions of the electronic device 100 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 120 and calling data stored in the memory 120. Optionally, the processor 110 may be implemented in hardware using at least one of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA). The processor 110 may integrate one or more of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. The CPU mainly handles the operating system, the user interface, application programs, and the like; the GPU is responsible for rendering and drawing display content; and the modem handles wireless communication. It is understood that the modem may not be integrated into the processor 110 and may instead be implemented by a separate communication chip.
The Memory 120 may include a Random Access Memory (RAM) or a Read-Only Memory (ROM). The memory 120 may be used to store instructions, programs, code sets, or instruction sets. The memory 120 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing an operating system, instructions for implementing at least one function (such as a touch function, a sound playing function, or an image playing function), instructions for implementing the various method embodiments described above, and the like. The data storage area may store data created by the electronic device 100 during use (such as a phone book, audio and video data, and chat log data).
The frame interpolation chip 130 may be a chip that independently performs frame interpolation processing on a video. It is configured to insert one or more frames between every two displayed frames of the original picture, thereby shortening the display time between frames and increasing the frame rate of the screen display of the electronic device, so as to alleviate flickering and trailing, eliminate edge blurring of fast-moving images, and correct the illusion formed by human visual persistence, thereby effectively improving picture stability.
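The chip's actual interpolation algorithm is not disclosed in this application. Purely to illustrate the idea of inserting a frame between every two original frames, the sketch below generates intermediate frames by linear blending with NumPy, a deliberately simple stand-in for the motion-compensated interpolation a dedicated chip would perform.

```python
# Illustrative frame interpolation by blending adjacent frames (an assumption; a real
# frame interpolation chip would use motion estimation and compensation instead).
import numpy as np

def interpolate_by_blending(frames: list) -> list:
    """Insert one blended frame between every two original frames, roughly doubling the frame rate."""
    if len(frames) < 2:
        return list(frames)
    output = []
    for prev_frame, next_frame in zip(frames, frames[1:]):
        output.append(prev_frame)
        blended = (prev_frame.astype(np.float32) + next_frame.astype(np.float32)) / 2.0
        output.append(blended.astype(prev_frame.dtype))
    output.append(frames[-1])
    return output
```

Doubling a 30 fps video to 60 fps in this way shortens the per-frame display time from roughly 33 ms to roughly 17 ms, which is the "shorten the display time between frames" effect described above.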
The display screen 140 is used to display information input by the user, information provided to the user, and various graphical user interfaces of the electronic device 100, which may be composed of graphics, text, icons, numbers, videos, and any combination thereof. In one example, the display screen 140 may be a Liquid Crystal Display (LCD) or an Organic Light-Emitting Diode (OLED) display, which is not limited herein.
Referring to fig. 13, a block diagram of a server according to an embodiment of the present application is shown. The server 200 may be a conventional server, a cloud server, or another server capable of running an application. The server 200 in the present application may include one or more of the following components: a processor 210, a memory 220, and one or more applications, where the one or more applications may be stored in the memory 220 and configured to be executed by the one or more processors 210, the one or more applications being configured to perform the methods described in the foregoing method embodiments.
Processor 210 may include one or more processing cores. The processor 210 connects various parts within the server 200 using various interfaces and lines, and performs various functions of the server 200 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 220 and calling data stored in the memory 220. Optionally, the processor 210 may be implemented in hardware using at least one of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA). The processor 210 may integrate one or more of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. The CPU mainly handles the operating system, the user interface, application programs, and the like; the GPU is responsible for rendering and drawing display content; and the modem handles wireless communication. It is understood that the modem may not be integrated into the processor 210 and may instead be implemented by a separate communication chip.
The Memory 220 may include a Random Access Memory (RAM) or a Read-Only Memory (ROM). The memory 220 may be used to store instructions, programs, code sets, or instruction sets. The memory 220 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing an operating system, instructions for implementing at least one function (such as a touch function, a sound playing function, or an image playing function), instructions for implementing the various method embodiments described above, and the like. The data storage area may store data created by the server 200 during use (such as a phone book, audio and video data, and chat log data).
Referring to fig. 14, a block diagram of a computer-readable storage medium according to an embodiment of the present application is shown. The computer-readable storage medium 800 stores program code that can be invoked by a processor to perform the methods described in the foregoing method embodiments.
The computer-readable storage medium 800 may be an electronic memory such as a flash memory, an EEPROM (electrically erasable programmable read-only memory), an EPROM, a hard disk, or a ROM. Optionally, the computer-readable storage medium 800 includes a non-volatile computer-readable storage medium. The computer-readable storage medium 800 has storage space for program code 810 for performing any of the method steps of the methods described above. The program code can be read from or written to one or more computer program products. The program code 810 may be compressed, for example, in a suitable form.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced, and such modifications and replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (16)

1. A video processing method applied to an electronic device, the method comprising:
performing frame interpolation processing on a video to be processed;
adding a specified identifier to the video to be processed after the frame interpolation processing to obtain a target video carrying the specified identifier, wherein the specified identifier is used for indicating that the target video has been subjected to frame interpolation processing;
and uploading the target video to a server, wherein the server is used for issuing the target video to a target device when receiving a video acquisition request sent by the target device for acquiring the target video.
2. The method of claim 1, wherein the electronic device comprises an application processor and a frame interpolation chip, and the performing frame interpolation processing on the video to be processed comprises:
transmitting the video to be processed to the frame interpolation chip through the application processor;
and performing frame interpolation processing on the video to be processed through the frame interpolation chip.
3. The method according to claim 2, wherein the adding a specified identifier to the video to be processed after the frame interpolation processing comprises:
adding a specified identifier to the video subjected to frame interpolation processing through the frame interpolation chip to obtain a target video carrying the specified identifier;
the uploading the target video to a server includes:
transmitting the target video to the application processor through the frame interpolation chip;
uploading, by the application processor, the target video to a server.
4. The method according to claim 3, wherein the frame interpolation chip comprises a frame interpolation processing unit, an output control unit, a storage control unit and a storage unit, the frame interpolation processing unit being used for performing frame interpolation processing on the video to be processed and adding a specified identifier to the video subjected to frame interpolation processing to obtain the target video;
the transmitting the target video to the application processor through the frame interpolation chip comprises:
transmitting the target video to the storage control unit through the frame interpolation processing unit;
storing the target video to the storage unit through the storage control unit;
reading the target video from the storage unit through the output control unit, and transmitting the target video to the application processor.
5. The method of claim 1, wherein before the frame interpolation processing of the video to be processed, the method further comprises:
acquiring a video for playing from the server as a video to be processed;
determining whether the video to be processed carries the specified identifier;
the performing frame interpolation processing on the video to be processed comprises:
when the video to be processed does not carry the specified identifier, performing frame interpolation processing on the video to be processed in the process of playing the video to be processed.
6. The method of claim 5, wherein the performing frame interpolation processing on the video to be processed further comprises:
when the video to be processed carries the specified identifier, acquiring the frame rate of the video to be processed;
and when the frame rate is lower than a preset frame rate, performing frame interpolation processing on the video to be processed in the process of playing the video to be processed, wherein the frame rate of the video to be processed after the frame interpolation processing is greater than or equal to the preset frame rate.
7. The method of claim 1, wherein the performing frame interpolation processing on the video to be processed comprises:
responding to a video shooting instruction, and carrying out video shooting;
and performing frame interpolation processing on the video obtained by shooting.
8. A video processing method applied to a server, the method comprising:
acquiring a video to be processed;
performing frame interpolation processing on the video, adding a specified identifier to the video subjected to frame interpolation processing to obtain the video carrying the specified identifier, and storing the video carrying the specified identifier, wherein the specified identifier is used for indicating that the video is subjected to frame interpolation processing;
receiving a video acquisition request sent by a first device for acquiring the video;
and issuing, in response to the video acquisition request, the video carrying the specified identifier to the first device.
9. The method of claim 8, wherein the obtaining the video to be processed comprises:
receiving a video uploaded by a second device to obtain the video to be processed;
the performing frame interpolation processing on the video and adding a specified identifier to the video subjected to frame interpolation processing to obtain the video carrying the specified identifier comprises:
determining whether the video carries the specified identifier;
and if the video does not carry the specified identifier, performing frame interpolation processing on the video and adding the specified identifier to the video subjected to frame interpolation processing to obtain the video carrying the specified identifier.
10. The method of claim 9, further comprising:
if the video carries the specified identifier, acquiring a video frame rate of the video;
and when the video frame rate is lower than a specified frame rate, performing frame interpolation processing on the video and storing the video subjected to frame interpolation processing, so that the video frame rate of the video is greater than or equal to the specified frame rate.
11. The method of claim 8, wherein the issuing the video carrying the specified identifier to the first device in response to the video acquisition request comprises:
issuing an identity verification instruction to the first device in response to the video acquisition request;
receiving identity information sent by the first device based on the identity verification instruction;
and when the identity information matches preset identity information, issuing the video carrying the specified identifier to the first device.
12. A video processing apparatus, applied to an electronic device, the apparatus comprising: a frame interpolation processing module, an identifier adding module and a video uploading module, wherein,
the frame interpolation processing module is used for performing frame interpolation processing on a video to be processed;
the identifier adding module is used for adding a specified identifier to the video to be processed after the frame interpolation processing to obtain a target video carrying the specified identifier, wherein the specified identifier is used for indicating that the target video has been subjected to frame interpolation processing;
and the video uploading module is used for uploading the target video to a server, the server being used for issuing the target video to a target device when receiving a video acquisition request sent by the target device for acquiring the target video.
13. A video processing apparatus applied to a server, the apparatus comprising: a video acquisition module, a video frame interpolation module, a request receiving module and a video delivery module, wherein,
the video acquisition module is used for acquiring a video to be processed;
the video frame interpolation module is used for performing frame interpolation processing on the video, adding a specified identifier to the video subjected to frame interpolation processing to obtain the video carrying the specified identifier, and storing the video carrying the specified identifier, wherein the specified identifier is used for indicating that the video has been subjected to frame interpolation processing;
the request receiving module is used for receiving a video acquisition request sent by a first device for acquiring the video;
and the video delivery module is used for issuing, in response to the video acquisition request, the video carrying the specified identifier to the first device.
14. An electronic device, comprising:
a frame interpolation chip;
one or more processors;
a memory;
one or more applications, wherein the one or more applications are stored in the memory and the frame interpolation chip and configured to be executed by the one or more processors and the frame interpolation chip, the one or more applications being configured to perform the method of any of claims 1-7.
15. A server, comprising:
one or more processors;
a memory;
one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors, the one or more applications being configured to perform the method of any of claims 8-11.
16. A computer-readable storage medium, having stored thereon program code that can be invoked by a processor to perform the method according to any one of claims 1 to 11.
CN202010976364.9A 2020-09-16 2020-09-16 Video processing method and device, electronic equipment, server and storage medium Pending CN111918098A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010976364.9A CN111918098A (en) 2020-09-16 2020-09-16 Video processing method and device, electronic equipment, server and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010976364.9A CN111918098A (en) 2020-09-16 2020-09-16 Video processing method and device, electronic equipment, server and storage medium

Publications (1)

Publication Number Publication Date
CN111918098A true CN111918098A (en) 2020-11-10

Family

ID=73266912

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010976364.9A Pending CN111918098A (en) 2020-09-16 2020-09-16 Video processing method and device, electronic equipment, server and storage medium

Country Status (1)

Country Link
CN (1) CN111918098A (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101977218A (en) * 2010-10-20 2011-02-16 深圳市融创天下科技发展有限公司 Internet playing file transcoding method and system
CN105049914A (en) * 2015-07-07 2015-11-11 深圳Tcl数字技术有限公司 Picture frame playing method and device
CN105959717A (en) * 2016-05-27 2016-09-21 天脉聚源(北京)传媒科技有限公司 Live broadcast method based on mobile terminal and live broadcast device
CN109640168A (en) * 2018-11-27 2019-04-16 Oppo广东移动通信有限公司 Method for processing video frequency, device, electronic equipment and computer-readable medium
WO2020173394A1 (en) * 2019-02-28 2020-09-03 华为技术有限公司 Recording frame rate control method and related apparatus
CN110636375A (en) * 2019-11-11 2019-12-31 RealMe重庆移动通信有限公司 Video stream processing method and device, terminal equipment and computer readable storage medium
CN111093094A (en) * 2019-12-03 2020-05-01 深圳市万佳安物联科技股份有限公司 Video transcoding method, device and system, electronic equipment and readable storage medium
CN110933496A (en) * 2019-12-10 2020-03-27 Oppo广东移动通信有限公司 Image data frame insertion processing method and device, electronic equipment and storage medium
CN111225150A (en) * 2020-01-20 2020-06-02 Oppo广东移动通信有限公司 Method for processing interpolation frame and related product

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114285956A (en) * 2021-12-28 2022-04-05 维沃移动通信有限公司 Video sharing circuit, method and device and electronic equipment
CN114285978A (en) * 2021-12-28 2022-04-05 维沃移动通信有限公司 Video processing method, video processing device and electronic equipment
CN116886961A (en) * 2023-09-06 2023-10-13 中移(杭州)信息技术有限公司 Distributed live video frame inserting method, device, system and storage medium
CN116886961B (en) * 2023-09-06 2023-12-26 中移(杭州)信息技术有限公司 Distributed live video frame inserting method, device, system and storage medium

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
RJ01: Rejection of invention patent application after publication (application publication date: 20201110)