CN111327959A - Video frame insertion method and related device

Video frame insertion method and related device

Info

Publication number
CN111327959A
CN111327959A
Authority
CN
China
Prior art keywords
video
data
video data
user interface
frame
Prior art date
Legal status
Pending
Application number
CN202010148399.3A
Other languages
Chinese (zh)
Inventor
胡杰
林文真
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202010148399.3A
Publication of CN111327959A
Legal status: Pending


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47217 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/4402 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N21/440281 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the temporal resolution, e.g. by frame skipping
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/01 Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H04N7/0127 Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level by changing the field or frame frequency of the incoming video signal, e.g. frame rate converter

Abstract

The embodiments of the present application provide a video frame interpolation method and a related apparatus, applied to an electronic device. The method includes: acquiring a first data stream; determining whether first video data and user interface data are simultaneously present in the first data stream; if so, separating the first video data and the user interface data; and processing the first video data and the user interface data separately to complete the frame interpolation processing of the first data stream. The video frame interpolation method and related apparatus provided by the embodiments of the present application can therefore solve the problem of abnormal color blocks appearing in an application's user interface layer when frame interpolation is performed on a scene where video and a user interface coexist.

Description

Video frame insertion method and related device
Technical Field
The present application relates to the field of communications technologies, and in particular, to a video frame interpolation method and a related apparatus.
Background
The method currently used by video compression and video codecs to reduce temporal redundancy in video sequences is motion estimation and motion compensation (MEMC), a video frame interpolation method. Motion compensation describes the differences between adjacent frames, where "adjacent" refers to the coding relationship rather than the playing order, so two adjacent frames are not necessarily consecutive during playback; specifically, it describes how each small block of a previous frame moves to a certain position in the current frame. Motion compensation can also be used for de-interlacing and motion interpolation.
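By way of illustration, the following is a minimal sketch of block-based motion-compensated interpolation between two frames. It is not the implementation of the present application: the exhaustive SAD block matching, the block size, and the search radius are illustrative assumptions, and a production MEMC pipeline would additionally handle holes, overlaps, and occlusion.

```python
# Minimal sketch of block-based motion-compensated frame interpolation,
# for illustration only; block size, search radius, and SAD matching are
# assumptions, not the method claimed by this application.
import numpy as np

def interpolate_midframe(prev, curr, block=16, radius=4):
    """Synthesize a grayscale frame halfway between prev and curr."""
    h, w = prev.shape
    mid = prev.copy()  # fallback fill so unwritten pixels are not black
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            ref = prev[y:y+block, x:x+block].astype(np.int32)
            best_sad, bdy, bdx = None, 0, 0
            # Exhaustive search: where did this block move to in curr?
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy <= h - block and 0 <= xx <= w - block:
                        cand = curr[yy:yy+block, xx:xx+block].astype(np.int32)
                        sad = int(np.abs(ref - cand).sum())
                        if best_sad is None or sad < best_sad:
                            best_sad, bdy, bdx = sad, dy, dx
            # Average the matched blocks and place the result halfway
            # along the motion vector (clamped to the frame bounds).
            my = min(max(y + bdy // 2, 0), h - block)
            mx = min(max(x + bdx // 2, 0), w - block)
            matched = curr[y+bdy:y+bdy+block, x+bdx:x+bdx+block].astype(np.int32)
            mid[my:my+block, mx:mx+block] = ((ref + matched) // 2).astype(prev.dtype)
    return mid
```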
Disclosure of Invention
The embodiments of the present application provide a video frame interpolation method and a related apparatus, which can perform frame interpolation on the video data alone in a video scene.
In a first aspect, an embodiment of the present application provides a video frame interpolation method, which is applied to an electronic device, and the method includes:
acquiring a first data stream;
determining whether first video data and user interface data are present in the first data stream at the same time;
if so, separating the first video data and the user interface data;
and respectively processing the first video data and the user interface data to complete the frame insertion processing of the first data stream.
In a second aspect, the present application provides a video frame interpolation apparatus, which is applied to an electronic device, and includes a processing unit and a communication unit, wherein,
the processing unit is configured to acquire a first data stream through the communication unit; determine whether first video data and user interface data are simultaneously present in the first data stream; if so, separate the first video data and the user interface data; and process the first video data and the user interface data respectively to complete the frame interpolation processing of the first data stream.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, and the program includes instructions for executing steps in any method of the first aspect of the embodiment of the present application.
In a fourth aspect, the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, where the computer program makes a computer perform part or all of the steps described in any one of the methods of the first aspect of the present application.
In a fifth aspect, embodiments of the present application provide a computer program product, where the computer program product includes a non-transitory computer-readable storage medium storing a computer program, where the computer program is operable to cause a computer to perform some or all of the steps as described in the first aspect of the embodiments of the present application. The computer program product may be a software installation package.
It can be seen that, in the embodiments of the present application, the electronic device first acquires a first data stream; then determines whether first video data and user interface data are simultaneously present in the first data stream; if so, separates the first video data and the user interface data; and finally processes the first video data and the user interface data respectively to complete the frame interpolation processing of the first data stream. The video frame interpolation method and related apparatus provided by the embodiments of the present application can therefore solve the problem of abnormal color blocks appearing in an application's user interface layer when frame interpolation is performed on a scene where video and a user interface coexist.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and that those skilled in the art can derive other drawings from them without creative effort.
Fig. 1a is a hardware block diagram of a video frame interpolation method according to an embodiment of the present disclosure;
fig. 1b is a software block diagram of a video frame interpolation method according to an embodiment of the present application;
fig. 2a is a schematic flowchart of a video frame interpolation method according to an embodiment of the present application;
FIG. 2b is a schematic diagram of a single video application scene provided by an embodiment of the present application;
fig. 2c is a schematic view of a split-screen multi-video application scene provided by an embodiment of the present application;
fig. 3 is a schematic flowchart of another video frame interpolation method according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of an electronic device provided in an embodiment of the present application;
fig. 5 is a block diagram illustrating functional units of a video frame interpolation apparatus according to an embodiment of the present disclosure.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
At present, existing video frame interpolation algorithms on the market perform frame interpolation on the entire display output. When both a user interface (UI) and a video are present in the display output, the UI is affected by the interpolation algorithm and abnormal color blocks appear.
In view of the above-mentioned problems, embodiments of the present application provide a video frame interpolation method and a related apparatus, and the following describes the embodiments in detail with reference to the accompanying drawings.
As shown in fig. 1a, fig. 1a is a hardware block diagram of a video frame interpolation method according to an embodiment of the present disclosure. The hardware block diagram 100 includes a data pipeline and, connected in sequence, a display layer post-processing unit, a display serial interface, a digital signal processor, and a liquid crystal display. The data pipeline may be a pipeline dedicated to transmitting video YUV data. When video data and UI data coexist in a data stream, separate pipelines transmit the video data and the UI data respectively: the first display layer post-processing unit receives and processes the video data, the second display layer post-processing unit receives and processes the UI data, and the two units deliver their outputs to the digital signal processor through the first and second display serial interfaces respectively. The digital signal processor performs frame interpolation, decoding, and similar operations on the video data and the UI data, and finally transmits the processed data to the liquid crystal display for output.
As shown in fig. 1b, fig. 1b is a software block diagram of a video frame interpolation method according to an embodiment of the present application. When the electronic device acquires a first data stream that contains both video data and user interface data, and the video layer and the user interface layer are separate layers, the first data stream is first processed by front-end policies such as frame rate detection and layer detection. A display composition service can then identify layer or data-type characteristics, such as the color coding schemes of the different layers, and separate the video layer from the user interface layer. The middleware code then distributes the video data to a first display unit and the user interface data to a second display unit, where each display unit specifically includes a layer mixer, a display layer post-processing unit, a display compressor, a display communication interface, and the like; finally, the processed data is delivered to the display driver for display. Of course, if the video data and the user interface data contained in the acquired first data stream are stacked in a single layer, the first data stream cannot be processed hierarchically by the display composition service; in that case the whole first data stream is uniformly allocated to the first display unit, and the processed data is delivered to the display driver for display.
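The routing logic described above can be summarized in a short sketch. This is a hypothetical illustration only: `Layer`, `DisplayUnit`, and the `separable` flag are invented stand-ins for the display composition service and display units, not a real middleware API.

```python
# Hypothetical sketch of the composition-service routing described above.
# Layer, DisplayUnit, and the "separable" flag are illustrative stand-ins
# for the middleware, not a real API.
from dataclasses import dataclass, field

@dataclass
class Layer:
    kind: str            # "video" or "ui"
    payload: bytes = b""

@dataclass
class DisplayUnit:       # stands in for mixer + post-processing + interface
    name: str
    queue: list = field(default_factory=list)
    def submit(self, layers):
        self.queue.extend(layers)

def dispatch(layers, separable, first_unit, second_unit):
    """Route the composed stream to one or two display units."""
    if separable:
        first_unit.submit([l for l in layers if l.kind == "video"])
        second_unit.submit([l for l in layers if l.kind == "ui"])
    else:
        # Stacked layers cannot be split hierarchically, so the whole
        # stream is allocated uniformly to the first display unit.
        first_unit.submit(layers)
```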
Referring to fig. 2a, fig. 2a is a schematic flowchart of a video frame interpolation method according to an embodiment of the present disclosure, and as shown in the drawing, the video frame interpolation method is described from the perspective of an electronic device, and specifically includes the following steps.
S201, a first data stream is obtained.
The first data stream mainly includes data that needs to be displayed on a liquid crystal display screen. It may contain only a single type of data or a mixture of multiple types, and it may carry data transmitted through a data pipeline provided for video YUV data, or data transmitted through a data pipeline provided for ordinary RGB data.
S202, determining whether the first video data and the user interface data exist in the first data stream at the same time.
While a video application is playing a video, video data and user interface data may exist at the same time. The user interface may be an interface for operating functions such as recording the video or locking the screen according to the video progress, for example an interface for pausing playback, playing the next video, or adjusting the video definition. When judging the data type, the display composition service can perform characteristic identification, distinguish different data types according to their characteristics, and determine whether video data and user interface data are simultaneously present in the data stream.
S203, if yes, separating the first video data and the user interface data.
If the first video data and the user interface data are judged to exist in the first data stream at the same time, they need to be separated, so that the video data can subsequently be subjected to frame interpolation processing on its own.
S204, the first video data and the user interface data are processed respectively, so that frame insertion processing of the first data stream is completed.
After the first video data and the user interface data are separated, the digital signal processor can perform frame interpolation and decoding on them respectively. According to the interpolation algorithm of the video frame interpolation method, a frame image can be added between adjacent frames so that the picture is smoother during playback; finally, the processed first video data can be sent to the liquid crystal display for output.
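For instance, doubling a clip's frame rate amounts to synthesizing one new frame between every adjacent pair of frames. A minimal sketch, assuming `interpolate` is some two-frame interpolation function (such as the motion-compensated sketch in the background section):

```python
# Minimal sketch: insert one synthesized frame between every adjacent
# pair of frames, doubling the effective frame rate. `interpolate` is
# assumed to be any two-frame interpolator.
def double_frame_rate(frames, interpolate):
    if not frames:
        return []
    out = []
    for prev, curr in zip(frames, frames[1:]):
        out.append(prev)
        out.append(interpolate(prev, curr))  # the inserted in-between frame
    out.append(frames[-1])
    return out
```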
It can be seen that, in the embodiment of the present application, the electronic device first acquires a first data stream; then determines whether first video data and user interface data are simultaneously present in it; if so, separates the first video data and the user interface data; and finally processes them respectively to complete the frame interpolation processing of the first data stream. The video frame interpolation method provided by the embodiment of the present application can therefore solve the problem of abnormal color blocks appearing in an application's user interface layer when frame interpolation is performed on a scene where video and a user interface coexist.
In one possible example, the determining whether first video data and user interface data are simultaneously present in the first data stream comprises: acquiring color coding schemes of all layers contained in the first data stream; determining the layer type of the layer according to the color coding scheme, wherein the layer type comprises a video layer and a user interface layer; and determining whether first video data and user interface data simultaneously exist in the first data stream according to the layer type contained in the first data stream.
Specifically, analysis shows that a layer whose color coding scheme is YCBCR_420 is a video layer, and a layer whose color coding scheme is RGBA8888 is a user interface layer. YCBCR is a color coding scheme commonly used in consumer video products such as camcorders and digital televisions, where Y is the luminance component, CB the blue chrominance component, and CR the red chrominance component; YCBCR_420 indicates that every four pixels share four luminance components and two chrominance components, i.e. YYYYCBCR. In the RGBA color coding scheme, R represents the red component, G the green component, B the blue component, and A the transparency (Alpha). The Alpha channel is generally used as an opacity parameter: if a pixel's Alpha value is 0% it is completely transparent, i.e. invisible, while a value of 100% means a completely opaque pixel, as in a traditional digital image. RGBA is a common scheme for recording and displaying color images; RGBA8888 means that each of the four parameters A, R, G, and B is represented by 8 bits, so one pixel occupies 32 bits in total. If, according to the characteristics of the color coding schemes, both schemes are detected in the first data stream at the same time, then both a video layer and a user interface layer are present in the first data stream.
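A minimal sketch of this detection rule follows. The format strings mirror the description above, while the function names and the list-of-formats representation of the stream are illustrative assumptions.

```python
# Sketch of layer-type detection by color coding scheme, per the rule
# above: a YCBCR_420 layer is video, an RGBA8888 layer is user interface.
def layer_type(color_format: str) -> str:
    if color_format == "YCBCR_420":
        return "video"
    if color_format == "RGBA8888":
        return "ui"
    return "other"

def has_video_and_ui(layer_formats) -> bool:
    """True when the stream contains both a video layer and a UI layer."""
    kinds = {layer_type(f) for f in layer_formats}
    return {"video", "ui"} <= kinds

# Example: a playback scene with one video layer and one control-bar layer.
assert has_video_and_ui(["YCBCR_420", "RGBA8888"])
```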
It can be seen that, in this example, the type of the layer in the data stream is determined according to the color coding scheme, so as to determine whether video data and user interface data exist in the data stream at the same time.
In one possible example, the separating the first video data and the user interface data includes: distributing the first video data to a first display layer post-processing unit; determining other data in the first data stream except the first video data as a second data stream, wherein the second data stream comprises the user interface data; and distributing the second data stream to a second display layer post-processing unit.
After determining that video data and a user interface exist in the data stream at the same time, the video data and the user interface data need to be processed separately. The video data can be distributed to a first display layer post-processing unit and the user interface data to a second display layer post-processing unit, so that the two can subsequently be processed respectively; within the display layer post-processing units the data can be modified and processed, for example by enhancing the layer brightness value, and the processed data is then transmitted to the digital signal processor through the first and second display serial interfaces respectively. Of course, if the first data stream contains only video data and no user interface data, the first data stream may be allocated directly to the first display layer post-processing unit for processing, and the processed data transmitted to the digital signal processor through the first display serial interface.
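A sketch of the separation step itself: the first video data goes to the first post-processing unit, and everything else, including the user interface data, forms the second data stream. The list representation of the stream and the `is_first_video` predicate are illustrative assumptions.

```python
# Sketch of the separation step: the first video data goes to the first
# display layer post-processing unit; all remaining data forms the second
# data stream (containing the user interface data) for the second unit.
def separate(stream, is_first_video):
    first_video_data = [e for e in stream if is_first_video(e)]
    second_data_stream = [e for e in stream if not is_first_video(e)]
    return first_video_data, second_data_stream

# Example with tagged elements; the ("video"/"ui", payload) tuples are
# illustrative only.
video, rest = separate([("video", b"..."), ("ui", b"...")],
                       lambda e: e[0] == "video")
```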
Therefore, in this example, the video data and the user interface data are allocated to two separate display layer post-processing units, so that abnormal color blocks in the user interface layer can be avoided when the video data is subsequently subjected to frame interpolation.
In one possible example, the electronic device is in a single video application playback scenario, and the separately processing the first video data and the user interface data comprises: determining a frame rate of the first video data and a screen refresh rate of the electronic device; determining an interpolation scheme according to the frame rate and the screen refresh rate; and sending the first video data to a digital signal processor for frame interpolation according to the frame interpolation scheme.
The frame rate is the speed at which pictures change; as long as the graphics hardware supports it, the frame rate can be very high, and the higher the frame rate, the smoother the picture. In theory every frame is a different picture: for example, 60 frames per second (fps) means the graphics card generates 60 pictures per second. The screen refresh rate is the rate at which the display signal output is refreshed; for example, 60 hertz (Hz) means 60 signals are output to the display per second. If the frame rate is half the refresh rate, every two refreshes of the display show the same picture; conversely, if the frame rate is twice the refresh rate, only one of every two generated pictures is sent to and displayed on the display. Frame rates higher than the refresh rate are therefore wasted: they do not improve the picture and may cause picture abnormalities. Accordingly, the frame interpolation scheme determined from the frame rate and the screen refresh rate may require that the screen refresh rate be an integer multiple of the interpolated frame rate, and/or that the interpolated frame rate not exceed the screen refresh rate.
In a single video application scene, only one piece of video data needs frame interpolation, so different interpolation schemes can be determined according to the video quality, the user's requirements, and the hardware capability of the electronic device: for example, interpolating the whole video, interpolating every few frames, or interpolating only a certain time period of the video data.
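Under the constraints above, choosing a target interpolated frame rate can be sketched as follows. The policy of picking the highest rate within a hardware-capability factor is an illustrative assumption, not a scheme prescribed by the present application.

```python
# Sketch of choosing a target interpolated frame rate: pick the highest
# rate that divides the screen refresh rate evenly, does not exceed it,
# and stays within a hardware-capability interpolation factor. The
# "highest within max_factor" policy is an illustrative assumption.
def target_frame_rate(src_fps: int, refresh_hz: int, max_factor: int = 4) -> int:
    for rate in range(refresh_hz, src_fps, -1):
        if refresh_hz % rate == 0 and rate <= max_factor * src_fps:
            return rate
    return src_fps  # nothing better found: leave the video as-is

print(target_frame_rate(30, 120))                # 120 (30 fps on a 120 Hz panel)
print(target_frame_rate(30, 120, max_factor=2))  # 60
```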
Therefore, in this example, the frame interpolation scheme is determined according to the video frame rate and the screen refresh rate of the electronic device, so that the quality of the video after frame interpolation can be ensured, the viewing experience of a user can be improved, and the resources of the digital signal processor can be saved.
In one possible example, the electronic device is in a split-screen multi-video application playing scene, and the separately processing the first video data and the user interface data includes: determining second video data which needs to be subjected to frame interpolation processing in the first video data, wherein the second video data comprises video data of a first video to be played in a first screen division area and/or video data of a second video to be played in a second screen division area; and sending the second video data to a digital signal processor for frame interpolation.
In a split-screen multi-video application playing scene, the electronic device may be playing only one video, i.e. only one piece of video data is in the first data stream, or it may be playing videos in both the first split-screen area and the second split-screen area at the same time, i.e. two pieces of video data may exist in the first data stream simultaneously. When there are two pieces of video data in the first data stream, video data and user interface data may coexist in only one split-screen area, or, of course, in both split-screen areas. Therefore, when the electronic device is in a split-screen multi-video playing scene, after the video data and the user interface data are separated it must further be determined which one of the videos, or both, needs frame interpolation processing; the second video data that needs interpolation is then sent to the digital signal processor for frame interpolation.
Therefore, in this example, when the electronic device is in a split-screen multi-video application scene, the second video data that needs to be subjected to frame interpolation processing is determined first, and then the second video data is sent to the digital signal processor for frame interpolation, so that the video data processing efficiency can be improved, and the resources of the digital signal processor can be saved.
In one possible example, the second video data includes video data of the first video and the second video, and sending the second video data to a digital signal processor for frame interpolation processing includes: respectively determining the proportions of the moving image of the first video and the moving image of the second video; respectively determining frame interpolation rates of the first video and the second video according to the proportion of the motion images, wherein the frame interpolation rates are positively associated with the proportion of the motion images; and respectively sending the video data of the first video and the second video to a digital signal processor for frame interpolation according to the frame interpolation rate.
When the videos in both the first and second split-screen areas need frame interpolation processing, the proportion of moving images in the first video of the first split-screen area and in the second video of the second split-screen area may first be determined. If the moving-image proportion of the first video is higher than that of the second video, then in the current time period the first video is interpolated at a higher rate than the second video; the interpolation rate of a video increases with its moving-image proportion.
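The proportional allocation can be sketched as follows; it reproduces the worked example given later in the description, where 50% versus 10% motion yields shares of five sixths and one sixth. The per-window time "budget" abstraction and integer-percentage inputs are illustrative assumptions.

```python
# Sketch of allocating DSP interpolation time in proportion to each
# video's moving-image ratio (given here as integer percentages).
from fractions import Fraction

def allocate_budget(motion_ratios: dict) -> dict:
    """Map {video_id: motion percentage} to {video_id: share of DSP time}."""
    total = sum(motion_ratios.values())
    if total == 0:
        return {vid: Fraction(0) for vid in motion_ratios}
    return {vid: Fraction(r, total) for vid, r in motion_ratios.items()}

# Worked example from the description: 50% vs 10% motion within one minute.
print(allocate_budget({"first": 50, "second": 10}))
# {'first': Fraction(5, 6), 'second': Fraction(1, 6)}
```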
As can be seen, in this example, the frame interpolation rate is determined according to the ratio of the moving images of the first video and the second video, so that the time delay of video output can be reduced, and the frame interpolation efficiency can be improved.
In one possible example, the determining second video data, which needs to be subjected to frame interpolation processing, in the first video data includes: acquiring a frame rate of the first video data; and determining the first video data with the frame rate not reaching the preset frame rate as second video data needing frame interpolation according to the screen refresh rate.
When determining whether the videos in the first and second split-screen areas need frame interpolation, the decision can be made from the frame rates of the first and second videos and the screen refresh rate of the electronic device: if the frame rate of a piece of video data is lower than the screen refresh rate and/or the screen refresh rate is not an integer multiple of that frame rate, that video data can be determined to be second video data that needs frame interpolation.
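A sketch of this selection rule, under the reading (itself an assumption) that a video qualifies for interpolation when its frame rate is below the screen refresh rate or does not divide it evenly:

```python
# Sketch of selecting second video data: a video needs interpolation when
# its frame rate is below the screen refresh rate and/or the refresh rate
# is not an integer multiple of that frame rate. The dict representation
# of the playing videos is illustrative.
def needs_interpolation(fps: int, refresh_hz: int) -> bool:
    return fps > 0 and (fps < refresh_hz or refresh_hz % fps != 0)

def select_second_video_data(videos: dict, refresh_hz: int) -> list:
    """videos maps a video id to its frame rate; returns ids needing interpolation."""
    return [vid for vid, fps in videos.items()
            if needs_interpolation(fps, refresh_hz)]

print(select_second_video_data({"first": 30, "second": 120}, 120))  # ['first']
```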
As can be seen, in the present example, in a split-screen multi-video application playing scene, the second video data that needs to be subjected to frame interpolation processing is determined according to the frame rate of each video and the screen refresh rate of the electronic device, so that the quality of the video after frame interpolation can be ensured, the viewing experience of a user can be improved, and the resources of the digital signal processor can be saved.
The following is a detailed description of specific embodiments.
As shown in fig. 2b, fig. 2b is a schematic view of a single video application scene provided in this embodiment of the application, taking a mobile phone as the electronic device. When the phone plays only one video and a user interface exists in it at the same time, a frame interpolation scheme is selected according to the frame rate of the played video and the screen refresh rate of the phone. If the current video plays at 30 frames per second and the screen refresh rate is 120 Hz, the whole video may be interpolated to 120 or 60 frames per second. Alternatively, the interpolation scheme may be selected according to the specific video content: for example, if only the 15th to the 30th minute of the video contains substantial motion, interpolation may be performed only for those 15 minutes rather than for the whole video.
As shown in fig. 2c, fig. 2c is a schematic view of a split-screen multi-video application scene provided in the embodiment of the present application, with a mobile phone as the electronic device. When the phone is in split-screen mode and simultaneously plays two videos that both need frame interpolation, the interpolation rate may be determined from the moving-image proportions of the two videos currently playing, in order to reduce the output delay after interpolation. For example, if within the current minute 50% of the first video in the first split-screen area consists of moving images while only 10% of the second video in the second split-screen area does, the first video's share of the motion is five sixths (50/(50+10)) and the second video's share is one sixth; most of the interpolation time in that minute is therefore spent on the first video and a small portion on the second.
Referring to fig. 3, fig. 3 is a schematic flowchart illustrating another video frame interpolation method according to an embodiment of the present disclosure, where the video frame interpolation method includes the following steps:
S301, acquiring a first data stream;
S302, acquiring color coding schemes of all layers contained in the first data stream;
S303, determining the layer type of each layer according to its color coding scheme, wherein the layer types include a video layer and a user interface layer;
S304, determining whether first video data and user interface data simultaneously exist in the first data stream according to the layer types contained in the first data stream;
S305, if yes, separating the first video data and the user interface data;
S306, processing the first video data and the user interface data respectively to complete the frame interpolation processing of the first data stream.
It can be seen that, in the embodiment of the present application, the electronic device first acquires a first data stream; then acquires the color coding schemes of all layers contained in the first data stream and determines the layer type of each layer according to its color coding scheme; then determines whether first video data and user interface data are simultaneously present in the first data stream according to the layer types it contains; if so, separates the first video data and the user interface data; and finally processes the first video data and the user interface data respectively to complete the frame interpolation processing of the first data stream. The video frame interpolation method provided by the embodiment of the present application can therefore solve the problem of abnormal color blocks appearing in an application's user interface layer when frame interpolation is performed on a scene where video and a user interface coexist.
Consistent with the embodiments shown in fig. 2a and fig. 3, please refer to fig. 4, and fig. 4 is a schematic structural diagram of an electronic device 400 provided in an embodiment of the present application, and as shown in the figure, the electronic device 400 includes an application processor 410, a memory 420, a communication interface 430, and one or more programs 421, where the one or more programs 421 are stored in the memory 420 and configured to be executed by the application processor 410, and the one or more programs 421 include instructions for executing any step in the foregoing method embodiments.
In one possible example, the program 421 includes instructions for performing the following steps: acquiring a first data stream; determining whether first video data and user interface data are present in the first data stream at the same time; if so, separating the first video data and the user interface data; and respectively processing the first video data and the user interface data to complete the frame insertion processing of the first data stream.
In one possible example, as regards determining whether the first video data and the user interface data are present in the first data stream at the same time, the instructions in the program 421 are specifically configured to: acquire the color coding schemes of all layers contained in the first data stream; determine the layer type of each layer according to its color coding scheme, the layer types including a video layer and a user interface layer; and determine whether first video data and user interface data are simultaneously present in the first data stream according to the layer types it contains.
In one possible example, as regards separating the first video data and the user interface data, the instructions in the program 421 are specifically configured to: distribute the first video data to a first display layer post-processing unit; determine the data in the first data stream other than the first video data to be a second data stream, the second data stream including the user interface data; and distribute the second data stream to a second display layer post-processing unit.
In one possible example, where the electronic device is in a single video application playback scenario, as regards separately processing the first video data and the user interface data, the instructions in the program 421 are specifically configured to: determine the frame rate of the first video data and the screen refresh rate of the electronic device; determine an interpolation scheme according to the frame rate and the screen refresh rate; and send the first video data to a digital signal processor for frame interpolation according to the interpolation scheme.
In one possible example, where the electronic device is in a split-screen multi-video application playing scenario, as regards separately processing the first video data and the user interface data, the instructions in the program 421 are specifically configured to: determine the second video data in the first video data that needs frame interpolation processing, the second video data including video data of a first video to be played in a first split-screen area and/or video data of a second video to be played in a second split-screen area; and send the second video data to a digital signal processor for frame interpolation.
In one possible example, where the second video data includes video data of the first video and the second video, as regards sending the second video data to a digital signal processor for frame interpolation, the instructions in the program 421 are specifically configured to: determine the moving-image proportions of the first video and the second video respectively; determine the frame interpolation rates of the first video and the second video according to their moving-image proportions, the frame interpolation rate being positively associated with the moving-image proportion; and send the video data of the first video and the second video to the digital signal processor for frame interpolation according to their respective frame interpolation rates.
In one possible example, as regards determining the second video data in the first video data that needs frame interpolation processing, the instructions in the program 421 are specifically configured to: acquire the frame rate of the first video data; and determine, according to the screen refresh rate, the first video data whose frame rate does not reach the preset frame rate to be second video data needing frame interpolation.
The above description has introduced the solution of the embodiments of the present application mainly from the perspective of the method-side implementation. It is understood that, to realize the above functions, the electronic device includes corresponding hardware structures and/or software modules for performing each function. Those skilled in the art will readily appreciate that the various illustrative units and algorithm steps described in connection with the embodiments provided herein can be implemented as hardware or as a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and the design constraints of the solution. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiment of the present application, the electronic device may be divided into the functional units according to the method example, for example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated into one processing unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit. It should be noted that the division of the unit in the embodiment of the present application is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
Fig. 5 is a block diagram illustrating functional units of a video frame interpolation apparatus 500 according to an embodiment of the present disclosure. The video frame interpolation apparatus 500 is applied to an electronic device, and includes a processing unit 501 and a communication unit 502, wherein,
the processing unit 501 is configured to acquire a first data stream through the communication unit 502; determine whether first video data and user interface data are simultaneously present in the first data stream; if so, separate the first video data and the user interface data; and process the first video data and the user interface data respectively to complete the frame interpolation processing of the first data stream.
In one possible example, as regards determining whether the first video data and the user interface data are simultaneously present in the first data stream, the processing unit 501 is specifically configured to: obtain the color coding schemes of all layers included in the first data stream; determine the layer type of each layer according to its color coding scheme, the layer types including a video layer and a user interface layer; and determine whether first video data and user interface data are simultaneously present in the first data stream according to the layer types it contains.
In one possible example, as regards separating the first video data and the user interface data, the processing unit 501 is specifically configured to: allocate the first video data to a first display layer post-processing unit; determine the data in the first data stream other than the first video data to be a second data stream, the second data stream including the user interface data; and allocate the second data stream to a second display layer post-processing unit.
In one possible example, where the electronic device is in a single video application playing scene, as regards separately processing the first video data and the user interface data, the processing unit 501 is specifically configured to: determine the frame rate of the first video data and the screen refresh rate of the electronic device; determine an interpolation scheme according to the frame rate and the screen refresh rate; and send the first video data to a digital signal processor for frame interpolation according to the interpolation scheme.
In one possible example, where the electronic device is in a split-screen multi-video application playing scene, as regards separately processing the first video data and the user interface data, the processing unit 501 is specifically configured to: determine the second video data in the first video data that needs frame interpolation processing, the second video data including video data of a first video to be played in a first split-screen area and/or video data of a second video to be played in a second split-screen area; and send the second video data to a digital signal processor for frame interpolation.
In one possible example, where the second video data includes video data of the first video and the second video, as regards sending the second video data to a digital signal processor for frame interpolation, the processing unit 501 is specifically configured to: determine the moving-image proportions of the first video and the second video respectively; determine the frame interpolation rates of the first video and the second video according to their moving-image proportions, the frame interpolation rate being positively associated with the moving-image proportion; and send the video data of the first video and the second video to the digital signal processor for frame interpolation according to their respective frame interpolation rates.
In one possible example, as regards determining the second video data in the first video data that needs frame interpolation processing, the processing unit 501 is specifically configured to: obtain the frame rate of the first video data; and determine, according to the screen refresh rate, the first video data whose frame rate does not reach the preset frame rate to be second video data needing frame interpolation.
The video frame interpolation apparatus 500 may further include a storage unit 503 for storing program codes and data of the electronic device. The processing unit 501 may be a processor, the communication unit 502 may be a touch display screen or a transceiver, and the storage unit 503 may be a memory.
It can be understood that, since the method embodiments and the apparatus embodiments are different presentations of the same technical concept, the content of the method embodiment portion of the present application applies correspondingly to the apparatus embodiment portion and is not repeated here.
Embodiments of the present application further provide a chip, where the chip includes a processor, configured to call and run a computer program from a memory, so that a device in which the chip is installed performs some or all of the steps described in the electronic device in the above method embodiments.
Embodiments of the present application also provide a computer storage medium, where the computer storage medium stores a computer program for electronic data exchange, the computer program enabling a computer to execute part or all of the steps of any one of the methods described in the above method embodiments, and the computer includes an electronic device.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps of any of the methods as described in the above method embodiments. The computer program product may be a software installation package, the computer comprising an electronic device.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described embodiments of the apparatus are merely illustrative, and for example, the above-described division of the units is only one type of division of logical functions, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of some interfaces, devices or units, and may be an electric or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable memory. Based on such an understanding, the technical solution of the present application, in essence, or the part that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a memory, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned memory includes various media capable of storing program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, or a magnetic or optical disk.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable memory, which may include: flash Memory disks, Read-Only memories (ROMs), Random Access Memories (RAMs), magnetic or optical disks, and the like.
The foregoing embodiments have been described in detail to illustrate the principles and implementations of the present application; the above description of the embodiments is provided only to help understand the method and core concept of the present application. Meanwhile, a person skilled in the art may, following the idea of the present application, vary the specific embodiments and the application scope. In summary, the content of this specification should not be construed as limiting the present application.

Claims (10)

1. A video frame interpolation method is applied to an electronic device, and comprises the following steps:
acquiring a first data stream;
determining whether first video data and user interface data are present in the first data stream at the same time;
if so, separating the first video data and the user interface data;
and respectively processing the first video data and the user interface data to complete the frame insertion processing of the first data stream.
2. The method of claim 1, wherein determining whether first video data and user interface data are present in the first data stream at the same time comprises:
acquiring color coding schemes of all layers contained in the first data stream;
determining the layer type of the layer according to the color coding scheme, wherein the layer type comprises a video layer and a user interface layer;
and determining whether first video data and user interface data simultaneously exist in the first data stream according to the layer type contained in the first data stream.
3. The method of claim 2, wherein the separating the first video data and the user interface data comprises:
distributing the first video data to a first display layer post-processing unit;
determining other data in the first data stream except the first video data as a second data stream, wherein the second data stream comprises the user interface data;
and distributing the second data stream to a second display layer post-processing unit.
4. The method of any of claims 1-3, wherein the electronic device is in a single video application playback scenario, and wherein the separately processing the first video data and the user interface data comprises:
determining a frame rate of the first video data and a screen refresh rate of the electronic device;
determining an interpolation scheme according to the frame rate and the screen refresh rate;
and sending the first video data to a digital signal processor for frame interpolation according to the frame interpolation scheme.
5. The method of any of claims 1-3, wherein the electronic device is in a split-screen multi-video application playback scenario, and wherein the separately processing the first video data and the user interface data comprises:
determining second video data which needs to be subjected to frame interpolation processing in the first video data, wherein the second video data comprises video data of a first video to be played in a first screen division area and/or video data of a second video to be played in a second screen division area;
and sending the second video data to a digital signal processor for frame interpolation.
6. The method of claim 5, wherein the second video data comprises video data of both the first video and the second video, and wherein sending the second video data to a digital signal processor for frame interpolation comprises:
determining the moving-image proportion of the first video and of the second video, respectively;
determining frame interpolation rates of the first video and the second video according to their respective moving-image proportions, wherein the frame interpolation rates are positively correlated with the moving-image proportions;
and sending the video data of the first video and of the second video to the digital signal processor for frame interpolation at the respective frame interpolation rates.
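
A sketch of claim 6 in Python. Both the motion metric (the fraction of luma samples that change noticeably between consecutive frames) and the linear rate allocation are assumptions; the claim only requires the frame interpolation rate to grow with the moving-image proportion:

def motion_proportion(prev_luma, curr_luma, threshold: int = 16) -> float:
    # Fraction of 8-bit luma samples that changed by more than `threshold`.
    changed = sum(abs(a - b) > threshold for a, b in zip(prev_luma, curr_luma))
    return changed / max(1, len(prev_luma))

def interpolation_rate(base_rate: float, motion: float,
                       max_extra: float = 30.0) -> float:
    # Monotonically increasing in the moving-image proportion (0..1).
    return base_rate + max_extra * motion

# Two split-screen videos: the one with more motion gets the higher rate.
rate_first = interpolation_rate(24.0, motion=0.8)   # -> 48.0 fps
rate_second = interpolation_rate(24.0, motion=0.2)  # -> 30.0 fps
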
7. The method of claim 5, wherein determining, in the first video data, the second video data that needs frame interpolation processing comprises:
acquiring the frame rate of the first video data;
and determining, according to the screen refresh rate, the first video data whose frame rate does not reach a preset frame rate as the second video data that needs frame interpolation.
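
A sketch of the selection in claim 7, under the assumption that the preset frame rate is derived from the screen refresh rate (here, half of it); that derivation and all identifiers are illustrative:

def needs_interpolation(frame_rate: float, refresh_rate: float,
                        preset_ratio: float = 0.5) -> bool:
    preset_rate = refresh_rate * preset_ratio   # e.g. 30 fps on a 60 Hz panel
    return frame_rate < preset_rate

videos = {"first": 24.0, "second": 60.0}        # per-video source frame rates
to_interpolate = [name for name, fps in videos.items()
                  if needs_interpolation(fps, refresh_rate=60.0)]
assert to_interpolate == ["first"]              # only the 24 fps video qualifies
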
8. A video frame interpolation apparatus applied to an electronic device, the apparatus comprising a processing unit and a communication unit, wherein
the processing unit is configured to: acquire a first data stream through the communication unit; determine whether first video data and user interface data are simultaneously present in the first data stream; if so, separate the first video data from the user interface data; and process the first video data and the user interface data separately to complete frame interpolation of the first data stream.
9. An electronic device comprising a processor, a memory, and one or more programs stored in the memory and configured to be executed by the processor, the one or more programs comprising instructions for performing the steps of the method of any one of claims 1-7.
10. A computer-readable storage medium storing a computer program for electronic data exchange, wherein the computer program causes a computer to perform the method according to any one of claims 1-7.
CN202010148399.3A 2020-03-05 2020-03-05 Video frame insertion method and related device Pending CN111327959A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010148399.3A CN111327959A (en) 2020-03-05 2020-03-05 Video frame insertion method and related device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010148399.3A CN111327959A (en) 2020-03-05 2020-03-05 Video frame insertion method and related device
PCT/CN2021/073974 WO2021175049A1 (en) 2020-03-05 2021-01-27 Video frame interpolation method and related apparatus

Publications (1)

Publication Number Publication Date
CN111327959A (en) 2020-06-23

Family

ID=71171554

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010148399.3A Pending CN111327959A (en) 2020-03-05 2020-03-05 Video frame insertion method and related device

Country Status (2)

Country Link
CN (1) CN111327959A (en)
WO (1) WO2021175049A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021175049A1 (en) * 2020-03-05 2021-09-10 Oppo广东移动通信有限公司 Video frame interpolation method and related apparatus

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010010778A (en) * 2008-06-24 2010-01-14 Canon Inc Video processing apparatus and method for controlling the video processing apparatus
US20110271299A1 (en) * 2010-04-29 2011-11-03 Srikanth Kakani Method and apparatus for insertion of advertising in a live video stream
CN102685437A (en) * 2012-02-03 2012-09-19 深圳市创维群欣安防科技有限公司 Method and monitor for compensating video image
CN203313319U (en) * 2013-06-09 2013-11-27 深圳创维-Rgb电子有限公司 Display system
CN108810281A (en) * 2018-06-22 2018-11-13 Oppo广东移动通信有限公司 Frame losing compensation method, device, storage medium and terminal
CN110086905A (en) * 2018-03-26 2019-08-02 华为技术有限公司 A kind of kinescope method and electronic equipment

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009135847A (en) * 2007-12-03 2009-06-18 Hitachi Ltd Video processor and frame rate conversion method
CN102833518B (en) * 2011-06-13 2015-07-08 华为终端有限公司 Method and device for optimally configuring MCU (multipoint control unit) multipicture
US9703446B2 (en) * 2014-02-28 2017-07-11 Prezi, Inc. Zooming user interface frames embedded image frame sequence
CN106933328B (en) * 2017-03-10 2020-04-17 Oppo广东移动通信有限公司 Method and device for controlling frame rate of mobile terminal and mobile terminal
CN109275011B (en) * 2018-09-03 2020-12-04 青岛海信传媒网络技术有限公司 Processing method and device for switching motion modes of smart television and user equipment
CN109803175B (en) * 2019-03-12 2021-03-26 京东方科技集团股份有限公司 Video processing method and device, video processing equipment and storage medium
CN111327959A (en) * 2020-03-05 2020-06-23 Oppo广东移动通信有限公司 Video frame insertion method and related device

Also Published As

Publication number Publication date
WO2021175049A1 (en) 2021-09-10

Similar Documents

Publication Publication Date Title
US20190297362A1 (en) Downstream video composition
CN103841389B (en) A kind of video broadcasting method and player
US20150381961A1 (en) Creating three dimensional graphics data
CN101669361B (en) Methods and systems for improving low resolution and low frame rate video
KR101492535B1 (en) Method, apparatus and system for generating and facilitating mobile high-definition multimedia interface
JP6921815B2 (en) How and Devices to Adapt Video Content Decrypted from Elementary Streams to Display Characteristics
EP2717254A1 (en) Content processing apparatus for processing high resolution content and content processing method thereof
US10102878B2 (en) Method, apparatus and system for displaying images
CN105892976B (en) Realize the method and device of multi-screen interactive
JP2012516069A (en) Method and system for transmitting and combining 3D video and 3D overlay over a video interface
EP3471405A1 (en) Video signal transmission method and apparatus
KR102043962B1 (en) Low latency screen mirroring
US20120120312A1 (en) Image synthesizing device, coding device, program, and recording medium
WO2021175049A1 (en) Video frame interpolation method and related apparatus
CN104954812A (en) Video synchronized playing method, device and system
CN101998083A (en) Video processing device
CN105979370B (en) A kind of method and device configuring image model
CN109640167B (en) Video processing method and device, electronic equipment and storage medium
CN108235055B (en) Method and device for realizing transparent video in AR scene
US9094712B2 (en) Video processing device, display device and video processing method
US9094664B2 (en) Image processing device, image processing method, and program
US6766104B2 (en) Apparatus and method for playing back video
KR20200110024A (en) A display apparatus and a method for operating the display apparatus
US20140056524A1 (en) Image processing device, image processing method, and program
CN113450293A (en) Video information processing method, device and system, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination