WO2021175049A1 - Video frame interpolation method and related apparatus (视频插帧方法及相关装置)

Video frame interpolation method and related apparatus

Info

Publication number
WO2021175049A1
WO2021175049A1 · PCT/CN2021/073974 · CN2021073974W
Authority
WO
WIPO (PCT)
Prior art keywords
video
data
video data
user interface
frame
Prior art date
Application number
PCT/CN2021/073974
Other languages
English (en)
French (fr)
Inventor
胡杰
林文真
Original Assignee
Oppo广东移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司
Publication of WO2021175049A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N 21/47217 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N 21/4402 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • H04N 21/440281 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the temporal resolution, e.g. by frame skipping
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/01 Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H04N 7/0127 Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level by changing the field or frame frequency of the incoming video signal, e.g. frame rate converter

Definitions

  • This application relates to the field of communication technology, and in particular to a video frame insertion method and related devices.
  • Motion compensation is a method of describing the difference between adjacent frames.
  • Here, "adjacent" means adjacent in the coding relationship; two such frames are not necessarily adjacent in the playback order. Specifically, motion compensation describes how each small block of the previous frame moves to a certain position in the current frame.
  • Motion compensation can also be used for de-interlacing and motion interpolation operations.
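The block-movement description above can be sketched as an exhaustive block-matching search. This is an illustrative example only, not the patent's implementation; the frame contents, block size, and search range are assumed values.

```python
# Minimal block-based motion estimation, the core of motion compensation:
# for each block of the previous frame, find where it moved in the
# current frame (exhaustive search, sum of absolute differences).

def sad(prev, cur, py, px, cy, cx, bs):
    """Sum of absolute differences between a bs x bs block of prev at
    (py, px) and a block of cur at (cy, cx)."""
    return sum(abs(prev[py + i][px + j] - cur[cy + i][cx + j])
               for i in range(bs) for j in range(bs))

def motion_vector(prev, cur, by, bx, bs=2, search=2):
    """Best (dy, dx) displacement of the block at (by, bx), within +/-search."""
    h, w = len(cur), len(cur[0])
    best, best_mv = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cy, cx = by + dy, bx + dx
            if 0 <= cy <= h - bs and 0 <= cx <= w - bs:
                cost = sad(prev, cur, by, bx, cy, cx, bs)
                if best is None or cost < best:
                    best, best_mv = cost, (dy, dx)
    return best_mv

# A 2x2 bright block at (0, 0) in the previous frame moves to (1, 2):
prev = [[9, 9, 0, 0],
        [9, 9, 0, 0],
        [0, 0, 0, 0],
        [0, 0, 0, 0]]
cur  = [[0, 0, 0, 0],
        [0, 0, 9, 9],
        [0, 0, 9, 9],
        [0, 0, 0, 0]]
print(motion_vector(prev, cur, 0, 0))  # -> (1, 2)
```

The search recovers the motion vector (1, 2): the block moved down one row and right two columns.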
  • the embodiments of the present application provide a video frame interpolation method and related devices, which can independently perform frame interpolation on video data in a video scene.
  • an embodiment of the present application provides a video frame insertion method, which is applied to an electronic device, and the method includes:
  • the first video data and the user interface data are processed separately to complete frame insertion processing on the first data stream.
  • an embodiment of the present application provides a video frame insertion device, which is applied to an electronic device, and the device includes a processing unit and a communication unit, wherein:
  • the processing unit is used to obtain the first data stream through the communication unit; to determine whether the first video data and the user interface data exist at the same time in the first data stream; if so, to separate the first video data and the user interface data; and to process the first video data and the user interface data separately to complete the frame insertion processing of the first data stream.
  • an embodiment of the present application provides an electronic device, including a processor, a memory, a communication interface, and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the processor;
  • the program includes instructions for executing the steps in any method in the first aspect of the embodiments of the present application.
  • an embodiment of the present application provides a computer-readable storage medium, wherein the computer-readable storage medium stores a computer program for electronic data exchange, and the computer program enables a computer to execute part or all of the steps described in any method of the first aspect.
  • the embodiments of the present application provide a computer program product, wherein the computer program product includes a non-transitory computer-readable storage medium storing a computer program, and the computer program is operable to cause a computer to execute part or all of the steps described in any method of the embodiments of this application.
  • the computer program product may be a software installation package.
  • the electronic device first obtains the first data stream; then determines whether the first video data and the user interface data exist in the first data stream at the same time; if so, separates the first video data and the user interface data; finally, processes the first video data and the user interface data separately to complete the frame insertion processing of the first data stream.
  • the video frame interpolation method and related device provided by the embodiments of the present application can solve the problem of abnormal color blocks in the application's user interface layer when frame interpolation is performed in a scene where video and a user interface coexist.
  • Figure 1a is a hardware block diagram of a video frame insertion method provided by an embodiment of the present application.
  • Figure 1b is a software block diagram of a video frame interpolation method provided by an embodiment of the present application.
  • Figure 2a is a schematic flowchart of a video frame insertion method provided by an embodiment of the present application.
  • Figure 2b is a schematic diagram of a single video application scenario provided by an embodiment of the present application.
  • Figure 2c is a schematic diagram of a split-screen multi-video application scenario provided by an embodiment of the present application.
  • FIG. 3 is a schematic flowchart of another video frame insertion method provided by an embodiment of the present application.
  • FIG. 4 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • Fig. 5 is a block diagram of functional units of a video frame interpolation device provided by an embodiment of the present application.
  • the existing video frame interpolation algorithms on the market perform frame interpolation on the entire display output.
  • when both the user interface (UI) and video are present in the display output, the UI is affected by the frame interpolation algorithm, and abnormal color blocks appear.
  • an embodiment of the present application provides a video frame interpolation method and related devices. The following describes this embodiment in detail with reference to the accompanying drawings.
  • Fig. 1a is a hardware block diagram of a video frame insertion method provided by an embodiment of the present application.
  • the hardware block diagram 100 shown includes a data pipeline, display layer post-processing units, display serial interfaces, a digital signal processor, and a liquid crystal display, etc., which are sequentially connected.
  • the data pipeline may be a data pipeline dedicated to transmitting video YUV data.
  • the data pipelines are used to transmit the video data and the UI data respectively; the first display layer post-processing unit is used to receive and process the video data, and the second display layer post-processing unit is used to receive and process the UI data. The video data and UI data are respectively transmitted to the digital signal processor through the first display serial interface and the second display serial interface, where the digital signal processor performs frame insertion, decoding, and other processing; finally, the processed data is transmitted to the LCD for display and output.
  • Figure 1b is a software block diagram of a video frame insertion method provided by an embodiment of the present application.
  • the display composition service can identify layer or data type characteristics, such as the color coding schemes of different layers, and separate the video layer from the user interface layer; the middleware code then distributes the video data to the first display unit and the user interface data to the second display unit.
  • the display unit specifically includes a layer mixer, a display layer post-processing unit, a display compressor, a display communication interface, etc., and finally transmits the processed data to the display driver for display.
  • if layered processing cannot be performed through the display composition service, the first data stream can be uniformly allocated to the first display unit, and the processed data delivered to the display driver for display.
  • FIG. 2a is a schematic flowchart of a video frame interpolation method provided by an embodiment of the present application. As shown in the figure, the video frame interpolation method is described from the perspective of an electronic device, and specifically includes the following steps.
  • the first data stream shown mainly includes the data that ultimately needs to be displayed on the liquid crystal display.
  • the first data stream may include only a single type of data, or may include multiple types of mixed data.
  • the first data stream can include data transmitted through a data pipeline provided for video YUV data, or data transmitted through a data pipeline provided for ordinary RGB data.
  • S202 Determine whether first video data and user interface data exist in the first data stream at the same time.
  • the user interface may be an interface for operating functions such as video recording and screen locking, or for operating on video playback, for example pausing the video, playing the next video, or adjusting the video definition.
  • the first video data and the user interface data need to be separated, so that the subsequent frame interpolation processing can be performed on the video data separately.
  • S204 Process the first video data and the user interface data separately to complete frame insertion processing on the first data stream.
  • the digital signal processor can be used to perform frame interpolation processing on the first video data and decoding processing on the user interface data, respectively.
  • based on the frame interpolation algorithm, a frame of image is added between adjacent frames so that the picture is smoother during video playback.
  • the first video data after the processing can be sent to the LCD for display and output.
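As a minimal sketch of the frame insertion step described above, the example below inserts one frame between each pair of adjacent frames by averaging pixel values. A real digital signal processor would typically use motion-compensated interpolation rather than simple blending; the small integer grids here are assumed illustrative frames.

```python
# Insert one blended frame between every pair of adjacent frames,
# roughly doubling the frame rate (simple linear blending).

def interpolate_frames(frames):
    """Return the frame sequence with a midpoint frame inserted
    between each adjacent pair."""
    out = [frames[0]]
    for prev, cur in zip(frames, frames[1:]):
        mid = [[(a + b) // 2 for a, b in zip(ra, rb)]
               for ra, rb in zip(prev, cur)]
        out.append(mid)  # the inserted intermediate frame
        out.append(cur)
    return out

f0 = [[0, 0], [0, 0]]
f1 = [[8, 8], [8, 8]]
result = interpolate_frames([f0, f1])
print(len(result))  # -> 3
print(result[1])    # -> [[4, 4], [4, 4]]
```

Two input frames become three output frames, with the inserted frame halfway between its neighbors.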
  • the electronic device first obtains the first data stream; then determines whether the first video data and the user interface data exist in the first data stream at the same time; if so, separates the first video data and the user interface data; finally, processes the first video data and the user interface data separately to complete the frame insertion processing of the first data stream.
  • the video frame interpolation method provided by the embodiments of the present application can solve the problem of abnormal color blocks in the application's user interface layer when frame interpolation is performed in a scene where video and a user interface coexist.
  • the determining whether the first video data and the user interface data exist at the same time in the first data stream includes: obtaining the color coding schemes of all the layers included in the first data stream;
  • the color coding scheme determines the layer type of the layer, and the layer type includes a video layer and a user interface layer;
  • whether the first video data and user interface data exist at the same time in the first data stream is determined according to the layer types included in the first data stream.
  • the color coding scheme is a kind of visual information coding with color as the code, such as YCBCR coding scheme, YUV coding scheme and RGBA coding scheme, etc.
  • different layer types adopt different color coding schemes; specifically, analysis shows that a layer whose color coding scheme is YCBCR_420 is a video layer, and a layer whose color coding scheme is RGBA8888 is a user interface layer.
  • YCBCR is a color coding scheme commonly used in consumer video products such as cameras and digital TVs.
  • Y refers to the brightness component
  • CB refers to the blue chrominance component
  • CR refers to the red chrominance component
  • YCBCR_420 means that every four pixels share four luminance components and two chrominance components, namely YYYYCbCr.
  • R represents the red chroma component
  • G represents the green chroma component
  • B represents the blue chroma component
  • A represents the transparency (Alpha)
  • the alpha channel is generally used as the opacity parameter: if a pixel's alpha value is 0%, the pixel is completely transparent, that is, invisible, and a value of 100% means a completely opaque pixel.
  • RGBA color coding is common when recording and displaying color images.
  • RGBA8888 means that each of the four parameters R, G, B, and A is represented by 8 bits, so a pixel occupies a total of 32 bits. If both coding schemes are detected in the first data stream according to these color coding scheme characteristics, the video layer and the user interface layer exist in the first data stream at the same time.
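The two schemes described above imply very different storage costs, which a short calculation makes concrete (assuming 8 bits per sample, as stated):

```python
# YCbCr 4:2:0 carries 4 luma + 2 chroma samples per 2x2 pixel block
# (1.5 bytes/pixel), while RGBA8888 carries 4 bytes/pixel.

def yuv420_bytes(width, height):
    # Y plane: one sample per pixel; Cb and Cr: one sample per 2x2 block.
    return width * height + 2 * (width // 2) * (height // 2)

def rgba8888_bytes(width, height):
    return width * height * 4  # 8 bits each for R, G, B, A

w, h = 1920, 1080
print(yuv420_bytes(w, h))    # -> 3110400  (1.5 bytes/pixel)
print(rgba8888_bytes(w, h))  # -> 8294400  (4 bytes/pixel)
```

This size difference is one practical reason video data and UI data travel through different pipelines and formats.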
  • the layer types in the data stream are determined according to the color coding schemes, so as to determine whether video data and user interface data exist at the same time in the data stream. This method can determine the data type quickly and accurately, which facilitates subsequent processing of the video data.
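The classification step above can be sketched as a simple mapping from color coding scheme to layer type. The scheme-name strings are illustrative constants, not an actual platform API:

```python
# Classify each layer by its color coding scheme, then check whether
# video and user-interface layers coexist in the data stream.

def layer_type(color_scheme):
    if color_scheme == "YCBCR_420":
        return "video"      # video layers use YCbCr 4:2:0
    if color_scheme == "RGBA8888":
        return "ui"         # UI layers use 32-bit RGBA
    return "other"

def video_and_ui_coexist(layer_schemes):
    types = {layer_type(s) for s in layer_schemes}
    return "video" in types and "ui" in types

stream = ["YCBCR_420", "RGBA8888"]          # one video layer, one UI layer
print(video_and_ui_coexist(stream))         # -> True
print(video_and_ui_coexist(["YCBCR_420"]))  # -> False
```

Only when both layer types coexist does the method separate the stream; otherwise the whole stream can go to a single post-processing unit.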
  • the separating the first video data and the user interface data includes: allocating the first video data to a first display layer post-processing unit; determining that the first data stream is Data other than the first video data is a second data stream, and the second data stream includes the user interface data; the second data stream is allocated to a second display layer post-processing unit.
  • the video data and the user interface data in the data stream need to be processed separately: the video data can be allocated to the first display layer post-processing unit, and the user interface data to the second display layer post-processing unit, for subsequent processing of the video data and the user interface data.
  • These data can be modified and processed in the display layer post-processing units, for example by enhancing the brightness value of a layer, and the processed data are respectively transmitted to the digital signal processor through the first display serial interface and the second display serial interface.
  • if the first data stream only includes video data and does not include user interface data, the first data stream can be directly allocated to the first display layer post-processing unit for processing, and the processed data transmitted to the digital signal processor through the first display serial interface. If the first data stream includes both video data and user interface data, but the video data and user interface data share a single layer, the first data stream can likewise be distributed to the first display layer post-processing unit for processing.
  • the video data and user interface data are respectively allocated to two display layer post-processing units for data processing, which can avoid the phenomenon of abnormal color blocks in the user interface layer when the video data is subsequently inserted into the frame.
  • when the electronic device is in a single-video application playing scene, performing frame interpolation processing on the first video data by a digital signal processor includes: determining the frame rate of the first video data and the screen refresh rate of the electronic device; determining the frame insertion scheme according to the frame rate and the screen refresh rate; and sending the first video data to the digital signal processor for frame insertion processing according to the frame insertion scheme.
  • the frame rate is the speed at which the picture changes. As long as the graphics hardware can support it, the frame rate can be high, and a high frame rate makes the picture smooth. In theory, each frame is a different picture; for example, 60 frames per second (fps) means that the graphics card generates 60 pictures per second.
  • the screen refresh rate is the speed at which the graphics card refreshes the display signal output. For example, 60 Hz (hertz) means that the graphics card outputs a signal to the display 60 times per second. If the frame rate is 1/2 of the refresh rate, the graphics card outputs a new picture to the display once every two refreshes.
  • the frame insertion scheme can require that the screen refresh rate be an integer multiple of the frame rate of the video after frame insertion, and/or that the frame rate of the video after frame insertion be no higher than the screen refresh rate.
  • determining the frame insertion scheme according to the video frame rate and the screen refresh rate of the electronic device can ensure the quality of the video after frame insertion, improve the user's viewing experience, and also save digital signal processor resources.
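A sketch of the scheme selection above, under the assumption (inferred from the 30 fps / 120 Hz example later in the description) that the post-insertion frame rate should be an integer multiple of the source frame rate, should divide the screen refresh rate evenly, and should not exceed it:

```python
# Enumerate feasible post-insertion frame rates and pick one.

def candidate_rates(frame_rate, refresh_rate):
    """Post-insertion rates that are multiples of the source rate,
    do not exceed the refresh rate, and divide it evenly."""
    return [frame_rate * k
            for k in range(1, refresh_rate // frame_rate + 1)
            if refresh_rate % (frame_rate * k) == 0]

def pick_insertion_rate(frame_rate, refresh_rate):
    """Highest feasible rate, e.g. for the smoothest playback."""
    return max(candidate_rates(frame_rate, refresh_rate))

print(candidate_rates(30, 120))      # -> [30, 60, 120]
print(pick_insertion_rate(30, 120))  # -> 120
```

For a 30 fps video on a 120 Hz screen this yields the 60 fps and 120 fps options mentioned in the single-video example; a lower candidate could be chosen instead to save processor resources.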
  • the electronic device is in a split-screen multi-video application playing scene, and performing frame insertion processing on the first video data by a digital signal processor includes: determining that the first video data requires The second video data for frame insertion processing, where the second video data includes the video data of the first video to be played in the first split-screen area, and/or the second video to be played in the second split-screen area The video data; the second video data is sent to the digital signal processor for frame insertion processing.
  • the electronic device may play only one video, that is, there is only one set of video data in the first data stream; or the electronic device may play videos in the first split-screen area and the second split-screen area at the same time, that is, there are two sets of video data in the first data stream at the same time.
  • in the latter case, both video data and user interface data may exist in both split-screen areas.
  • when the electronic device is in a split-screen multi-video playback scene, after separating the video data and the user interface data, it is also necessary to determine which one, or whether both, of the video data need frame interpolation; the second video data is then sent to the digital signal processor for frame interpolation.
  • when the second video data includes video data of both the first video and the second video, sending the second video data to a digital signal processor for frame insertion processing includes: respectively determining the moving-image ratios of the first video and the second video; respectively determining the frame insertion rates of the first video and the second video according to the moving-image ratios, where the frame insertion rate is positively related to the moving-image ratio; and respectively sending the video data of the first video and the second video to the digital signal processor for frame insertion processing according to the frame insertion rates.
  • if the videos in both the first split-screen area and the second split-screen area need to be interpolated, the current moving-image ratios of the first video and the second video can first be determined. If the moving-image ratio of the first video is higher than that of the second video, then when interpolating the two videos in the current time period, the frame insertion rate of the first video is higher than that of the second video; a video's frame insertion rate increases as its moving-image ratio increases.
  • the frame insertion rate is determined according to the ratio of the moving images of the first video and the second video, which can reduce the delay of video output and improve the frame insertion efficiency.
  • determining the second video data that needs frame insertion processing in the first video data includes: acquiring the frame rate of the first video data; and determining, according to the screen refresh rate, that the first video data whose frame rate does not reach the preset frame rate is the second video data that needs frame insertion processing.
  • when determining whether the videos in the first split-screen area and the second split-screen area need frame insertion, the determination can be made according to the frame rates of the first video and the second video and the screen refresh rate of the electronic device. If the frame rate of some video data is lower than the screen refresh rate of the electronic device, and/or the screen refresh rate is not an integer multiple of the frame rate of the video data, that video data can be determined to be the second video data requiring frame interpolation.
  • the second video data that needs frame insertion is determined according to the frame rate of each video and the screen refresh rate of the electronic device, which can ensure the video quality after frame insertion, improve the user's viewing experience, and also save digital signal processor resources.
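The selection rule above can be sketched as follows; the per-video frame rates are assumed example values, and the rule (frame rate below the refresh rate, or not dividing it evenly) is one reading of the description:

```python
# Decide which videos in the split-screen scene need frame insertion.

def needs_interpolation(frame_rate, refresh_rate):
    """True if the video's rate is below the refresh rate and/or the
    refresh rate is not an integer multiple of the video's rate."""
    return frame_rate < refresh_rate or refresh_rate % frame_rate != 0

videos = {"first": 30, "second": 60, "third": 120}  # frame rates (fps)
refresh = 120  # Hz screen refresh rate

to_process = [name for name, fps in videos.items()
              if needs_interpolation(fps, refresh)]
print(to_process)  # -> ['first', 'second']
```

A video already matching the refresh rate is skipped, saving digital signal processor resources as the description notes.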
  • Figure 2b is a schematic diagram of a single video application scenario provided by an embodiment of the present application.
  • the electronic device uses a mobile phone as an example.
  • when the mobile phone plays only one video and a user interface is present in the video, the frame insertion scheme is chosen based on the video's frame rate and the mobile phone's screen refresh rate. If the currently playing video is 30 frames per second and the screen refresh rate of the mobile phone is 120 Hz, the entire video can be interpolated to 120 frames per second or to 60 frames per second; alternatively, the frame insertion scheme can be chosen according to the specific video content. For example, if the currently playing video contains more moving images only from the 15th minute to the 30th minute, interpolation can be performed only for those 15 minutes rather than for the entire video.
  • Figure 2c is a schematic diagram of a split-screen multi-video application scenario provided by an embodiment of the present application.
  • the electronic device uses a mobile phone as an example.
  • the frame interpolation rate can be determined according to the moving-image ratios of the two videos currently playing. For example, in the current minute, 50% of the first video in the first split-screen area consists of moving images, while only 10% of the second video in the second split-screen area consists of moving images. The moving-image proportion of the first video can thus be determined as five-sixths and that of the second video as one-sixth, so most of the time in this minute is used to interpolate the first video and a small part of the time to interpolate the second video.
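The five-sixths / one-sixth split above follows from normalizing the two moving-image ratios; a small sketch of that allocation step:

```python
# Split interpolation time for the current interval in proportion to
# each video's moving-image ratio (50% and 10% normalize to 5/6 and 1/6).

from fractions import Fraction

def allocate_time(motion_ratios):
    """Map each video to its normalized share of interpolation time."""
    total = sum(motion_ratios.values())
    return {name: Fraction(ratio, total)
            for name, ratio in motion_ratios.items()}

shares = allocate_time({"first": 50, "second": 10})
print(shares["first"])   # -> 5/6
print(shares["second"])  # -> 1/6
```

The video with more motion receives proportionally more of the digital signal processor's time, matching the positive relation between frame insertion rate and moving-image ratio described above.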
  • FIG. 3 is a schematic flowchart of another video frame interpolation method provided by an embodiment of the present application. As shown in the figure, this video frame interpolation method includes the following steps:
  • S303 Determine the layer type of the layer according to the color coding scheme, where the layer type includes a video layer and a user interface layer;
  • S304 Determine, according to the layer type included in the first data stream, whether first video data and user interface data exist at the same time in the first data stream;
  • S306 Process the first video data and the user interface data separately to complete frame insertion processing on the first data stream.
  • the electronic device first obtains the first data stream; then obtains the color coding schemes of all the layers included in the first data stream, and determines the layer types according to the color coding schemes; then, according to the layer types included in the first data stream, determines whether the first video data and user interface data exist at the same time in the first data stream; if so, separates the first video data and the user interface data; finally, processes the first video data and the user interface data separately to complete the frame insertion processing of the first data stream.
  • the video frame interpolation method provided by the embodiment of the present application can solve the problem of abnormal color blocks in the user interface layer of the application when the video scene coexisting with the video and the user interface is inserted into the frame.
  • FIG. 4 is a schematic structural diagram of an electronic device 400 provided by an embodiment of the present application.
  • the electronic device 400 includes an application processor 410, a memory 420, a communication interface 430, and one or more programs 421, wherein the one or more programs 421 are stored in the memory 420 and configured to be executed by the application processor 410, and the one or more programs 421 include instructions for executing any step in the foregoing method embodiments.
  • the program 421 includes instructions for performing the following steps: acquiring a first data stream; determining whether first video data and user interface data exist at the same time in the first data stream; if so, separating the first video data and the user interface data; and processing the first video data and the user interface data separately to complete frame insertion processing on the first data stream.
  • the instructions in the program 421 are specifically used to perform the following operations: obtaining the color coding schemes of all layers included in the first data stream; determining the layer type of each layer according to its color coding scheme, the layer types including a video layer and a user interface layer; and determining, according to the layer types included in the first data stream, whether first video data and user interface data exist in the first data stream at the same time.
  • the instructions in the program 421 are specifically used to perform the following operations: allocating the first video data to the first display layer post-processing unit; determining that data other than the first video data in the first data stream is a second data stream, the second data stream including the user interface data; and allocating the second data stream to the second display layer post-processing unit.
  • the electronic device is in a single video application playing scene.
  • the instructions in the program 421 are specifically used to perform the following operations : Determine the frame rate of the first video data and the screen refresh rate of the electronic device; determine the frame insertion scheme according to the frame rate and the screen refresh rate; convert the first video data according to the frame insertion scheme Sent to the digital signal processor for frame insertion processing.
  • the electronic device is in a split-screen multi-video application playback scenario.
  • the instructions in the program 421 are specifically used to execute The following operation: Determine the second video data that needs to be processed for frame insertion in the first video data, where the second video data includes the video data of the first video to be played in the first split-screen area, and/or Video data of the second video to be played in the second split screen area; sending the second video data to the digital signal processor for frame insertion processing.
  • when the second video data includes video data of the first video and the second video, in the aspect of sending the second video data to a digital signal processor for frame insertion processing, the instructions in the program 421 are specifically used to perform the following operations: respectively determining the moving-image ratios of the first video and the second video; respectively determining the frame insertion rates of the first video and the second video according to the moving-image ratios, the frame insertion rate being positively related to the moving-image ratio; and respectively sending the video data of the first video and the second video to the digital signal processor for frame insertion processing according to the frame insertion rates.
  • the instructions in the program 421 are specifically used to perform the following operations: acquiring the first video The frame rate of the data; it is determined according to the screen refresh rate that the first video data whose frame rate does not reach the preset frame rate is the second video data that needs to be subjected to frame insertion processing.
  • an electronic device includes hardware structures and/or software modules corresponding to each function.
  • this application can be implemented in the form of hardware or a combination of hardware and computer software. Whether a certain function is executed by hardware or by computer software driving hardware depends on the specific application and design constraints of the technical solution. Those skilled in the art may use different methods for each specific application to implement the described functions, but such implementation should not be considered beyond the scope of this application.
  • the embodiment of the present application may divide the electronic device into functional units according to the foregoing method examples.
  • each functional unit may be divided corresponding to each function, or two or more functions may be integrated into one processing unit.
  • the above-mentioned integrated unit can be implemented in the form of hardware or software functional unit. It should be noted that the division of units in the embodiments of the present application is illustrative, and is only a logical function division, and there may be other division methods in actual implementation.
  • FIG. 5 is a block diagram of functional units of a video frame interpolation device 500 provided by an embodiment of the present application.
  • the video frame insertion device 500 is applied to electronic equipment, and the device includes a processing unit 501 and a communication unit 502, wherein:
  • the processing unit 501 is used to obtain the first data stream through the communication unit 502; to determine whether the first video data and user interface data exist at the same time in the first data stream; if so, to separate the first video data and the user interface data; and to process the first video data and the user interface data respectively to complete the frame insertion processing of the first data stream.
  • the processing unit 501, in determining whether the first video data and user interface data exist at the same time in the first data stream, is specifically configured to obtain the color coding schemes of all layers contained in the first data stream; determine the layer type of each layer according to its color coding scheme, the layer types including a video layer and a user interface layer; and determine, according to the layer types contained in the first data stream, whether the first video data and the user interface data exist at the same time in the first data stream.
  • the processing unit 501 is specifically configured to allocate the first video data to the first display layer post-processing unit; determine that the data in the first data stream other than the first video data is a second data stream, the second data stream including the user interface data; and allocate the second data stream to the second display layer post-processing unit.
  • the electronic device is in a single video application playing scene.
  • the processing unit 501 is specifically configured to determine the frame rate of the first video data and the screen refresh rate of the electronic device; determine a frame insertion scheme according to the frame rate and the screen refresh rate; and send the first video data to the digital signal processor for frame insertion processing according to the frame insertion scheme.
  • the electronic device is in a split-screen multi-video application playback scenario.
  • the processing unit 501 is specifically configured to determine the second video data in the first video data that needs frame insertion processing, the second video data including the video data of the first video to be played in the first split-screen area and/or the video data of the second video to be played in the second split-screen area; and to send the second video data to the digital signal processor for frame insertion processing.
  • the second video data includes video data of the first video and the second video, and in the aspect of sending the second video data to a digital signal processor for frame insertion processing,
  • the processing unit 501 is specifically configured to respectively determine the proportions of the moving images of the first video and the second video; respectively determine the interpolated frame rates of the first video and the second video according to the proportions of the moving images, the interpolated frame rate being positively related to the proportion of the moving image; and respectively send the video data of the first video and the second video to the digital signal processor for frame insertion processing according to the interpolated frame rates.
  • the processing unit 501, in determining the second video data that needs to be subjected to frame insertion processing in the first video data, is specifically configured to obtain the frame rate of the first video data; and determine, according to the screen refresh rate, that the first video data whose frame rate does not reach the preset frame rate is the second video data that needs to be subjected to frame insertion processing.
  • the video frame insertion apparatus 500 may further include a storage unit 503 for storing program codes and data of the electronic device.
  • the processing unit 501 may be a processor, the communication unit 502 may be a touch screen or a transceiver, and the storage unit 503 may be a memory.
  • the embodiment of the present application also provides a chip, wherein the chip includes a processor, which is used to call and run a computer program from the memory, so that the device installed with the chip executes part or all of the steps described for the electronic device in the above method embodiments.
  • An embodiment of the present application also provides a computer storage medium, wherein the computer storage medium stores a computer program for electronic data exchange, and the computer program enables a computer to execute part or all of the steps of any method recorded in the above method embodiments; the above-mentioned computer includes electronic equipment.
  • the embodiments of the present application also provide a computer program product, wherein the computer program product includes a non-transitory computer-readable storage medium storing a computer program, and the computer program is operable to enable a computer to execute part or all of the steps of any method recorded in the above method embodiments.
  • the computer program product may be a software installation package, and the above-mentioned computer includes electronic equipment.
  • the disclosed device may be implemented in other ways.
  • the device embodiments described above are only illustrative; for example, the division of the above-mentioned units is only a logical function division, and there may be other divisions in actual implementation; for example, multiple units or components can be combined or integrated into another system, or some features can be ignored or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical or other forms.
  • the units described above as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units, that is, they may be located in one place, or they may be distributed on multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • the functional units in the various embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated unit can be implemented in the form of hardware or software functional unit.
  • If the above integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer-readable memory.
  • Based on such understanding, the technical solution of the present application, in essence, or the part that contributes to the existing technology, or all or part of the technical solution, can be embodied in the form of a software product; the computer software product is stored in a memory and includes a number of instructions to enable a computer device (which may be a personal computer, a server, or a network device, etc.) to execute all or part of the steps of the methods of the various embodiments of the present application.
  • the aforementioned memory includes media that can store program code, such as a USB flash drive, read-only memory (ROM, Read-Only Memory), random access memory (RAM, Random Access Memory), removable hard disk, magnetic disk, or optical disk.
  • the program can be stored in a computer-readable memory, and the memory can include a flash disk, read-only memory (English: Read-Only Memory, abbreviation: ROM), random access memory (English: Random Access Memory, abbreviation: RAM), magnetic disk, optical disc, etc.

Abstract

An embodiment of the present application provides a video frame interpolation method and a related apparatus, applied to an electronic device. The method includes: acquiring a first data stream; determining whether first video data and user interface data coexist in the first data stream; if so, separating the first video data and the user interface data; and processing the first video data and the user interface data separately to complete frame interpolation processing of the first data stream. Thus, the video frame interpolation method and related apparatus provided by the embodiments of the present application can solve the problem of abnormal color blocks in the user interface layer of an application when frame interpolation is performed on a video scene in which video and a user interface coexist.

Description

Video frame interpolation method and related apparatus. Technical Field
The present application relates to the field of communication technology, and in particular to a video frame interpolation method and a related apparatus.
Background
A method commonly used by video compression or video codecs to reduce spatial redundancy in a video sequence is the motion-compensated (Motion Estimate and Motion Compensation, MEMC) video frame interpolation method. Motion compensation is a technique for describing the difference between adjacent frames, where "adjacent" refers to adjacency in the coding relationship; the two frames are not necessarily adjacent in playback order. Specifically, it describes how each small block of the previous frame moves to a certain position in the current frame. Motion compensation can also be used for deinterlacing and motion interpolation operations.
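The motion-compensation idea described above can be sketched in a few lines: estimate one motion vector per block of the previous frame by exhaustive search, then build an intermediate frame by moving each block half-way along its vector and blending it with the current frame. This is only an illustrative sketch; the block size, search range, and SAD cost are arbitrary choices for the example and are not taken from the patent, and production MEMC pipelines run in dedicated hardware.

```python
import numpy as np

def estimate_motion(prev, curr, block=8, search=4):
    """Exhaustive block matching: for each block of `prev`, find the
    displacement (dy, dx) whose region of `curr` minimizes the sum of
    absolute differences (SAD)."""
    h, w = prev.shape
    vectors = {}
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            ref = prev[y:y + block, x:x + block].astype(np.int32)
            best_cost, best_v = None, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy <= h - block and 0 <= xx <= w - block:
                        cand = curr[yy:yy + block, xx:xx + block].astype(np.int32)
                        cost = int(np.abs(ref - cand).sum())
                        if best_cost is None or cost < best_cost:
                            best_cost, best_v = cost, (dy, dx)
            vectors[(y, x)] = best_v
    return vectors

def interpolate_midframe(prev, curr, vectors, block=8):
    """Build an intermediate frame by shifting each block of `prev`
    half-way along its motion vector and averaging it with the
    co-located region of `curr`."""
    mid = curr.astype(np.float32).copy()
    h, w = prev.shape
    for (y, x), (dy, dx) in vectors.items():
        my = min(max(y + dy // 2, 0), h - block)  # half-way position
        mx = min(max(x + dx // 2, 0), w - block)
        src = prev[y:y + block, x:x + block].astype(np.float32)
        mid[my:my + block, mx:mx + block] = (
            src + curr[my:my + block, mx:mx + block]) / 2
    return mid.astype(np.uint8)
```

For a bright square moving four pixels to the right between two frames, the estimator recovers the displacement (0, 4) for the block containing the square, and the interpolated frame places it roughly half-way.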
Summary
Embodiments of the present application provide a video frame interpolation method and a related apparatus, which can perform frame interpolation on the video data in a video scene alone.
In a first aspect, an embodiment of the present application provides a video frame interpolation method, applied to an electronic device, the method including:
acquiring a first data stream;
determining whether first video data and user interface data coexist in the first data stream;
if so, separating the first video data and the user interface data;
processing the first video data and the user interface data separately to complete frame interpolation processing of the first data stream.
In a second aspect, an embodiment of the present application provides a video frame interpolation apparatus, applied to an electronic device, the apparatus including a processing unit and a communication unit, wherein
the processing unit is configured to acquire a first data stream through the communication unit; to determine whether first video data and user interface data coexist in the first data stream; if so, to separate the first video data and the user interface data; and to process the first video data and the user interface data separately to complete frame interpolation processing of the first data stream.
In a third aspect, an embodiment of the present application provides an electronic device including a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, and the programs include instructions for performing the steps of any method of the first aspect of the embodiments of the present application.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, and the computer program causes a computer to perform part or all of the steps described in any method of the first aspect of the embodiments of the present application.
In a fifth aspect, an embodiment of the present application provides a computer program product, where the computer program product includes a non-transitory computer-readable storage medium storing a computer program, and the computer program is operable to cause a computer to perform part or all of the steps described in the first aspect of the embodiments of the present application. The computer program product may be a software installation package.
It can be seen that, in the embodiments of the present application, the electronic device first acquires a first data stream; then determines whether first video data and user interface data coexist in the first data stream; if so, separates the first video data and the user interface data; and finally processes the first video data and the user interface data separately to complete frame interpolation processing of the first data stream. Thus, the video frame interpolation method and related apparatus provided by the embodiments of the present application can solve the problem of abnormal color blocks in the user interface layer of an application when frame interpolation is performed on a video scene in which video and a user interface coexist.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of the present application or in the prior art more clearly, the drawings required in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1a is a hardware block diagram for a video frame interpolation method provided by an embodiment of the present application;
FIG. 1b is a software block diagram for a video frame interpolation method provided by an embodiment of the present application;
FIG. 2a is a schematic flowchart of a video frame interpolation method provided by an embodiment of the present application;
FIG. 2b is a schematic diagram of a single-video-application scenario provided by an embodiment of the present application;
FIG. 2c is a schematic diagram of a split-screen multi-video-application scenario provided by an embodiment of the present application;
FIG. 3 is a schematic flowchart of another video frame interpolation method provided by an embodiment of the present application;
FIG. 4 is a schematic structural diagram of an electronic device provided by an embodiment of the present application;
FIG. 5 is a block diagram of the functional units of a video frame interpolation apparatus provided by an embodiment of the present application.
Detailed Description
To enable those skilled in the art to better understand the solutions of the present application, the technical solutions in the embodiments of the present application are described clearly and completely below with reference to the drawings in the embodiments of the present application. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. Based on the embodiments of the present application, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the protection scope of the present application.
The terms "first", "second", and the like in the specification, claims, and drawings of the present application are used to distinguish different objects, not to describe a specific order. In addition, the terms "include" and "have" and any variations thereof are intended to cover a non-exclusive inclusion. For example, a process, method, system, product, or device that includes a series of steps or units is not limited to the listed steps or units, but optionally further includes steps or units that are not listed, or optionally further includes other steps or units inherent to the process, method, product, or device.
Reference herein to an "embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the present application. The appearance of this phrase in various places in the specification does not necessarily refer to the same embodiment, nor to an independent or alternative embodiment mutually exclusive with other embodiments. Those skilled in the art understand, explicitly and implicitly, that the embodiments described herein may be combined with other embodiments.
At present, existing video frame interpolation algorithms on the market perform interpolation on the entire display output. When a user interface (User Interface, UI) and video coexist in the display output, the UI is affected by the interpolation algorithm and abnormal color blocks appear.
In view of the above problem, embodiments of the present application provide a video frame interpolation method and a related apparatus, which are described in detail below with reference to the drawings.
As shown in FIG. 1a, FIG. 1a is a hardware block diagram for a video frame interpolation method provided by an embodiment of the present application. The illustrated hardware block diagram 100 includes a data pipeline and, connected in sequence, display layer post-processing units, serial interfaces, a digital signal processor, and a liquid crystal display, among others. The data pipeline may be a pipeline dedicated to transmitting video YUV data. When video data and UI data coexist in the data stream, the data pipelines transmit the video data and the UI data separately: the first display layer post-processing unit receives and processes the video data, the second display layer post-processing unit receives and processes the UI data, and the video data and the UI data are transmitted to the digital signal processor through the first and second display serial interfaces, respectively. The digital signal processor performs frame interpolation, decoding, and other processing on the video data and the UI data, and finally the processed data is transmitted to the liquid crystal display for output.
As shown in FIG. 1b, FIG. 1b is a software block diagram for a video frame interpolation method provided by an embodiment of the present application. When the first data stream acquired by the electronic device includes both video data and user interface data, and the video layer and the user interface layer are separate layers, then after front-end policy processing such as frame rate detection and layer detection is performed on the first data stream, the display composition service can identify layer or data-type features such as the color coding schemes of the different layers and separate the video layer from the user interface layer. Middleware code then allocates the video data to the first display unit and the user interface data to the second display unit. A display unit specifically includes a layer mixer, a display layer post-processing unit, a display compressor, a display communication interface, and the like. Finally, the processed data is transmitted to the display driver side for display. Of course, if the video data and the user interface data included in the first data stream acquired by the electronic device are layer-blended, layered processing cannot be performed through the display composition service; the first data stream may be uniformly allocated to the first display unit, and the processed data is handed over to the display driver side for display.
Referring to FIG. 2a, FIG. 2a is a schematic flowchart of a video frame interpolation method provided by an embodiment of the present application. As shown in the figure, the video frame interpolation method is described from the perspective of the electronic device and specifically includes the following steps.
S201: Acquire a first data stream.
The first data stream mainly includes the data that ultimately needs to be displayed on the liquid crystal display. The first data stream may include only a single type of data, or mixed data of multiple types; it may include data transmitted through a pipeline dedicated to video YUV data, and may also include data transmitted through a pipeline provided for ordinary RGB data.
S202: Determine whether first video data and user interface data coexist in the first data stream.
During video playback by a video application, video data and user interface data may coexist. The user interface may be an interface for operating functions such as the video progress, video recording, and screen locking, for example an interface for pausing the video, playing the next video, or adjusting the video definition. When judging the data type, feature recognition may be performed through the display composition service: different data types are distinguished according to the features of the data, and it is judged whether video data and user interface data coexist in the data stream.
S203: If so, separate the first video data and the user interface data.
If it is determined that first video data and user interface data coexist in the first data stream, the first video data and the user interface data need to be separated so that frame interpolation can subsequently be performed on the video data alone.
S204: Process the first video data and the user interface data separately to complete frame interpolation processing of the first data stream.
After the first video data and the user interface data are separated, the digital signal processor may perform frame interpolation on the first video data and decoding on the user interface data, respectively. The video frame interpolation method may add one frame of image between adjacent frames according to an interpolation algorithm, making the picture smoother during video playback. Finally, the processed first video data may be sent to the liquid crystal display for output.
It can be seen that, in this embodiment of the present application, the electronic device first acquires a first data stream; then determines whether first video data and user interface data coexist in the first data stream; if so, separates the first video data and the user interface data; and finally processes the first video data and the user interface data separately to complete frame interpolation processing of the first data stream. Thus, the video frame interpolation method provided by this embodiment of the present application can solve the problem of abnormal color blocks in the user interface layer of an application when frame interpolation is performed on a video scene in which video and a user interface coexist.
In a possible example, determining whether first video data and user interface data coexist in the first data stream includes: acquiring the color coding schemes of all layers contained in the first data stream; determining the layer type of each layer according to its color coding scheme, the layer types including a video layer and a user interface layer; and determining, according to the layer types contained in the first data stream, whether first video data and user interface data coexist in the first data stream.
A color coding scheme is a visual-information encoding that uses color as the code, such as the YCbCr, YUV, and RGBA schemes. Different layer types use different color coding schemes. In particular, analysis shows that a layer whose color coding scheme is YCBCR_420 is a video layer, and a layer whose color coding scheme is RGBA8888 is a user interface layer. YCbCr is a color coding scheme commonly used in consumer video products such as cameras and digital television, where Y is the luminance component, CB the blue chrominance component, and CR the red chrominance component; YCBCR_420 means that every four pixels share four luminance components and two chrominance components, i.e. YYYYCBCR. In the RGBA color coding scheme, R is the red component, G the green component, B the blue component, and A the transparency (Alpha). The alpha channel is generally used as an opacity parameter: if a pixel's alpha value is 0%, it is fully transparent, that is, invisible, while a value of 100% means a fully opaque pixel. RGBA is a common scheme for recording and displaying color images in traditional digital imaging; RGBA8888 means there are four parameters, each of A, R, G, and B represented by 8 bits, for 32 bits per pixel. If, based on the features of the color coding schemes, both schemes are detected in the first data stream, it means that a video layer and a user interface layer coexist in the first data stream.
It can be seen that, in this example, the layer types in the data stream are determined according to the color coding schemes, thereby determining whether video data and user interface data coexist in the data stream. This determines the data type quickly and accurately and facilitates subsequent processing of the video data.
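As a rough illustration of the classification rule above (YCBCR_420 layers are video, RGBA8888 layers are UI), the logic might look like the following sketch. The format constants and the (format, payload) layer representation are assumptions for the example; a real implementation would read each layer's pixel format from the platform's composition service.

```python
# Hypothetical format constants; a real implementation would query the
# pixel format of each layer from the display composition service.
PIXEL_FORMAT_YCBCR_420 = "YCBCR_420"
PIXEL_FORMAT_RGBA_8888 = "RGBA_8888"

def layer_type(color_format):
    """Classify a layer by its color coding scheme: YCbCr 4:2:0 layers
    carry video, RGBA8888 layers carry the user interface."""
    if color_format == PIXEL_FORMAT_YCBCR_420:
        return "video"
    if color_format == PIXEL_FORMAT_RGBA_8888:
        return "ui"
    return "other"

def split_stream(layers):
    """Split a data stream, given as (format, payload) layers, into the
    video layers and the remaining second stream, and report whether
    video and UI data coexist in the stream."""
    video = [p for f, p in layers if layer_type(f) == "video"]
    second = [p for f, p in layers if layer_type(f) != "video"]
    coexist = bool(video) and any(layer_type(f) == "ui" for f, _ in layers)
    return video, second, coexist
```

When both formats are present, `split_stream` reports coexistence and hands back the two streams that would go to the first and second display layer post-processing units.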
In a possible example, separating the first video data and the user interface data includes: allocating the first video data to a first display layer post-processing unit; determining that the data in the first data stream other than the first video data is a second data stream, the second data stream including the user interface data; and allocating the second data stream to a second display layer post-processing unit.
After determining that video data and a user interface coexist in the data stream, the video data and the user interface need to be processed separately. The video data may be allocated to the first display layer post-processing unit and the user interface data to the second display layer post-processing unit so that they can subsequently be processed separately. In the display layer post-processing units, the data may undergo modification operations such as enhancing layer luminance values, and the processed data is transmitted to the digital signal processor through the first and second display serial interfaces, respectively. Of course, if the first data stream includes only video data and no user interface data, the first data stream may be directly allocated to the first display layer post-processing unit for processing, and the processed data transmitted to the digital signal processor through the first display serial interface. If the first data stream includes both video data and user interface data but the two are layer-blended, the first data stream may likewise be allocated to the first display layer post-processing unit for processing.
It can be seen that, in this example, allocating the video data and the user interface data to two display layer post-processing units for processing can avoid abnormal color blocks in the user interface layer when frame interpolation is subsequently performed on the video data.
In a possible example, the electronic device is in a single-video-application playback scenario, and performing frame interpolation on the first video data through the digital signal processor includes: determining the frame rate of the first video data and the screen refresh rate of the electronic device; determining a frame insertion scheme according to the frame rate and the screen refresh rate; and sending the first video data to the digital signal processor for frame interpolation according to the frame insertion scheme.
The frame rate is the speed at which the picture changes; as long as the graphics hardware can keep up, the frame rate can be high, and a high frame rate means a smooth picture. In theory each frame is a different picture; for example, 60 frames per second (fps) means the graphics hardware generates 60 pictures per second. The screen refresh rate is the speed at which the graphics hardware refreshes the display output; for example, 60 hertz means the graphics hardware outputs a signal to the display 60 times per second. If the frame rate is half the refresh rate, the picture output to the display is the same for every two refreshes; conversely, if the frame rate is twice the refresh rate, only one of every two picture changes is sent and shown on the display. Frame rates above the refresh rate are therefore wasted: they do not improve the picture and may even cause display anomalies. Accordingly, the frame insertion scheme determined from the frame rate and the screen refresh rate may be such that the frame rate of the interpolated video is an integer multiple of the screen refresh rate, and/or the frame rate of the interpolated video is not higher than the screen refresh rate.
In a single-video-application scenario there is only one piece of video data to interpolate, so different frame insertion schemes can also be determined according to the video quality, the user's requirements, and the hardware capability of the electronic device: for example, interpolating the entire video, interpolating once every few frames, or interpolating only a certain time period of the video data.
It can be seen that, in this example, determining the frame insertion scheme according to the video frame rate and the screen refresh rate of the electronic device ensures the quality of the interpolated video, improves the viewing experience, and also saves digital signal processor resources.
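The constraints above, read together with the 30 fps on a 120 Hz screen example given later in the text (which interpolates to 60 or 120 fps), suggest one possible selection policy: consider integer multiples of the source frame rate that do not exceed the refresh rate and into which the refresh rate divides evenly. The enumeration policy is an assumption for illustration, not the patent's own wording.

```python
def candidate_rates(source_fps, refresh_hz):
    """Enumerate interpolation targets that are integer multiples of the
    source frame rate, do not exceed the screen refresh rate, and divide
    evenly into it, so every interpolated frame lands on a refresh."""
    targets = []
    k = 2
    while source_fps * k <= refresh_hz:
        if refresh_hz % (source_fps * k) == 0:
            targets.append(source_fps * k)
        k += 1
    return targets
```

For a 30 fps video on a 120 Hz screen this yields the targets 60 and 120 fps, matching the worked example; for a 24 fps video on a 60 Hz screen it yields no clean target.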
In a possible example, the electronic device is in a split-screen multi-video-application playback scenario, and performing frame interpolation on the first video data through the digital signal processor includes: determining second video data in the first video data that needs frame interpolation, the second video data including video data of a first video to be played in a first split-screen area and/or video data of a second video to be played in a second split-screen area; and sending the second video data to the digital signal processor for frame interpolation.
In a split-screen multi-video playback scenario, the electronic device may be playing only one video, that is, there is only one piece of video data in the first data stream; or it may be playing videos in both the first and second split-screen areas, in which case two pieces of video data coexist in the first data stream. When two pieces of video data exist, video data and user interface data may coexist in only one split-screen area, or in both. Therefore, when the electronic device is in a split-screen multi-video playback scenario, after the video data and the user interface data are separated, it is further necessary to determine which of the videos, or whether both, need frame interpolation, and then send the second video data that needs interpolation to the digital signal processor.
It can be seen that, in this example, when the electronic device is in a split-screen multi-video-application scenario, first determining the second video data that needs frame interpolation and then sending it to the digital signal processor not only improves video data processing efficiency but also saves digital signal processor resources.
In a possible example, the second video data includes the video data of the first video and the second video, and sending the second video data to the digital signal processor for frame interpolation includes: determining the proportions of moving imagery in the first video and in the second video, respectively; determining interpolation rates for the first video and the second video according to the proportions of moving imagery, the interpolation rate being positively correlated with the proportion of moving imagery; and sending the video data of the first video and of the second video to the digital signal processor for frame interpolation according to the interpolation rates.
When the videos in both split-screen areas need frame interpolation, the current proportions of moving imagery in the first video of the first split-screen area and in the second video of the second split-screen area can first be determined. If the proportion of moving imagery in the first video is higher than in the second video, then when interpolating the two videos, the interpolation rate applied to the first video during the current time period is higher than that applied to the second video; a video's interpolation rate increases with its proportion of moving imagery.
It can be seen that, in this example, determining the interpolation rates according to the proportions of moving imagery of the first and second videos reduces the latency of video output and improves interpolation efficiency.
In a possible example, determining the second video data in the first video data that needs frame interpolation includes: acquiring the frame rate of the first video data; and determining, according to the screen refresh rate, that the first video data whose frame rate does not reach a preset frame rate is the second video data that needs frame interpolation.
When determining whether the videos in the first and second split-screen areas need interpolation, the decision can be made from the frame rates of the first and second videos and the screen refresh rate of the electronic device: if a video's frame rate is lower than the screen refresh rate, and/or its frame rate is not an integer multiple of the screen refresh rate, that video data can be determined to be second video data that needs frame interpolation.
It can be seen that, in this example, in a split-screen multi-video-application playback scenario, determining the second video data that needs frame interpolation according to each video's frame rate and the electronic device's screen refresh rate ensures the quality of the interpolated video, improves the viewing experience, and also saves digital signal processor resources.
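The frame-rate test just described can be sketched as follows. Using the screen refresh rate as the default preset frame rate is an assumption for the example; the text leaves the preset value open.

```python
def needs_interpolation(fps, refresh_hz, preset_fps=None):
    """A video needs interpolation when its frame rate is below the preset
    threshold (defaulting to the refresh rate, an assumption) or is not an
    integer multiple of the screen refresh rate."""
    threshold = preset_fps if preset_fps is not None else refresh_hz
    return fps < threshold or fps % refresh_hz != 0

def select_second_video_data(videos, refresh_hz):
    """From (name, fps) pairs for the split-screen areas, keep the videos
    that form the 'second video data' to be sent to the DSP."""
    return [name for name, fps in videos if needs_interpolation(fps, refresh_hz)]
```

A 30 fps video on a 120 Hz screen is selected for interpolation, while a 120 fps video already matching the refresh rate is not.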
A detailed description is given below with reference to specific embodiments.
As shown in FIG. 2b, FIG. 2b is a schematic diagram of a single-video-application scenario provided by an embodiment of the present application, taking a mobile phone as the electronic device. The phone plays only one video, and a user interface coexists with the video. The frame insertion scheme is then selected according to the frame rate of the played video and the screen refresh rate of the phone. If the currently played video is 30 frames per second and the phone's screen refresh rate is 120 Hz, the entire video may be interpolated to 120 fps, or to 60 fps. The scheme may also be selected according to the specific video content: for example, if only minutes 15 to 30 of the currently played video contain substantial moving imagery, interpolation may be performed only for those 15 minutes rather than for the whole video.
As shown in FIG. 2c, FIG. 2c is a schematic diagram of a split-screen multi-video-application scenario provided by an embodiment of the present application, again taking a mobile phone as the electronic device. When the phone is in a split-screen scenario and is playing two videos that both need frame interpolation, the interpolation rates can be determined according to the proportions of moving imagery in the two currently played videos in order to reduce the output latency after interpolation. For example, if within the current minute 50% of the first video in the first split-screen area is moving imagery while only 10% of the second video in the second split-screen area is, it can be determined that the first video's share of moving imagery is five sixths and the second video's is one sixth; accordingly, within that minute most of the time is spent interpolating the first video and a small part interpolating the second video.
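The worked example above (50% versus 10% moving imagery giving a five-sixths / one-sixth split of interpolation time) amounts to normalizing the motion proportions; a minimal sketch:

```python
def motion_shares(motion_ratios):
    """Normalize each video's proportion of moving imagery into its share
    of the DSP's interpolation effort for the current period."""
    total = sum(motion_ratios.values())
    return {name: ratio / total for name, ratio in motion_ratios.items()}

# The example from the text: 50% motion in the first video, 10% in the second.
shares = motion_shares({"first": 0.50, "second": 0.10})
# "first" receives 0.50 / 0.60, i.e. five sixths of the effort,
# and "second" one sixth, matching the worked example.
```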
Referring to FIG. 3, FIG. 3 is a schematic flowchart of another video frame interpolation method provided by an embodiment of the present application. As shown in the figure, the video frame interpolation method includes the following steps:
S301: Acquire a first data stream.
S302: Acquire the color coding schemes of all layers contained in the first data stream.
S303: Determine the layer type of each layer according to its color coding scheme, the layer types including a video layer and a user interface layer.
S304: Determine, according to the layer types contained in the first data stream, whether first video data and user interface data coexist in the first data stream.
S305: If so, separate the first video data and the user interface data.
S306: Process the first video data and the user interface data separately to complete frame interpolation processing of the first data stream.
It can be seen that, in this embodiment of the present application, the electronic device first acquires a first data stream; then acquires the color coding schemes of all layers contained in the first data stream and determines the layer type of each layer according to its color coding scheme; then determines, according to the layer types contained in the first data stream, whether first video data and user interface data coexist in the first data stream; if so, separates the first video data and the user interface data; and finally processes the first video data and the user interface data separately to complete frame interpolation processing of the first data stream. Thus, the video frame interpolation method provided by this embodiment of the present application can solve the problem of abnormal color blocks in the user interface layer of an application when frame interpolation is performed on a video scene in which video and a user interface coexist.
Consistent with the embodiments shown in FIG. 2a and FIG. 3 above, refer to FIG. 4, which is a schematic structural diagram of an electronic device 400 provided by an embodiment of the present application. As shown in the figure, the electronic device 400 includes an application processor 410, a memory 420, a communication interface 430, and one or more programs 421, where the one or more programs 421 are stored in the memory 420 and configured to be executed by the application processor 410, and the one or more programs 421 include instructions for performing any step of the above method embodiments.
In a possible example, the instructions in the programs 421 are used to perform the following steps: acquiring a first data stream; determining whether first video data and user interface data coexist in the first data stream; if so, separating the first video data and the user interface data; and processing the first video data and the user interface data separately to complete frame interpolation processing of the first data stream.
In a possible example, in determining whether first video data and user interface data coexist in the first data stream, the instructions in the programs 421 are specifically used to: acquire the color coding schemes of all layers contained in the first data stream; determine the layer type of each layer according to its color coding scheme, the layer types including a video layer and a user interface layer; and determine, according to the layer types contained in the first data stream, whether first video data and user interface data coexist in the first data stream.
In a possible example, in separating the first video data and the user interface data, the instructions in the programs 421 are specifically used to: allocate the first video data to a first display layer post-processing unit; determine that the data in the first data stream other than the first video data is a second data stream, the second data stream including the user interface data; and allocate the second data stream to a second display layer post-processing unit.
In a possible example, the electronic device is in a single-video-application playback scenario, and in processing the first video data and the user interface data separately, the instructions in the programs 421 are specifically used to: determine the frame rate of the first video data and the screen refresh rate of the electronic device; determine a frame insertion scheme according to the frame rate and the screen refresh rate; and send the first video data to the digital signal processor for frame interpolation according to the frame insertion scheme.
In a possible example, the electronic device is in a split-screen multi-video-application playback scenario, and in processing the first video data and the user interface data separately, the instructions in the programs 421 are specifically used to: determine second video data in the first video data that needs frame interpolation, the second video data including video data of a first video to be played in a first split-screen area and/or video data of a second video to be played in a second split-screen area; and send the second video data to the digital signal processor for frame interpolation.
In a possible example, the second video data includes the video data of the first video and the second video, and in sending the second video data to the digital signal processor for frame interpolation, the instructions in the programs 421 are specifically used to: determine the proportions of moving imagery in the first video and in the second video, respectively; determine interpolation rates for the first video and the second video according to the proportions of moving imagery, the interpolation rate being positively correlated with the proportion of moving imagery; and send the video data of the first video and of the second video to the digital signal processor for frame interpolation according to the interpolation rates.
In a possible example, in determining the second video data in the first video data that needs frame interpolation, the instructions in the programs 421 are specifically used to: acquire the frame rate of the first video data; and determine, according to the screen refresh rate, that the first video data whose frame rate does not reach the preset frame rate is the second video data that needs frame interpolation.
The solutions of the embodiments of the present application are described above mainly from the perspective of the method-side execution process. It can be understood that, to implement the above functions, the electronic device includes corresponding hardware structures and/or software modules for performing each function. Those skilled in the art should readily appreciate that, in combination with the units and algorithm steps of the examples described in the embodiments provided herein, the present application can be implemented in the form of hardware or a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and design constraints of the technical solution. Skilled artisans may use different methods for each particular application to implement the described functions, but such implementation should not be considered beyond the scope of the present application.
The embodiments of the present application may divide the electronic device into functional units according to the above method examples. For example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated into one processing unit. The above integrated unit may be implemented in the form of hardware or in the form of a software functional unit. It should be noted that the division of units in the embodiments of the present application is illustrative and is only a logical function division; there may be other division methods in actual implementation.
FIG. 5 is a block diagram of the functional units of a video frame interpolation apparatus 500 provided by an embodiment of the present application. The video frame interpolation apparatus 500 is applied to an electronic device, and the apparatus includes a processing unit 501 and a communication unit 502, wherein
the processing unit 501 is configured to acquire a first data stream through the communication unit 502; to determine whether first video data and user interface data coexist in the first data stream; if so, to separate the first video data and the user interface data; and to process the first video data and the user interface data separately to complete frame interpolation processing of the first data stream.
In a possible example, in determining whether first video data and user interface data coexist in the first data stream, the processing unit 501 is specifically configured to acquire the color coding schemes of all layers contained in the first data stream; determine the layer type of each layer according to its color coding scheme, the layer types including a video layer and a user interface layer; and determine, according to the layer types contained in the first data stream, whether first video data and user interface data coexist in the first data stream.
In a possible example, in separating the first video data and the user interface data, the processing unit 501 is specifically configured to allocate the first video data to a first display layer post-processing unit; determine that the data in the first data stream other than the first video data is a second data stream, the second data stream including the user interface data; and allocate the second data stream to a second display layer post-processing unit.
In a possible example, the electronic device is in a single-video-application playback scenario, and in processing the first video data and the user interface data separately, the processing unit 501 is specifically configured to determine the frame rate of the first video data and the screen refresh rate of the electronic device; determine a frame insertion scheme according to the frame rate and the screen refresh rate; and send the first video data to the digital signal processor for frame interpolation according to the frame insertion scheme.
In a possible example, the electronic device is in a split-screen multi-video-application playback scenario, and in processing the first video data and the user interface data separately, the processing unit 501 is specifically configured to determine second video data in the first video data that needs frame interpolation, the second video data including video data of a first video to be played in a first split-screen area and/or video data of a second video to be played in a second split-screen area; and send the second video data to the digital signal processor for frame interpolation.
In a possible example, the second video data includes the video data of the first video and the second video, and in sending the second video data to the digital signal processor for frame interpolation, the processing unit 501 is specifically configured to determine the proportions of moving imagery in the first video and in the second video, respectively; determine interpolation rates for the first video and the second video according to the proportions of moving imagery, the interpolation rate being positively correlated with the proportion of moving imagery; and send the video data of the first video and of the second video to the digital signal processor for frame interpolation according to the interpolation rates.
In a possible example, in determining the second video data in the first video data that needs frame interpolation, the processing unit 501 is specifically configured to acquire the frame rate of the first video data; and determine, according to the screen refresh rate, that the first video data whose frame rate does not reach the preset frame rate is the second video data that needs frame interpolation.
The video frame interpolation apparatus 500 may further include a storage unit 503 for storing the program codes and data of the electronic device. The processing unit 501 may be a processor, the communication unit 502 may be a touch display screen or a transceiver, and the storage unit 503 may be a memory.
It can be understood that, since the method embodiments and the apparatus embodiments are different presentations of the same technical concept, the content of the method embodiments of the present application also applies to the apparatus embodiments, and details are not repeated here.
An embodiment of the present application further provides a chip, where the chip includes a processor configured to call and run a computer program from a memory so that a device equipped with the chip performs part or all of the steps described for the electronic device in the above method embodiments.
An embodiment of the present application further provides a computer storage medium, where the computer storage medium stores a computer program for electronic data exchange, and the computer program causes a computer to perform part or all of the steps of any method recorded in the above method embodiments; the computer includes an electronic device.
An embodiment of the present application further provides a computer program product, where the computer program product includes a non-transitory computer-readable storage medium storing a computer program, and the computer program is operable to cause a computer to perform part or all of the steps of any method recorded in the above method embodiments. The computer program product may be a software installation package, and the computer includes an electronic device.
It should be noted that, for brevity of description, each of the foregoing method embodiments is expressed as a series of action combinations; however, those skilled in the art should know that the present application is not limited by the described order of actions, because according to the present application some steps may be performed in other orders or simultaneously. Secondly, those skilled in the art should also know that the embodiments described in the specification are all preferred embodiments, and the actions and modules involved are not necessarily required by the present application.
In the above embodiments, each embodiment has its own emphasis; for parts not detailed in one embodiment, refer to the relevant descriptions of the other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative; the division of the units is only a logical function division, and there may be other divisions in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the mutual coupling or direct coupling or communication connection shown or discussed may be indirect coupling or communication connection through some interfaces, apparatuses, or units, and may be electrical or in other forms.
The units described as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The above integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the above integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable memory. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a memory and includes a number of instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods of the embodiments of the present application. The aforementioned memory includes media that can store program code, such as a USB flash drive, read-only memory (ROM, Read-Only Memory), random access memory (RAM, Random Access Memory), removable hard disk, magnetic disk, or optical disk.
Those of ordinary skill in the art can understand that all or part of the steps of the various methods of the above embodiments can be completed by a program instructing relevant hardware; the program can be stored in a computer-readable memory, and the memory may include a flash disk, read-only memory (English: Read-Only Memory, abbreviation: ROM), random access memory (English: Random Access Memory, abbreviation: RAM), magnetic disk, optical disc, and the like.
The embodiments of the present application are described in detail above; specific examples are used herein to explain the principles and implementations of the present application, and the description of the above embodiments is only intended to help understand the method of the present application and its core idea. Meanwhile, for those of ordinary skill in the art, there will be changes in the specific implementation and the scope of application according to the idea of the present application. In summary, the content of this specification should not be understood as limiting the present application.

Claims (20)

  1. A video frame interpolation method, applied to an electronic device, the method comprising:
    acquiring a first data stream;
    determining whether first video data and user interface data coexist in the first data stream;
    if so, separating the first video data and the user interface data;
    processing the first video data and the user interface data separately to complete frame interpolation processing of the first data stream.
  2. The method according to claim 1, wherein determining whether first video data and user interface data coexist in the first data stream comprises:
    acquiring the color coding schemes of all layers contained in the first data stream;
    determining the layer type of each layer according to its color coding scheme, the layer types comprising a video layer and a user interface layer;
    determining, according to the layer types contained in the first data stream, whether first video data and user interface data coexist in the first data stream.
  3. The method according to claim 2, wherein separating the first video data and the user interface data comprises:
    allocating the first video data to a first display layer post-processing unit;
    determining that the data in the first data stream other than the first video data is a second data stream, the second data stream comprising the user interface data;
    allocating the second data stream to a second display layer post-processing unit.
  4. The method according to any one of claims 1-3, wherein processing the first video data and the user interface data separately comprises:
    performing frame interpolation processing on the first video data through a digital signal processor;
    performing decoding processing on the user interface data through the digital signal processor.
  5. The method according to claim 4, wherein the electronic device is in a single-video-application playback scenario, and performing frame interpolation processing on the first video data through the digital signal processor comprises:
    determining the frame rate of the first video data and the screen refresh rate of the electronic device;
    determining a frame insertion scheme according to the frame rate and the screen refresh rate;
    sending the first video data to the digital signal processor for frame interpolation processing according to the frame insertion scheme.
  6. The method according to claim 5, wherein the frame insertion scheme comprises any one of the following:
    performing frame interpolation on the entire video;
    performing frame interpolation once every few frames; and
    performing frame interpolation only for a certain time period of the first video data.
  7. The method according to claim 4, wherein the electronic device is in a split-screen multi-video-application playback scenario, and performing frame interpolation processing on the first video data through the digital signal processor comprises:
    determining second video data in the first video data that needs frame interpolation processing, the second video data comprising video data of a first video to be played in a first split-screen area and/or video data of a second video to be played in a second split-screen area;
    sending the second video data to the digital signal processor for frame interpolation processing.
  8. The method according to claim 7, wherein the second video data comprises the video data of the first video and the second video, and sending the second video data to the digital signal processor for frame interpolation processing comprises:
    determining the proportions of moving imagery in the first video and in the second video, respectively;
    determining interpolation rates for the first video and the second video according to the proportions of moving imagery, the interpolation rate being positively correlated with the proportion of moving imagery;
    sending the video data of the first video and of the second video to the digital signal processor for frame interpolation processing according to the interpolation rates.
  9. The method according to claim 7, wherein determining the second video data in the first video data that needs frame interpolation processing comprises:
    acquiring the frame rate of the first video data;
    determining, according to the screen refresh rate, that the first video data whose frame rate does not reach a preset frame rate is the second video data that needs frame interpolation processing.
  10. A video frame interpolation apparatus, applied to an electronic device, the apparatus comprising a processing unit and a communication unit, wherein
    the processing unit is configured to acquire a first data stream through the communication unit; to determine whether first video data and user interface data coexist in the first data stream; if so, to separate the first video data and the user interface data; and to process the first video data and the user interface data separately to complete frame interpolation processing of the first data stream.
  11. The apparatus according to claim 10, wherein, in determining whether first video data and user interface data coexist in the first data stream, the processing unit is specifically configured to: acquire the color coding schemes of all layers contained in the first data stream; determine the layer type of each layer according to its color coding scheme, the layer types comprising a video layer and a user interface layer; and determine, according to the layer types contained in the first data stream, whether first video data and user interface data coexist in the first data stream.
  12. The apparatus according to claim 11, wherein, in separating the first video data and the user interface data, the processing unit is specifically configured to: allocate the first video data to a first display layer post-processing unit; determine that the data in the first data stream other than the first video data is a second data stream, the second data stream comprising the user interface data; and allocate the second data stream to a second display layer post-processing unit.
  13. The apparatus according to any one of claims 10-12, wherein, in processing the first video data and the user interface data separately, the processing unit is specifically configured to: perform frame interpolation processing on the first video data through a digital signal processor; and perform decoding processing on the user interface data through the digital signal processor.
  14. The apparatus according to claim 13, wherein the electronic device is in a single-video-application playback scenario, and, in performing frame interpolation processing on the first video data through the digital signal processor, the processing unit is specifically configured to: determine the frame rate of the first video data and the screen refresh rate of the electronic device; determine a frame insertion scheme according to the frame rate and the screen refresh rate; and send the first video data to the digital signal processor for frame interpolation processing according to the frame insertion scheme.
  15. The apparatus according to claim 14, wherein the frame insertion scheme comprises any one of the following:
    performing frame interpolation on the entire video;
    performing frame interpolation once every few frames; and
    performing frame interpolation only for a certain time period of the first video data.
  16. The apparatus according to claim 13, wherein the electronic device is in a split-screen multi-video-application playback scenario, and, in performing frame interpolation processing on the first video data through the digital signal processor, the processing unit is specifically configured to: determine second video data in the first video data that needs frame interpolation processing, the second video data comprising video data of a first video to be played in a first split-screen area and/or video data of a second video to be played in a second split-screen area; and send the second video data to the digital signal processor for frame interpolation processing.
  17. The apparatus according to claim 16, wherein the second video data comprises the video data of the first video and the second video, and, in sending the second video data to the digital signal processor for frame interpolation processing, the processing unit is specifically configured to: determine the proportions of moving imagery in the first video and in the second video, respectively; determine interpolation rates for the first video and the second video according to the proportions of moving imagery, the interpolation rate being positively correlated with the proportion of moving imagery; and send the video data of the first video and of the second video to the digital signal processor for frame interpolation processing according to the interpolation rates.
  18. The apparatus according to claim 16, wherein, in determining the second video data in the first video data that needs frame interpolation processing, the processing unit is specifically configured to: acquire the frame rate of the first video data; and determine, according to the screen refresh rate, that the first video data whose frame rate does not reach a preset frame rate is the second video data that needs frame interpolation processing.
  19. An electronic device, comprising a processor, a memory, and one or more programs, the one or more programs being stored in the memory and configured to be executed by the processor, the programs comprising instructions for performing the steps of the method according to any one of claims 1-9.
  20. A computer-readable storage medium storing a computer program for electronic data exchange, wherein the computer program causes a computer to perform the method according to any one of claims 1-9.
PCT/CN2021/073974 2020-03-05 2021-01-27 Video frame interpolation method and related apparatus WO2021175049A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010148399.3 2020-03-05
CN202010148399.3A CN111327959A (zh) 2020-03-05 2020-03-05 Video frame interpolation method and related apparatus

Publications (1)

Publication Number Publication Date
WO2021175049A1 true WO2021175049A1 (zh) 2021-09-10

Family

ID=71171554

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/073974 WO2021175049A1 (zh) 2021-01-27 Video frame interpolation method and related apparatus

Country Status (2)

Country Link
CN (1) CN111327959A (zh)
WO (1) WO2021175049A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113766275A (zh) * 2021-12-07 Video clipping method and apparatus, terminal, and storage medium

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111327959A (zh) 2020-06-23 Video frame interpolation method and related apparatus
CN112565865A (zh) 2021-03-26 Image processing method and apparatus, and electronic device
CN112633236A (zh) 2021-04-09 Image processing method and apparatus, electronic device, and storage medium
CN113141537A (zh) 2021-07-20 Video frame interpolation method and apparatus, storage medium, and terminal
CN113835657A (zh) 2021-12-24 Display method and electronic device
CN113852776A (zh) 2021-12-28 Frame interpolation method and electronic device
CN113835656A (zh) 2021-12-24 Display method and apparatus, and electronic device
CN114338953A (zh) 2022-04-12 Video processing circuit, video processing method, and electronic device
CN114302209A (zh) 2022-04-08 Video processing method and apparatus, electronic device, and medium
CN114339313A (zh) 2022-04-12 Frame interpolation method and apparatus, and electronic device
CN114339411B (zh) 2023-12-26 Video processing method, apparatus, and device

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009135847A (ja) 2009-06-18 Video processing device and frame rate conversion method
CN102833518A (zh) 2012-12-19 Method and apparatus for optimized multi-picture configuration of an MCU
CN203313319U (zh) 2013-11-27 Display system
CN106933328A (zh) 2017-07-07 Method and apparatus for controlling the frame rate of a mobile terminal, and mobile terminal
US9703446B2 (en) 2017-07-11 Zooming user interface frames embedded image frame sequence
CN109275011A (zh) 2019-01-25 Processing method and apparatus for motion-mode switching of a smart television, and user equipment
CN109803175A (zh) 2019-05-24 Video processing method and apparatus, device, and storage medium
CN111327959A (zh) 2020-06-23 Video frame interpolation method and related apparatus

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5219646B2 (ja) 2013-06-26 Video processing device and control method for a video processing device
US9473826B2 (en) 2016-10-18 Method and apparatus for insertion of advertising in a live video stream
CN102685437B (zh) 2016-06-29 Video image compensation method and monitor
CN110086905B (zh) 2020-08-21 Video recording method and electronic device
CN108810281B (zh) 2020-12-11 Frame-loss compensation method and apparatus, storage medium, and terminal


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113766275A (zh) 2021-12-07 Video clipping method and apparatus, terminal, and storage medium
CN113766275B (zh) 2023-05-30 Video clipping method and apparatus, terminal, and storage medium

Also Published As

Publication number Publication date
CN111327959A (zh) 2020-06-23

Similar Documents

Publication Publication Date Title
WO2021175049A1 (zh) Video frame interpolation method and related apparatus
CN103841389B (zh) Video playback method and player
US9948884B2 (en) Converting method and converting apparatus for converting luminance value of an input video into a second luminance value
CN112822537B (zh) Method, device and medium for adapting video content to display characteristics
US10402953B2 (en) Display method and display device
US10511803B2 (en) Video signal transmission method and device
WO2020062744A1 (zh) Video watermark adding method and apparatus, electronic device, and storage medium
CN108235055B (zh) Method and device for implementing transparent video in an AR scene
WO2021004176A1 (zh) Image processing method and apparatus
CN105892976A (zh) Method and apparatus for implementing multi-screen interaction
CN102497388A (zh) Mobile network terminal and method for wireless screen transmission between the terminal and a television
WO2020107999A1 (zh) Image data transmission method and smart television
WO2018000676A1 (zh) Method and apparatus for configuring an image mode
WO2023035882A9 (zh) Video processing method and device, storage medium, and program product
CN112511896A (zh) Video rendering method and apparatus
CN104717509A (zh) Video decoding method and apparatus
KR20060022419A (ko) Graphic data generating apparatus, method, and information storage medium
TW200423744A (en) Apparatus and method for signal processing of format conversion and combination of video signals
US9094712B2 (en) Video processing device, display device and video processing method
WO2021217428A1 (zh) Image processing method and apparatus, photographing device, and storage medium
JP2009038685A (ja) Image signal output device and image signal output method
JP2009038682A (ja) Image processing device and image processing method
CN112738427B (zh) SM768 multi-channel video adaptive output method
JP2002152660A (ja) Video reproduction device and method
WO2017032116A1 (zh) Audio/video playback device and audio/video playback method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21763669

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21763669

Country of ref document: EP

Kind code of ref document: A1