WO2020108099A1 - Video processing method, device, electronic device and computer-readable medium - Google Patents

Video processing method, device, electronic device and computer-readable medium

Info

Publication number
WO2020108099A1
WO2020108099A1 (PCT/CN2019/110000)
Authority
WO
WIPO (PCT)
Prior art keywords
video
played
type
video file
application
Prior art date
Application number
PCT/CN2019/110000
Other languages
English (en)
French (fr)
Inventor
杨海 (Yang Hai)
Original Assignee
Oppo广东移动通信有限公司 (Guangdong OPPO Mobile Telecommunications Corp., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司 (Guangdong OPPO Mobile Telecommunications Corp., Ltd.)
Priority to EP19891124.0A priority Critical patent/EP3886445A4/en
Publication of WO2020108099A1 publication Critical patent/WO2020108099A1/zh
Priority to US17/331,480 priority patent/US11457272B2/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/25858 Management of client data involving client software characteristics, e.g. OS identifier
    • H04N21/42607 Internal components of the client for processing the incoming bitstream
    • H04N21/42653 Internal components of the client for processing graphics
    • H04N21/4312 Generation of visual interfaces involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N21/4341 Demultiplexing of audio and video streams
    • H04N21/44004 Processing of video elementary streams involving video buffer management, e.g. video decoder buffer or video display buffer
    • H04N21/4402 Processing of video elementary streams involving reformatting operations for household redistribution, storage or real-time display
    • H04N21/440218 Reformatting by transcoding between formats or standards, e.g. from MPEG-2 to MPEG-4
    • H04N21/440263 Reformatting by altering the spatial resolution, e.g. for displaying on a connected PDA
    • H04N21/442 Monitoring of processes or resources, e.g. detecting the failure of a recording device
    • H04N21/44204 Monitoring of content usage, e.g. the number of times a movie has been viewed
    • H04N21/44222 Analytics of user selections, e.g. selection of programs or purchase activity
    • H04N21/4424 Monitoring of the internal components or processes of the client device, e.g. CPU or memory load
    • H04N21/458 Scheduling content for creating a personalised stream; updating operations, e.g. for OS modules

Definitions

  • the present application relates to the technical field of video processing, and more specifically, to a video processing method, device, electronic device, and computer-readable medium.
  • the device needs to perform operations such as decoding, rendering, and compositing on the video, and then display it on the display screen.
  • when the graphics processor processes images, it can reduce the load pressure on the central processor; however, it occupies a large amount of memory, so it is important to choose the right processor to process the video.
  • This application proposes a video processing method, device, electronic device, and computer-readable medium to improve the above-mentioned defects.
  • an embodiment of the present application provides a video processing method applied to a central processor of an electronic device, where the electronic device further includes a screen and a graphics processor; the method includes: acquiring the video type of a video file to be played; determining whether the video type matches a specified type; and, if it matches the specified type, controlling the graphics processor to process the video file to be played and display it on the screen.
  • an embodiment of the present application further provides a video processing apparatus, which is applied to a central processor of an electronic device, and the electronic device further includes a screen and a graphics processor.
  • the video processing device includes: an acquisition unit, a judgment unit, and a processing unit.
  • the obtaining unit is used to obtain the video type of the video file to be played.
  • the judging unit is used to judge whether the video type conforms to the specified type.
  • the processing unit is configured to control the graphics processor to process the video file to be played and display it on the screen if the specified type is met.
  • an embodiment of the present application further provides an electronic device, including: a central processor and a graphics processor; a memory; a screen; and one or more application programs, wherein the one or more application programs are stored in the memory and configured to be executed by the central processor, and the one or more programs are configured to perform the above method.
  • an embodiment of the present application further provides a computer-readable storage medium.
  • the computer-readable storage medium stores program code, and the program code can be called by a processor to execute the above method.
  • in the solution provided in this application, after the video file to be played is obtained, the central processor or the graphics processor is selected to process it according to its video type and display it on the screen. Specifically, the type of the video file to be played is obtained, and it is judged whether that type matches a specified type; if it does, the graphics processor is controlled to process the video file to be played and display it on the screen. Thus, the graphics processor or the central processor is not selected by default for every video file; instead, the graphics processor is selected according to the type of the video file to be played, which makes the choice of the graphics processor more reasonable.
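The selection flow summarized above can be sketched as follows; the function name, the choice of "online" as the specified type, and the return values are illustrative assumptions, not the patent's literal implementation.

```python
# Sketch of the processor-selection flow (names and criteria are assumptions).

SPECIFIED_TYPES = {"online"}  # preset video types that are routed to the GPU

def select_processor(video_type: str) -> str:
    """Return which processor should handle the video file to be played."""
    if video_type in SPECIFIED_TYPES:
        return "gpu"   # matches the specified type: GPU processes and displays
    return "cpu"       # otherwise the CPU (or GPU) may process it

print(select_processor("online"))   # -> gpu
print(select_processor("offline"))  # -> cpu
```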
  • FIG. 1 shows a block diagram of a video playback architecture provided by an embodiment of the present application
  • FIG. 2 shows a block diagram of an image rendering architecture provided by an embodiment of the present application
  • FIG. 3 shows a method flowchart of a video processing method provided by an embodiment of the present application
  • FIG. 4 shows a method flowchart of a video processing method provided by another embodiment of the present application.
  • FIG. 5 shows a method flowchart of a video processing method provided by another embodiment of the present application.
  • FIG. 6 shows a method flowchart of a video processing method provided by yet another embodiment of the present application.
  • FIG. 7 shows a block diagram of a video processing device provided by an embodiment of the present application.
  • FIG. 8 shows a block diagram of a video processing apparatus provided by another embodiment of the present application.
  • FIG. 9 shows a structural block diagram of an electronic device provided by an embodiment of the present application.
  • FIG. 10 shows a storage unit for storing or carrying program codes for implementing a video processing method according to an embodiment of the present application.
  • FIG. 1 shows a block diagram of a video playback architecture.
  • the next job is to analyze the audio and video data.
  • General video files are composed of two parts: video stream and audio stream.
  • different video container formats package the audio and video streams differently.
  • the process of synthesizing audio and video streams into files is called muxer, while the process of separating audio and video streams from media files is called demuxer.
  • To play video files you need to separate audio and video streams from the file stream.
  • after decoding, the video frames can be rendered directly, and the audio frames can be sent to the buffer of the audio output device for playback.
  • the time stamps of video rendering and audio playback must be synchronized.
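The timestamp synchronization mentioned here can be illustrated with a minimal sketch in which the audio clock is the master and each decoded video frame's presentation timestamp (PTS) is compared against it; the threshold value and function name are assumptions.

```python
# Minimal audio/video sync sketch: the audio clock is the master, and each
# video frame's PTS is compared against it. Threshold is an assumption.

def sync_action(video_pts: float, audio_clock: float, threshold: float = 0.04) -> str:
    """Decide what to do with a decoded video frame relative to the audio clock."""
    diff = video_pts - audio_clock
    if diff < -threshold:
        return "drop"     # frame is late: discard it to catch up
    if diff > threshold:
        return "wait"     # frame is early: delay rendering
    return "render"       # within tolerance: render now

print(sync_action(1.00, 1.10))  # late frame -> drop
print(sync_action(1.10, 1.00))  # early frame -> wait
print(sync_action(1.00, 1.01))  # in sync -> render
```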
  • video decoding may include hard decoding and soft decoding.
  • hardware decoding hands over part of the video data that would originally be processed by the central processing unit (CPU) to the graphics processing unit (GPU); since the parallel computing capability of the GPU is much higher than that of the CPU, this can greatly reduce the load on the CPU, and with the CPU occupancy rate lowered, other programs can run at the same time.
  • hard decoding or soft decoding is selected according to the requirements.
  • the multimedia framework obtains the video file to be played by the client through an API interface with the client, and passes it to the video decoder.
  • the multimedia framework (Media Framework) is a multimedia framework in the Android system, MediaPlayer, MediaPlayerService and Stagefrightplayer constitute the basic framework of Android multimedia.
  • the multimedia framework part adopts the C/S structure.
  • MediaPlayer serves as the client terminal of the C/S structure.
  • MediaPlayerService and Stagefrightplayer serve as the server terminal of the C/S structure. They assume the responsibility of playing multimedia files.
  • Video Decode is a super decoder that integrates the most commonly used audio and video decoding and playback to decode video data.
  • soft decoding means that the CPU decodes the video through software; after decoding, the GPU is called to render and composite the video and display it on the screen. Hard decoding means that the video decoding task is completed independently by dedicated hardware without resorting to the CPU.
  • the decoded video data will be sent to the layer transfer module (SurfaceFlinger), and SurfaceFlinger will render and synthesize the decoded video data on the display screen.
  • SurfaceFlinger is an independent Service; it receives the Surfaces of all Windows as input, calculates the position of each Surface in the final composite image according to Z-order, transparency, size, position, and other parameters, and then hands the result to HWComposer or OpenGL to generate the final display buffer, which is then displayed on a specific display device.
  • in soft decoding, the CPU decodes the video data and hands it to SurfaceFlinger for rendering and compositing
  • in hard decoding, the video data is decoded by the GPU and then rendered and composited by SurfaceFlinger.
  • the SurfaceFlinger will call the GPU to render and synthesize the image and display it on the display.
  • the process of image rendering is shown in FIG. 2: the CPU obtains the video file to be played sent by the client, decodes it to obtain the decoded video data, and sends the video data to the GPU; after GPU rendering is completed, the rendering result is put into the frame buffer (FrameBuffer in FIG. 2), and the video controller then reads the data of the frame buffer line by line according to the HSync signal and passes it to the display after digital-to-analog conversion.
  • when the electronic device obtains the video file to be played, specifically, when the CPU obtains it, it may choose to decode the video file with the CPU and then send the decoded data to SurfaceFlinger for rendering, compositing, and display; alternatively, the CPU sends the video file to be played to the GPU, the image processing circuit of the GPU decodes it, and the result is sent to SurfaceFlinger for rendering and compositing before display.
  • the CPU rendering mode is generally adopted by default, but because the CPU still needs to handle a large number of other operations, using it for video rendering can inadvertently cause unnecessary waste of the electronic device's resources.
  • if the GPU rendering mode is adopted by default, although GPU processing of images can relieve the load pressure on the CPU, it occupies a large amount of memory, which makes a fixed default choice between the graphics processor and the central processor for handling video files unreasonable.
  • an embodiment of the present application provides a video processing method applied to an electronic device that includes a central processor, a screen, and a graphics processor; in this embodiment, the central processor is the execution subject, and the method includes steps S301 to S303.
  • when the client of the electronic device plays a video, the electronic device can obtain the video file to be played and then decode it. Specifically, either soft decoding or hard decoding described above can be used; afterwards, the multi-frame image data to be rendered corresponding to the video file can be obtained, and the multi-frame image data must then be rendered before it can be displayed on the display screen.
  • the electronic device includes a central processor and a graphics processor; as a specific implementation of acquiring the multi-frame image data to be rendered corresponding to a video file, the central processor acquires the video file to be played sent by a client.
  • the central processor obtains a video playback request sent by the client; the video playback request includes the video file to be played, specifically the identity information of the video file to be played, which may be the name of the video file; based on this identity information, the video file can be found in the storage space where it is stored.
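As a toy illustration of locating the video file by its identity information, the following looks up a stored path by file name; the dictionary, file names, and paths are hypothetical.

```python
# Hypothetical lookup of a to-be-played video file by its identity information
# (here, the file name) in the storage space where video files are kept.

VIDEO_STORE = {
    "demo.mp4": "/storage/videos/demo.mp4",
    "chat.3gp": "/storage/videos/chat.3gp",
}

def find_video(identity):
    """Return the stored path for the requested video file, or None if absent."""
    return VIDEO_STORE.get(identity)

print(find_video("demo.mp4"))  # -> /storage/videos/demo.mp4
```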
  • the video playback module in the electronic device is called to parse and decode the video file to be played.
  • the client has an icon on the system desktop, and the user can open the client by clicking the icon; the application the user clicked can be confirmed, for example, from its package name.
  • the package name of the video application can be obtained from the code in the background of the system.
  • the package name format is: com.android.video.
  • the video list interface of the client displays the display content corresponding to multiple videos.
  • the display content corresponding to multiple videos includes a thumbnail corresponding to each video.
  • the thumbnail can be used as a touch button; when the user clicks a thumbnail, the client can detect which thumbnail was clicked and thereby determine the video file to be played.
  • the client responds to the video selected by the user in the video list and enters the video playback interface, and the user clicks the play button on the playback interface.
  • the client can detect which video the user is currently clicking by monitoring the user's touch operation; specifically, the play button is set with a preset attribute, and the video to be played selected by the user can be determined by detecting the attribute of the play button corresponding to the acquired touch operation.
  • the client is a video chat client.
  • the generated video file is a video file to be played
  • the video file to be played is video data generated during voice chat.
  • each video in the video category interface of the client corresponds to a video file to be played.
  • the central processor detects the current video playback request, which may be generated when the user triggers the play button that the client sets for the video file.
  • the video type of the video file to be played is determined.
  • the video type may be a video file format, for example, mpg, mpeg, dat, mp4, 3gp, mov, rm, ram, rmvb, wmv, etc.
  • the video type can also be distinguished according to whether the video file is played online; for example, video types can be divided into online video and offline video, where an online video is a video the client plays online and an offline video is a video file pre-stored locally on the electronic device that the client plays.
  • the video type can also be distinguished according to the resolution of the video, for example, standard definition, high definition, super definition, etc.; for example, a video file with a physical resolution below 1280×720 is of the standard-definition type.
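The resolution-based classification can be sketched as follows; the 1280×720 boundary follows the example above, while the 1920×1080 boundary and the tier names are assumptions.

```python
# Illustrative classification of a video by physical resolution; the 1280x720
# boundary follows the text, the 1920x1080 boundary is an assumption.

def resolution_class(width: int, height: int) -> str:
    if width * height < 1280 * 720:
        return "standard definition"
    if width * height < 1920 * 1080:
        return "high definition"
    return "super definition"

print(resolution_class(640, 480))    # -> standard definition
print(resolution_class(1920, 1080))  # -> super definition
```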
  • the specified type is a preset video type for video files to be processed by the graphics processor; since the criteria for setting the video type of the video file to be played differ, the criteria for judging whether the video type matches the specified type differ accordingly.
  • the video type of the video file to be played may be the format of the video file, and the specified type may then be a specified video file format, for example, MP4; if the video type of the video file to be played is the MP4 format, the video type is determined to be the specified type, and otherwise it is determined not to be the specified type.
  • the video type of the video file to be played may be a type divided according to the resolution of the video, such as standard definition, high definition, or super definition
  • the specified type may be a specified resolution, for example, super definition
  • the video type of the video file to be played may also be online video or offline video, and the specified type may be online video; if the video type of the video file to be played is online video, the video type is determined to be the specified type, and otherwise it is determined that the video type is not the specified type.
  • S303: control the graphics processor to process the video file to be played and display it on the screen.
  • if the video type of the video file to be played matches the specified type, the graphics processor is controlled to process the video file and display it on the screen; if it does not match, either the graphics processor or the central processor may be controlled to process the video file to be played and display it on the screen. In other words, the central processor can be used to decode the video file to be played, or the graphics processor can be used to decode it.
  • the graphics processor is controlled to process the video file to be played and then display it on the screen, that is, the graphics processor is called to decode the video file to be played; specifically, the central processor sends the video file to be played to the graphics processor, and after acquiring it, the graphics processor decodes, renders, and composites the video file and displays it on the screen.
  • the graphics processor decodes the video file to be played, thereby obtaining multi-frame image data corresponding to the video file to be played.
  • the central processor calls the playback module to parse the video file to be played, so as to obtain the video stream and the audio stream corresponding to the video file to be played.
  • the playback module can be the MediaExtractor module in the Android system, or it can be the FFmpeg module.
  • the FFmpeg module is an open-source cross-platform video and audio streaming framework. It is free software and uses the LGPL or GPL license (depending on the choice of components). It provides a complete solution for recording, converting, and streaming audio and video, and contains the rich audio/video codec library libavcodec.
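As a hedged sketch of the demultiplexing step, the following builds (without executing) FFmpeg command lines that stream-copy the video and audio elementary streams out of a container; the file names are placeholders.

```python
# Build (but do not run) FFmpeg commands that split a container into its
# audio and video elementary streams via stream copy. File names are placeholders.

def demux_commands(src: str) -> list:
    video_cmd = ["ffmpeg", "-i", src, "-an", "-c:v", "copy", "video_stream.mp4"]
    audio_cmd = ["ffmpeg", "-i", src, "-vn", "-c:a", "copy", "audio_stream.m4a"]
    return [video_cmd, audio_cmd]

for cmd in demux_commands("input.mp4"):
    print(" ".join(cmd))
```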
  • the central processor sends the video stream to the graphics processor.
  • the graphics processor obtains the multi-frame image data corresponding to the video stream and then synthesizes it. Specifically, the synthesis may be performed in the frame buffer shown in FIG. 2, that is, the multi-frame image data is rendered and synthesized by on-screen rendering, or it is rendered and synthesized by off-screen rendering.
  • an off-screen rendering buffer is set in the GPU in advance.
  • the GPU calls the rendering client module to render and synthesize the multi-frame image data to be rendered and sends the result to the display screen for display. The rendering client module can be an OpenGL module.
  • The final destination of the OpenGL rendering pipeline is the frame buffer.
  • A frame buffer is a series of two-dimensional pixel storage arrays, comprising a color buffer, a depth buffer, a stencil buffer and an accumulation buffer.
  • OpenGL uses the frame buffer provided by the window system.
  • OpenGL's GL_ARB_framebuffer_object extension provides a way to create additional frame buffer objects (Frame Buffer Object, FBO). Using a frame buffer object, OpenGL can redirect rendering originally drawn to the window's frame buffer to the FBO.
  • the off-screen rendering buffer may correspond to a storage space of the graphics processor; that is, the off-screen rendering buffer itself has no space for storing images, but after being mapped to a storage space in the graphics processor, the images are actually stored in the storage space in the graphics processor corresponding to the off-screen rendering buffer.
  • the multi-frame image data can be stored in the off-screen rendering buffer, that is, the multi-frame image data can be found in the off-screen rendering buffer.
  • the multi-frame image data can be rendered in the off-screen rendering buffer.
  • display enhancement processing may be applied to the multi-frame image data, for example image parameter optimization of the multi-frame image data in the off-screen rendering buffer, where the image parameter optimization includes at least one of exposure enhancement, denoising, edge sharpening, contrast increase, or saturation increase.
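As a rough illustration of the per-frame image parameter optimization named above, the sketch below applies a simple exposure and contrast adjustment to luma samples. The operations and their parameters are assumptions; in the described architecture these would run as GPU passes over the off-screen rendering buffer, not on the CPU.

```python
# Illustrative sketch of image parameter optimization (exposure enhancement
# and contrast increase) on 0-255 samples. Parameter values are assumed.
def enhance_frame(pixels, exposure=1.1, contrast=1.05):
    """Apply an exposure and contrast boost, clamping results to 0-255."""
    out = []
    for p in pixels:
        v = p * exposure                     # exposure enhancement
        v = (v - 128.0) * contrast + 128.0   # contrast increase around mid-gray
        out.append(max(0, min(255, round(v))))
    return out
```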
  • the frame buffer corresponds to the screen and is used to store data that needs to be displayed on the screen, such as the Framebuffer shown in FIG. 2. The Framebuffer is a driver interface in the operating system kernel. Taking the Linux system as an example, Linux works in protected mode, so a user-mode process cannot use the interrupt calls provided in the graphics card BIOS to write data directly to the screen; instead, Linux abstracts the Framebuffer device, which user processes use to write data directly for display on the screen. The Framebuffer mechanism imitates the function of the graphics card, and the video memory can be operated directly by reading and writing the Framebuffer. Specifically, the Framebuffer can be regarded as an image of the display memory; after it is mapped into the process address space, it can be read and written directly, and the written data is displayed on the screen.
  • the frame buffer can be regarded as a space for storing data.
  • the CPU or GPU puts the data to be displayed into the frame buffer, and the Framebuffer itself does not have any ability to calculate data.
  • the video controller reads the data in the Framebuffer according to the screen refresh frequency and displays it on the screen.
  • The multi-frame image data is read from the frame buffer and displayed on the screen. Specifically, after the multi-frame image data is stored in the frame buffer and the graphics processor detects that data has been written to the frame buffer, the optimized multi-frame image data is read from the frame buffer and displayed on the screen.
  • the graphics processor reads multiple frames of image data from the frame buffer frame by frame according to the refresh rate of the screen, and displays them on the screen after rendering and synthesis processing.
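The refresh-rate-paced readout above can be modeled very simply. This is a timing sketch only (names and the timing model are assumptions); real hardware paces frame-buffer reads with the VSync/HSync signals mentioned elsewhere in this document.

```python
# Minimal sketch: frames read out frame by frame at the screen refresh rate.
def frames_to_display(buffered_frames, refresh_hz, duration_s):
    """Return the frames the controller reads over `duration_s` seconds."""
    n = int(refresh_hz * duration_s)  # one frame per refresh cycle
    return buffered_frames[:n]
```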
  • FIG. 4 Another embodiment of the present application provides a video processing method, which is applied to an electronic device.
  • the electronic device further includes a central processor, a screen, and a graphics processor.
  • the central processor is used as an execution subject, and the method includes: S401 to S404.
  • S402 Determine the real-time level of the video file to be played according to the video type.
  • For example, the real-time level of online video is set to level 1 and the real-time level of offline video is set to level 2, where level 1 is higher than level 2.
  • the video type may also correspond to the type of application program.
  • the application program includes video software, social software, and game software
  • the video type may also include video software video, social software video, and game software video.
  • A specific implementation of determining the real-time level of the video file to be played according to the video type is to determine it according to the category of the application program corresponding to the video file to be played. Specifically, the identifier of the application corresponding to the video file to be played is determined, and the real-time level of the video file is then determined according to that identifier; that is, the identifier of the target application program that sent the playback request for the video file is determined, and then the category of the application program corresponding to that identifier is determined.
  • The category of the target application is determined according to its identifier, where the identifier of the target application may be the package name, name, etc. of the application. For example, the correspondence between application identifiers and application categories is pre-stored in the electronic device.
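The pre-stored correspondence described above amounts to a lookup from the application identifier (such as its package name) to a category. A minimal sketch, assuming hypothetical package names and category labels not present in the source:

```python
# Hypothetical identifier -> category table, mirroring the pre-stored
# correspondence between application identifiers and application categories.
APP_CATEGORY = {
    "com.example.video": "video software",
    "com.example.chat": "social software",
    "com.example.game": "game software",
}

def category_of(package_name: str) -> str:
    """Look up the category of the target application by its identifier."""
    return APP_CATEGORY.get(package_name, "unknown")
```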
  • the game screen of the game software includes images and music.
  • the music may be game music, game sound effects, etc.
  • the game sound effects may be gunshots or footsteps.
  • the audio data may be a game sound effect
  • the application corresponding to the game sound effect is a certain game APP
  • the category it belongs to is a game type
  • The video type of the video data to be played is determined according to the category of the application; specifically, the category of the application program can be used as the video type of the video data to be played. For example, if the category of the application program is game, the video type of the video data to be played is also game.
  • The category of an application may be a category set by the developer when the application is developed, or a category set by the user after the application is installed on the electronic device. For example, when the user installs an application on the electronic device, and enters it for the first time after installation is complete, a dialog box is displayed instructing the user to set a category for the application.
  • the specific category to which the application belongs can be set by the user according to requirements. For example, the user can set a social software as audio, video, or social.
  • The category of the application can also be determined according to its usage records, that is, according to the application's usage records within a certain period of time, it is determined whether the user is more inclined to play video or to play audio with the application.
  • The operation behavior data of all users of the application within a preset time period is obtained, where all users means all users who have installed the application; the operation behavior data can therefore be obtained from the server corresponding to the application. That is, when a user uses the application, the user logs in with a user account, and the operation behavior data corresponding to that account is sent to the server corresponding to the application; the server stores the obtained operation behavior data against the user account.
  • the electronic device sends a query request for the operation behavior of the application to the server corresponding to the application, and the server sends the operation behavior data of all users within a certain preset time period to the electronic device.
  • the operation behavior data includes the name and time of the audio file played, and the name and time of the video file played.
  • The number and total playing duration of audio files, and the number and total playing duration of video files played by the application, can also be obtained, and the category of the application is determined according to the proportions of the total audio and video playing durations within the preset time period. Specifically, for convenience of description, the proportion of the total audio playing duration within the preset time period is recorded as the audio playback proportion, and the proportion of the total video playing duration within the preset time period is recorded as the video playback proportion. If the video playback proportion is greater than the audio playback proportion, the category of the application is set to the video type; if the audio playback proportion is greater than the video playback proportion, the category is set to the audio type. For example, if the preset time period is 30 days, that is, 720 hours, and the total audio playback duration is 200 hours, the audio playback proportion is 27.8%; if the total video playback duration is 330 hours, the video playback proportion is 45.8%. The video playback proportion is greater than the audio playback proportion, so the category of the application is set to the video type.
  • the real-time level corresponding to the video data to be played is determined.
  • the electronic device stores the real-time level corresponding to the type of application program, as shown in the following table:
  • the real-time level corresponding to the video file to be played can be determined.
  • S403 Determine whether the real-time level is higher than the specified level.
  • If the real-time level is higher than the specified level, it is determined that the video type is the specified type; if it is lower than or equal to the specified level, it is determined that the video type is not the specified type.
  • Depending on how the real-time levels are set, the implementation for determining whether the real-time level is higher than the specified level also differs.
  • the specified level is a real-time level corresponding to a preset type that needs to reduce audio playback delay, which may be set by the user according to needs.
  • For example, the preset level is J2 and above. If the real-time level corresponding to the video file to be played is J1, the real-time level of the video file to be played is higher than the specified level, where level J1 is higher than level J2; that is, in Table 1 above, the smaller the number after J, the higher the level.
  • For example, the real-time level of online video is set to level 1 and the real-time level of offline video is set to level 2, where level 1 is higher than level 2.
  • The specified level may be level 2, and the level higher than level 2 is level 1; that is, if the video file to be played is an online video, the real-time level is determined to be higher than the specified level; otherwise, the real-time level is determined to be lower than or equal to the specified level.
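Because a smaller level number means a higher real-time level, "higher than the specified level" is a numerically smaller value. A minimal sketch of this comparison, with names assumed for illustration:

```python
# Sketch of the real-time-level comparison: online video is level 1, offline
# video level 2, and a smaller number means a HIGHER real-time level.
SPECIFIED_LEVEL = 2  # the level assigned to offline video in this example

def is_above_specified(level: int) -> bool:
    """True when the real-time level is higher than the specified level."""
    return level < SPECIFIED_LEVEL
```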
  • S404 Control the graphics processor to process the video file to be played and display it on the screen.
  • FIG. 5 shows a video processing method provided by an embodiment of the present application.
  • S502 Determine whether the video type meets the specified type.
  • The image size of the video file to be played may include the image data size and the image dimensions, where the image data size is the data size of a specified frame image of the video file, that is, the storage space occupied by the specified frame image; for example, if the specified frame image occupies 1 MB, the image data size is 1 MB. The data size of the specified frame image can be an arithmetic value over the data sizes of all frame images of the video file to be played, where the arithmetic value can be the average, minimum or maximum; it can also be the data size of the first frame image of the video file to be played, or the average, minimum or maximum of the data sizes of all key frame images of the video file to be played.
  • The video file to be played may be an online video file, and the data size of the designated frame image may be an arithmetic value over the data sizes of all frame images of the currently playing portion of the video file to be played.
  • the image size may be the physical resolution of the video file to be played, that is, the image resolution of the video file to be played.
  • S504 Determine whether the image size is greater than a threshold.
  • the specific implementation method for determining whether the image size is greater than the threshold is to determine whether the image data size is greater than the threshold. If it is greater than the threshold, it is determined that the image size meets the specified conditions, and step S505 is executed. If it is less than or equal to the threshold, it is determined that the image size does not meet the specified condition, and step S505 is not performed, where the threshold can be set according to specific usage requirements, and will not be repeated here.
  • a specific implementation manner for determining whether the image size meets the specified condition is to determine whether the image size is greater than the specified image size. If it is larger than the specified image size, it is determined that the image size meets the specified condition, and step S505 is executed. If it is less than or equal to the specified image size, it is determined that the image size does not satisfy the specified condition, and step S505 is not performed.
  • the designated image size can be set according to actual use, for example, the image size can be a resolution of 1280 ⁇ 720. If the image size of the video file to be played is greater than the resolution of 1280 ⁇ 720, it is determined that the image size meets the specified condition, and step S505 is executed. If the image size of the video file to be played is less than or equal to the resolution of 1280 ⁇ 720, it is determined that the image size does not satisfy the specified condition, and step S505 is not executed.
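The resolution check above can be sketched as a single comparison against the designated 1280×720 size. The comparison by pixel count is an assumption for illustration; the text only requires that the image size be "greater than" the designated size.

```python
# Sketch of the designated-image-size check, with 1280x720 as the designated
# resolution from the example above. Helper names are illustrative.
DESIGNATED = (1280, 720)

def exceeds_designated_size(width: int, height: int) -> bool:
    """True when the image resolution is greater than the designated one."""
    return width * height > DESIGNATED[0] * DESIGNATED[1]
```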
  • S505 Control the graphics processor to process the video file to be played and display it on the screen.
  • S506 Control the central processor to process the video file to be played and display it on the screen.
  • A specific implementation of the central processor processing the video file to be played and displaying it on the screen may be to let the CPU decode the video, and after decoding, call the GPU to render and synthesize the video and display it on the screen. On-screen rendering or off-screen rendering may then be used to render and synthesize the multi-frame image data obtained from the CPU's decoding and display it on the screen.
  • If the video type is not the specified type, the central processor is controlled to process the video file to be played and display it on the screen; that is, the central processor is used to process and display the video file to be played. In other embodiments, when the video type is not the specified type, the processor can be freely selected, that is, either the central processor or the graphics processor may be controlled to process the video file to be played and display it on the screen.
  • FIG. 6 shows a method flowchart of a video processing method provided by an embodiment of the present application; the method includes S601 to S604.
  • S602 Determine whether the video type meets the specified type.
  • S603 Control the graphics processor to process the video file to be played and display it on the screen.
  • S604 Control the central processor to process the video file to be played and display it on the screen.
  • If the video type is the specified type, the graphics processor is controlled to process the video file to be played and display it on the screen; if the video type is not the specified type, the central processor is controlled to process the video file to be played and display it on the screen. Specifically, the graphics processor is controlled to process and display the video file when the type is the specified type, and the central processor is controlled to process and display it otherwise.
  • When using the central processor to play video files, in addition to processing the video file, the central processor also needs to execute other operation instructions of the electronic device, and each running application program occupies certain central processor resources. The resulting CPU usage may make the CPU's current load too high for it to be suitable for processing the video file.
  • An implementation of controlling the central processor to process the video file to be played and display it on the screen may be: if the video type is not the specified type, obtain the usage rate of the central processor; determine whether the usage rate of the central processor is less than a usage threshold; if it is less than the usage threshold, control the central processor to process the to-be-played video file and display it on the screen; if it is greater than or equal to the usage threshold, control the graphics processor to process the to-be-played video file and display it on the screen.
  • the utilization rate of the central processing unit can be obtained by viewing the task manager of the electronic device.
  • the CPU utilization rate is obtained through the adb shell command.
  • the usage threshold may be a usage rate set by a user.
  • For example, the usage threshold may be 60%. Assuming the current CPU usage rate is 40%, then 40% is less than 60% and the CPU usage rate is determined to be less than the usage threshold; if the current CPU usage rate is 70%, then 70% is greater than 60% and the usage rate of the central processor is determined to be greater than the usage threshold.
  • If the CPU usage is less than the usage threshold, it means the current CPU resources are relatively abundant, and the CPU can be used to process the video file to be played; if the CPU usage is greater than or equal to the usage threshold, it means the CPU resources are currently scarce, and the GPU can be used to process the video file to be played.
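The usage-threshold decision just described is a single comparison. A minimal sketch, using the 60% threshold from the worked example (the function name and the assumption that the threshold is user-configurable are illustrative):

```python
# Sketch of the usage-threshold decision: a lightly loaded CPU processes the
# video itself; otherwise the video is handed to the GPU.
USAGE_THRESHOLD = 60.0  # percent; assumed to be user-configurable

def choose_processor(cpu_usage_percent: float) -> str:
    return "cpu" if cpu_usage_percent < USAGE_THRESHOLD else "gpu"
```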
  • In addition, when the CPU usage rate is greater than or equal to the usage threshold, the current usage rate of each opened application can be obtained to determine whether any currently opened application matches a preset application, where a preset application is one that the system is allowed to close without asking the user for authorization. If such an application exists, the application matching the preset application is closed before the current CPU usage rate is acquired again as the CPU usage rate, and the operation of determining whether the usage rate of the central processing unit is less than the usage threshold is performed again.
  • Specifically, a list of preset application programs is pre-stored in the electronic device, and the list includes the identifiers of a plurality of specified applications, where a specified application is one that the user has authorized the system to close; for example, the user may manually input the identifiers of the specified applications. The processes of applications that the system is thus allowed to close are then killed without further user confirmation, freeing up certain CPU resources. If the CPU usage rate is then less than the usage threshold, the central processing unit is used to process the video; if it is still greater than or equal to the usage threshold, the graphics processor is used to process the video.
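The close-then-recheck flow above can be sketched as follows. The preset list contents, the usage model (each app contributing a share of CPU usage), and all names are assumptions for illustration only.

```python
# Illustrative sketch: free CPU by closing applications the user has
# pre-authorized the system to kill, then re-check the usage threshold.
PRESET_CLOSABLE = {"com.example.cleaner", "com.example.news"}
USAGE_THRESHOLD = 60.0  # percent

def decide_after_cleanup(open_apps, cpu_usage):
    """open_apps maps package name -> its CPU usage contribution (percent)."""
    for name in list(open_apps):
        if name in PRESET_CLOSABLE:           # matches the preset list
            cpu_usage -= open_apps.pop(name)  # killing it frees its share
    return "cpu" if cpu_usage < USAGE_THRESHOLD else "gpu"
```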
  • FIG. 7 shows a structural block diagram of a video processing device provided by an embodiment of the present application.
  • the video processing device 700 includes: an obtaining unit 701, a judging unit 702, and a processing unit 703.
  • the obtaining unit 701 is used to obtain the video type of the video file to be played.
  • the judging unit 702 is used to judge whether the video type conforms to the specified type.
  • the processing unit 703 is configured to control the graphics processor to process the video file to be played and display it on the screen if it is a specified type.
  • FIG. 8 shows a structural block diagram of a video processing device provided by an embodiment of the present application.
  • the video processing device 800 includes: an obtaining unit 801, a judging unit 802, a processing unit 803, and a display unit 804.
  • the obtaining unit 801 is used to obtain the video type of the video file to be played.
  • the judging unit 802 is used to judge whether the video type conforms to the specified type.
  • the judging unit 802 is further configured to determine the real-time level of the video file to be played according to the video type; determine whether the real-time level is higher than a specified level; It is the specified type; if it is lower than or equal to the specified level, it is determined that the video type is not the specified type.
  • the processing unit 803 is configured to control the graphics processor to process the video file to be played and display it on the screen if it is a specified type.
  • the processing unit 803 is further configured to acquire the image size of the video file to be played; determine whether the image size is greater than a threshold; if it is greater than the threshold, control the graphics processor to process the video file to be played Displayed on the screen.
  • processing unit 803 is further configured to control the central processor to process the video file to be played and display it on the screen if it is not greater than the threshold.
  • The processing unit 803 is further used to determine whether the image resolution is greater than the specified resolution; if it is greater than the specified resolution, it is determined that the image size is greater than the threshold; if it is less than or equal to the specified resolution, it is determined that the image size does not satisfy the specified condition.
  • the display unit 804 is configured to control the central processor to process the video file to be played and display it on the screen if it is not a designated type.
  • The display unit 804 is also used to obtain the usage rate of the central processor if the video type is not the specified type; determine whether the usage rate of the central processor is less than a specified value; if it is less than the specified value, control the central processor to process the to-be-played video file and display it on the screen; if it is greater than or equal to the specified value, control the graphics processor to process the to-be-played video file and display it on the screen.
  • the coupling between the modules may be electrical, mechanical, or other forms of coupling.
  • each functional module in each embodiment of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules may be integrated into one module.
  • the above integrated modules may be implemented in the form of hardware or software function modules.
  • the electronic device 100 may be an electronic device capable of running a client, such as a smart phone, a tablet computer, or an e-book reader.
  • the electronic device 100 in this application may include one or more of the following components: a processor 110, a memory 120, a screen 140, and one or more clients, where the one or more clients may be stored in the memory 120 and configured to be executed by the one or more processors 110, and one or more programs are configured to perform the method described in the foregoing method embodiments.
  • the processor 110 may include one or more processing cores.
  • the processor 110 connects various parts of the entire electronic device 100 by using various interfaces and lines, and executes the various functions of the electronic device 100 and processes data by running or executing the instructions, programs, code sets or instruction sets stored in the memory 120 and calling the data stored in the memory 120.
  • the processor 110 may be implemented in at least one hardware form of digital signal processing (Digital Signal Processing, DSP), field-programmable gate array (Field-Programmable Gate Array, FPGA), and programmable logic array (Programmable Logic Array, PLA).
  • the processor 110 may include one or a combination of a central processor 111 (Central Processing Unit, CPU), a graphics processor 112 (Graphics Processing Unit, GPU), a modem, and the like.
  • the modem is used to handle wireless communication. It can be understood that the modem may not be integrated into the processor 110 and may instead be implemented by a separate communication chip.
  • the memory 120 may include random access memory (RAM) or read-only memory (ROM).
  • the memory 120 may be used to store instructions, programs, codes, code sets, or instruction sets.
  • the memory 120 may include a storage program area and a storage data area, where the storage program area may store instructions for implementing an operating system and instructions for implementing at least one function (such as a touch function, a sound playback function, an image playback function, etc.) , Instructions for implementing the following method embodiments.
  • the storage data area may also store data created by the electronic device 100 in use (such as a phone book, audio and video data, chat history data), and the like.
  • the screen 140 is used to display information input by the user, information provided to the user, and the various graphical user interfaces of the electronic device. These graphical user interfaces may be composed of graphics, text, icons, numbers, video, and any combination thereof. In one example, a touch screen may be disposed on the display panel so as to form a whole with the display panel.
  • FIG. 10 shows a structural block diagram of a computer-readable storage medium provided by an embodiment of the present application.
  • the computer-readable storage medium 1000 stores program codes, and the program codes can be called by a processor to execute the method described in the above method embodiments.
  • the computer-readable storage medium 1000 may be an electronic memory such as flash memory, EEPROM (Electrically Erasable Programmable Read Only Memory), EPROM, hard disk, or ROM.
  • the computer-readable storage medium 1000 includes a non-transitory computer-readable storage medium.
  • the computer-readable storage medium 1000 has a storage space for the program code 1010 to perform any method steps in the above method. These program codes can be read from or written into one or more computer program products.
  • the program code 1010 may be, for example, compressed in an appropriate form.


Abstract

This application discloses a video processing method, apparatus, electronic device and computer-readable medium, relating to the field of video processing technology. The method includes: obtaining the video type of a video file to be played; determining whether the video type is a specified type; and, if it is the specified type, controlling the graphics processor to process the video file to be played and display it on the screen. In this application, the graphics processor is selected to process the video file to be played according to the type of the video file, making the selection of the graphics processor more reasonable.

Description

Video processing method and apparatus, electronic device and computer-readable medium
Cross-reference to related applications
This application claims priority to the Chinese patent application No. CN 201811428013.3, entitled "Video processing method and apparatus, electronic device and computer-readable medium", filed with the China Patent Office on November 27, 2018, the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of video processing technology, and more specifically, to a video processing method and apparatus, an electronic device and a computer-readable medium.
Background
With the development of electronic and information technology, more and more devices can play video. During video playback, a device needs to decode, render and synthesize the video before displaying it on the screen. In existing video playback technology, although having the graphics processor process images can relieve the load on the central processor, it occupies a large amount of memory; therefore, choosing a processor to process the video reasonably is critical.
Summary
This application proposes a video processing method and apparatus, an electronic device and a computer-readable medium to remedy the above drawbacks.
In a first aspect, an embodiment of this application provides a video processing method applied to the central processor of an electronic device, the electronic device further including a screen and a graphics processor. The method includes: obtaining the video type of a video file to be played; determining whether the video type matches a specified type; and, if it matches the specified type, controlling the graphics processor to process the video file to be played and display it on the screen.
In a second aspect, an embodiment of this application further provides a video processing apparatus applied to the central processor of an electronic device, the electronic device further including a screen and a graphics processor. The video processing apparatus includes an obtaining unit, a judging unit and a processing unit. The obtaining unit is used to obtain the video type of a video file to be played. The judging unit is used to judge whether the video type matches a specified type. The processing unit is used to control the graphics processor to process the video file to be played and display it on the screen if the type matches the specified type.
In a third aspect, an embodiment of this application further provides an electronic device including: a central processor and a graphics processor; a memory; a screen; and one or more application programs, where the one or more application programs are stored in the memory and configured to be executed by the central processor, the one or more programs being configured to perform the above method.
In a fourth aspect, an embodiment of this application further provides a computer-readable storage medium storing program code that can be called by a processor to perform the above method.
In the solution provided by this application, after the video file to be played is obtained, either the central processor or the graphics processor is selected according to the video type to process the video file to be played and display it on the screen. Specifically, the type of the video file to be played is obtained, and it is determined whether the type matches a specified type; if it matches the specified type, the graphics processor is controlled to process the video file to be played and display it on the screen. Thus, neither the graphics processor nor the central processor is used by default to process the video file to be played; instead, the graphics processor is selectively chosen according to the type of the video file to be played, making the selection of the graphics processor more reasonable.
Brief Description of the Drawings
To explain the technical solutions in the embodiments of this application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of this application; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 shows a block diagram of a video playback architecture provided by an embodiment of this application;
FIG. 2 shows a block diagram of an image rendering architecture provided by an embodiment of this application;
FIG. 3 shows a method flowchart of a video processing method provided by an embodiment of this application;
FIG. 4 shows a method flowchart of a video processing method provided by another embodiment of this application;
FIG. 5 shows a method flowchart of a video processing method provided by yet another embodiment of this application;
FIG. 6 shows a method flowchart of a video processing method provided by still another embodiment of this application;
FIG. 7 shows a module block diagram of a video processing apparatus provided by an embodiment of this application;
FIG. 8 shows a module block diagram of a video processing apparatus provided by another embodiment of this application;
FIG. 9 shows a structural block diagram of an electronic device provided by an embodiment of this application;
FIG. 10 shows a storage unit of an embodiment of this application for saving or carrying program code implementing the video processing method according to the embodiments of this application.
Detailed Description
To enable those skilled in the art to better understand the solutions of this application, the technical solutions in the embodiments of this application are described clearly and completely below with reference to the drawings in the embodiments of this application.
Referring to FIG. 1, a block diagram of a video playback architecture is shown. Specifically, after the operating system obtains the data to be played, the next task is to parse the audio and video data. A typical video file consists of two parts, a video stream and an audio stream, and the container formats for audio and video differ between video formats. The process of combining an audio stream and a video stream into a file is called muxing; conversely, the process of separating the audio stream and video stream from a media file is called demuxing. Playing a video file requires separating the audio stream and video stream from the file stream and decoding each of them; decoded video frames can be rendered directly, and audio frames can be sent to the buffer of the audio output device for playback. Of course, the timestamps of video rendering and audio playback must be kept synchronized.
Specifically, video decoding can include hard decoding and soft decoding. Hardware decoding hands part of the video data that would otherwise all be processed by the central processing unit (CPU) to the graphics processing unit (GPU), whose parallel computing capability is far higher than that of the CPU; this can greatly reduce the load on the CPU, and once the CPU usage is lower, some other programs can run at the same time. Of course, for a better processor, such as an i5 2320 or any AMD quad-core processor, hard decoding or soft decoding is selected according to requirements.
Specifically, as shown in FIG. 1, the multimedia framework obtains the video file that the client wants to play through an API interface with the client and hands it to the video decoder. Here, the Media Framework is the multimedia framework of the Android system; MediaPlayer, MediaPlayerService and Stagefrightplayer constitute the basic framework of Android multimedia. The multimedia framework adopts a client/server structure: MediaPlayer serves as the client, while MediaPlayerService and Stagefrightplayer serve as the server, which is responsible for playing multimedia files; through Stagefrightplayer, the server completes the client's requests and responds to them. Video Decode is a super decoder integrating the most commonly used audio and video decoding and playback, used to decode video data.
Soft decoding means letting the CPU decode the video through software, then calling the GPU to render and merge the video and display it on the screen. Hard decoding means completing the video decoding task independently through dedicated daughter-card hardware without involving the CPU.
Whether hard or soft decoding is used, after the video data is decoded, the decoded video data is sent to the layer transfer module (SurfaceFlinger), which renders and synthesizes the decoded video data and displays it on the display screen. SurfaceFlinger is an independent Service; it receives the Surfaces of all Windows as input, calculates the position of each Surface in the final composite image according to parameters such as ZOrder, transparency, size and position, and then hands the result to HWComposer or OpenGL to generate the final display buffer, which is then displayed on a specific display device.
As shown in FIG. 1, in soft decoding, the CPU decodes the video data and hands it to SurfaceFlinger for rendering and synthesis, while in hard decoding, the GPU decodes it and then hands it to SurfaceFlinger for rendering and synthesis. SurfaceFlinger calls the GPU to render and synthesize the images and display them on the display screen.
As an implementation, the image rendering process is shown in FIG. 2: the CPU obtains the video file to be played sent by the client and decodes it to obtain the decoded video data, which it sends to the GPU; after rendering, the GPU puts the rendering result into the frame buffer (FrameBuffer in FIG. 2), and the video controller then reads the data in the frame buffer line by line according to the HSync signal and passes it through digital-to-analog conversion to the display for display.
When the electronic device obtains the video file to be played, specifically, when the CPU obtains the video file to be played, it may choose to decode the video file itself and then send the decoded data to SurfaceFlinger for rendering and synthesis before display; alternatively, the CPU sends the video file to the GPU, the image processing circuits of the GPU decode it, and it is then sent to SurfaceFlinger for rendering and synthesis before display. At present, the CPU rendering mode is generally adopted by default, but since the CPU also needs to handle a large number of other operations, using it for video rendering invisibly wastes the electronic device's resources. If the GPU rendering mode were adopted by default, although GPU image processing can relieve the CPU's load, it occupies a large amount of memory; as a result, the current choice between the graphics processor and the central processor for processing video files is not reasonable enough.
因此,为了解决上述技术缺陷,如图3所示,本申请实施例提供了一种视频处理方法,应用于电子设备,该电子设备还包括中央处理器、屏幕和图形处理器,于本申请实施例中,以中央处理器为执行主体,则该方法包括:S301至S303。
S301:获取待播放视频文件的视频类型。
具体地,当电子设备的客户端播放视频的时候,电子设备能够获取欲播放的视频文件,然后再对视频文件解码,具体地,可以采用上述的软解码或者硬解码对视频文件解码,在解码之后就能够获取视频文件对应的待渲染的多帧图像数据,之后需要将多帧图像数据渲染之后才能够在显示屏上显示。
具体地,电子设备包括中央处理器和图形处理器。获取视频文件对应的待渲染的多帧图像数据的具体实施方式为:中央处理器获取客户端发送的待播放的视频文件。作为一种实施方式,中央处理器获取客户端发送的视频播放请求,该视频播放请求包括待播放的视频文件,具体地,可以是视频播放请求包括待播放的视频文件的身份信息,该身份信息可以是视频文件的名称,基于该视频文件的身份信息能够从存储该视频文件的存储空间内查找到该视频文件。
作为一种实施方式,客户端在播放视频的时候,会调用电子设备内的视频播放模块,以对该待播放视频文件解析和解码。客户端在系统桌面设有图标,用户点击该客户端的图标,能够将该客户端打开,例如,可以从用户点击的应用的包名来确认所打开的客户端,视频应用的包名可以由系统后台从代码中获取,包名格式为:com.android.video。
客户端的视频列表界面内显示有多个视频对应的显示内容,多个视频对应的显示内容包括每个视频对应的缩略图,该缩略图可以作为一个触摸按键使用,用户点击该缩略图,客户端能够检测到用户所想点击的缩略图,也就能够确定欲播放的待播放视频文件。
客户端响应用户在视频列表内选中的视频,进入视频的播放界面,点击该播放界面的播放按钮,客户端通过对用户触控操作的监听,能够检测到用户当前所点击的是什么类型的视频,具体地,播放按钮设置有预设属性,检测到所获取的触控操作对应的播放按钮的属性,就能够确定用户所选中的待播放视频。
例如,客户端为视频聊天客户端,点击视频聊天按钮之后,所产生的视频文件为待播放视频文件,则该待播放视频文件为语音聊天时产生的视频数据。再例如,客户端的视频类别界面内的每个视频都对应一个待播放视频文件。
则中央处理器检测当前的视频播放请求,该视频播放请求可以是用户触发上述的客户端设置与视频文件对应的播放按键时所产生的。则在获取到待播放视频文件之后,再确定待播放视频文件的视频类型。
其中,该视频类型可以是视频文件的格式,例如,mpg、mpeg、dat、mp4、3gp、mov、rm、ram、rmvb、wmv等。另外,该视频类型还可以根据视频文件是否为在线播放的视频而区分,例如,视频类型可以分为在线视频和离线视频,则在线视频为客户端在线播放的视频,而离线视频为客户端播放的预先存储在电子设备本地的视频文件。再者,该视频类型还可以是根据视频的分辨率而区分的,例如,标清、超清、高清等。例如,物理分辨率在1280×720以下的视频文件的类型是标清。
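上述几种视频类型的划分方式可以用如下Python草图示意(其中的类别名称与分辨率阈值仅沿用文中举例,并非限定的实现):

```python
def classify_by_format(filename):
    """按文件格式划分视频类型:取文件扩展名作为格式。"""
    return filename.rsplit(".", 1)[-1].lower()

def classify_by_source(is_online):
    """按是否在线播放划分:在线视频/离线视频。"""
    return "在线视频" if is_online else "离线视频"

def classify_by_resolution(width, height):
    """按分辨率划分:物理分辨率在1280×720以下记为标清(示意阈值)。"""
    return "标清" if width * height < 1280 * 720 else "高清或超清"
```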
S302:判断所述视频类型是否符合指定类型。
其中,指定类型为预先设定的、通过图形处理器处理的视频文件的视频类型。针对待播放视频文件的视频类型的设定标准不同,判断所述视频类型是否为指定类型的标准也不同。
作为一种实施方式,该待播放视频文件的视频类型可以是待播放视频文件的格式,则指定类型可以为指定视频文件格式,例如,可以是MP4等,则如果待播放视频文件的视频类型为MP4格式,判定视频类型为指定类型,否则,判定视频类型不为指定类型。
作为另一种实施方式,该待播放视频文件的视频类型可以是标清、超清、高清等根据视频的分辨率而划分的类型,则指定类型可以为指定分辨率,例如,可以是超清,则如果待播放视频文件的视频类型为超清,判定视频类型为指定类型,否则,判定视频类型不为指定类型。
作为又一种实施方式,该待播放视频文件的视频类型可以是在线视频或者离线视频,则指定类型可以是在线视频,则如果待播放视频文件的视频类型为在线视频,判定视频类型为指定类型,否则,判定视频类型不为指定类型。
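上述三种实施方式中"判断视频类型是否符合指定类型"的逻辑可以统一示意如下(Python草图,mode取值与各实施方式对应,字段名均为说明用的假设):

```python
def is_specified_type(video, mode):
    """按不同实施方式判断待播放视频文件的视频类型是否为指定类型。"""
    if mode == "format":      # 指定类型为指定视频文件格式,例如MP4
        return video["format"] == "mp4"
    if mode == "resolution":  # 指定类型为指定分辨率,例如超清
        return video["definition"] == "超清"
    if mode == "online":      # 指定类型为在线视频
        return video["online"]
    raise ValueError(mode)
```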
S303:控制所述图形处理器将所述待播放视频文件处理后在所述屏幕上显示。
若待播放视频文件的视频类型符合指定类型,则控制所述图形处理器将所述待播放视频文件处理后在所述屏幕上显示。若待播放视频文件的视频类型不为指定类型,则可以控制所述图形处理器将所述待播放视频文件处理后在所述屏幕上显示,也可以控制所述中央处理器将所述待播放视频文件处理后在所述屏幕上显示。也就是说,可以采用中央处理器对待播放视频文件解码,也可以采用图形处理器对待播放视频文件解码。
若待播放视频文件的视频类型为指定类型,则控制所述图形处理器将所述待播放视频文件处理后在所述屏幕上显示,即调用图形处理器对待播放视频文件进行解码。具体地,中央处理器将待播放视频文件发送至图形处理器,图形处理器在获取到待播放视频文件之后,对该待播放视频文件解码和渲染合成之后在屏幕上显示。
作为一种实施方式,图形处理器将待播放视频文件解码,从而获得待播放视频文件对应的多帧图像数据。具体地,中央处理器调用播放模块将待播放视频文件解析,从而获取该待播放视频文件对应的视频流和音频流。其中,播放模块可以是安卓系统内的MediaExtractor模块,也可以是FFmpeg模块,其中,FFmpeg模块是一个开源的跨平台的视频和音频流框架,属于自由软件,采用LGPL或GPL许可证(依据选择的组件)。它提供了录制、转换以及流化音视频的完整解决方案。它包含了丰富的音频/视频编解码库libavcodec。
然后,中央处理器将视频流发送至图形处理器,图形处理器将视频流解码之后,获取到该视频流对应的多帧图像数据,然后,对该多帧图像数据合成,具体地,合成方式可以是在放在图2所示的帧缓冲区内合成,即通过在屏渲染的方式对多帧图像数据渲染合成,也可以是通过离屏渲染的方式对多帧图像数据渲染合成。
作为一种实施方式,预先在GPU内设置一个离屏渲染缓冲区,具体地,GPU会调用渲染客户端模块对待渲染的多帧图像数据渲染合成之后发送至显示屏上显示,具体地,该渲染客户端模块可以是OpenGL模块。OpenGL渲染管线的最终位置是在帧缓冲区中。帧缓冲区是一系列二维的像素存储数组,包括了颜色缓冲区、深度缓冲区、模板缓冲区以及累积缓冲区。默认情况下OpenGL使用的是窗口系统提供的帧缓冲区。
OpenGL的GL_ARB_framebuffer_object这个扩展提供了一种方式来创建额外的帧缓冲区对象(Frame Buffer Object,FBO)。使用帧缓冲区对象,OpenGL可以将原先绘制到窗口提供的帧缓冲区重定向到FBO之中。
则通过FBO在帧缓冲区之外再设置一个缓冲区,即离屏渲染缓冲区。然后,将所获取的多帧图像数据存储至离屏渲染缓冲区。具体地,离屏渲染缓冲区可以是对应图形处理器的一个存储空间,即离屏渲染缓冲区本身没有用于存储图像的空间,而是与图形处理器内的一个存储空间映射之后,图像实际存储在离屏渲染缓冲区对应的图形处理器内的一个存储空间内。
将多帧图像数据与离屏渲染缓冲区绑定的方式,就能够将多帧图像数据存储至离屏渲染缓冲区,即在离屏渲染缓冲区能够查找到多帧图像数据。
则在将多帧图像数据存储至离屏渲染缓冲区之后,可以在离屏渲染缓冲区对多帧图像数据进行渲染,具体地,可以对多帧缓冲数据做显示增强的处理,例如,对所述离屏渲染缓冲区内的多帧图像数据的图像参数优化,其中,所述图像参数优化包括曝光度增强、去噪、边缘锐化、对比度增加或饱和度增加的至少一种。
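对离屏渲染缓冲区内的多帧图像数据做图像参数优化的过程,可以抽象为逐帧施加增强操作。下面用Python给出一个极简示意(真实实现应由GPU完成,这里的线性增益仅为假设值,用以近似模拟曝光度增强、饱和度增加等处理):

```python
def enhance_frame(pixels, gain=1.32):
    """示意:对一帧灰度像素做线性增强,并将结果限幅到255。"""
    return [min(255, int(p * gain)) for p in pixels]

def optimize_buffer(frames):
    """对离屏渲染缓冲区内的多帧图像数据逐帧做显示增强。"""
    return [enhance_frame(frame) for frame in frames]
```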
然后,将多帧图像数据发送至所述屏幕对应的帧缓冲区,其中,帧缓冲区对应于屏幕,用于存放需要在屏幕上显示的数据,例如图2所示的Framebuffer,Framebuffer是出现在操作系统内核当中的一种驱动程序接口。以安卓系统为例,Linux是工作在保护模式下,所以用户态进程是无法像DOS系统那样,使用显卡BIOS里提供的中断调用来实现直接将数据写入并在屏幕上显示,Linux抽象出Framebuffer这个设备来供用户进程实现直接将数据写入并在屏幕上显示。Framebuffer机制模仿显卡的功能,可以通过Framebuffer的读写直接对显存进行操作。具体地,可以将Framebuffer看成是显示内存的一个映像,将其映射到进程地址空间之后,就可以直接进行读写操作,而写入的数据可以在屏幕上显示。
则帧缓冲区可以看作是一个存放数据的空间,CPU或者GPU将要显示的数据放入该帧缓冲区,而Framebuffer本身不具备任何运算数据的能力,由视频控制器按照屏幕刷新频率读取Framebuffer内的数据在屏幕上显示。
由所述帧缓冲区内读取多帧图像数据,并在所述屏幕上显示。具体地,将多帧图像数据存入帧缓冲区内之后,图形处理器检测到帧缓冲区内写入数据之后,就由所述帧缓冲区内读取优化后的多帧图像数据,并在所述屏幕上显示。
作为一种实施方式,图形处理器会根据屏幕的刷新频率由所述帧缓冲区内逐帧读取多帧图像数据,并经渲染合成处理后在所述屏幕上显示。
另外,考虑到图形处理器处理图像的能力更优,处理速度更快、计算能力更强,针对一些实时性要求较高的视频文件,例如,在线视频或者视频聊天软件的实时聊天视频,可以采用图形处理器来处理。具体地,请参阅图4,本申请另一实施例提供了一种视频处理方法,应用于电子设备,该电子设备还包括中央处理器、屏幕和图形处理器,于本申请实施例中,以中央处理器为执行主体,则该方法包括:S401至S404。
S401:获取待播放视频文件的视频类型。
S402:根据所述视频类型确定所述待播放视频文件的实时性级别。
作为一种实施方式,待播放视频文件的视频类型包括在线视频和离线视频,则将在线视频的实时性级别设定为级别1,将离线视频的实时性级别设定为级别2,其中,级别1高于级别2。
当然,该视频类型还可以与应用程序的类型相对应,例如,应用程序包括视频软件、社交软件和游戏软件,则视频类型也可以包括视频软件的视频、社交软件的视频和游戏软件的视频。则根据该视频类型确定所述待播放视频文件的实时性级别的具体实施方式为:根据该待播放视频文件对应的应用程序的类别确定该待播放视频文件的实时性级别。具体地,确定待播放视频文件对应的应用程序的标识,再根据该应用程序的标识确定待播放视频文件的实时性级别,即先确定发送该待播放视频文件的播放请求的目标应用程序的标识,再确定该目标应用程序的标识所对应的应用程序的类型。
在获取到目标应用程序的标识之后,根据该标识确定目标应用程序的类型,其中,目标应用程序的标识可以是应用程序的包名、名称等。例如,电子设备内预先存储有应用程序的标识和应用程序的类别的对应关系。
例如,于本申请实施例中,游戏软件的游戏画面包括图像以及音乐,该音乐可以是游戏音乐、游戏音效等,例如,该游戏音效可以是枪声或者脚步声。则作为一种实施例,音频数据可以是游戏音效,该游戏音效对应的应用程序是某某游戏APP,所属的类别是游戏类型。然后根据应用程序的类型确定所述待播放视频数据的视频类型,具体地,可以将应用程序的类型作为待播放视频数据的视频类型,例如,应用程序的类型为游戏,则待播放视频数据的视频类型也为游戏。
上述应用程序的类别,可以是应用程序的开发商在开发的时候为应用程序设定的类别,也可以是应用程序在安装在电子设备上之后,用户为应用程序设定的类别,例如,用户在电子设备上安装某个应用程序,在安装完成并进入该应用程序之后,会显示一个对话框,指示用户为应用程序设定类别。则应用程序具体属于哪个类别,可以由用户根据需求而设定,例如,用户可以将某社交软件设置为音频类,或者设置为视频类,或者设置为社交类。
如果有些应用程序的功能多样化,则需要根据应用程序的具体操作行为而确定该应用程序的类别,例如,如果有些应用程序能够播放视频,也能够播放音频,例如一些视频播放软件,可以播放纯音频文件,也可以播放视频,则该应用程序的类别可以根据应用程序的使用记录而确定,即根据该应用程序的一定时间段内的使用记录,确定用户使用该应用程序是倾向于播放视频还是更倾向于播放音频。
具体地,获取该应用程序在预设时间段内的所有用户的操作行为数据,其中,所有用户是指安装过该应用程序的所有用户,则该操作行为数据可以由应用程序对应的服务器内获取。也就是说,用户在使用该应用程序的时候会使用用户对应的用户账号登录该应用程序,而用户账号对应的操作行为数据会发送至应用程序对应的服务器,则服务器将所获取的操作行为数据与用户账号对应存储。在一些实施例中,电子设备发送针对应用程序的操作行为查询请求至该应用程序对应的服务器,服务器将预设时间段内的所有用户的操作行为数据发送至电子设备。
该操作行为数据包括所播放的音频文件的名称和时间、以及所播放的视频文件的名称和时间,通过分析该操作行为数据就能够确定在一定预设时间段内该应用程序播放的音频文件的数量以及音频文件的播放总时长,也可以得到该应用程序播放的视频文件的数量以及视频文件的播放总时长,则根据音频文件和视频文件的播放总时长在该预定时间段内的占比,确定应用程序的类别,具体地,获取音频文件和视频文件的播放总时长在该预设时间段内的占比,为方便描述,将音频文件的播放总时长在该预设时间段内的占比记为音频播放占比,将视频文件的播放总时长在该预设时间段内的占比记为视频播放占比,如果视频播放占比大于音频播放占比,则将应用程序的类别设定为视频类型,如果音频播放占比大于视频播放占比,则将应用程序的类别设定为音频类型。例如,预设时间段为30天,即720小时,而音频文件的播放总时长为200小时,则音频播放占比为27.8%,视频文件的播放总时长为330小时,则视频播放占比为45.8%,则视频播放占比大于音频播放占比,则将应用程序的类别设定为视频类型。
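上述根据播放时长占比判定应用程序类别的计算,可以用如下Python草图验证文中的例子(预设时间段30天即720小时,音频播放总时长200小时,视频播放总时长330小时):

```python
def classify_app(audio_hours, video_hours, period_hours):
    """根据音频/视频播放总时长在预设时间段内的占比,确定应用程序的类别(示意)。"""
    audio_ratio = audio_hours / period_hours   # 音频播放占比
    video_ratio = video_hours / period_hours   # 视频播放占比
    category = "视频类型" if video_ratio > audio_ratio else "音频类型"
    # 返回百分比(保留一位小数)与判定出的类别
    return round(audio_ratio * 100, 1), round(video_ratio * 100, 1), category
```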
然后,再根据该应用程序的类型确定待播放视频文件对应的实时性级别。具体地,电子设备内存储有应用程序的类型所对应的实时性级别,如下表所示:
应用程序的标识 应用程序的类别 实时性级别
Apk1 游戏 J1
Apk2 视频 J2
Apk3 音频 J3
Apk4 社交 J1
从而就能够确定待播放视频文件对应的实时性级别。
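表中"应用程序的标识—应用程序的类别—实时性级别"的对应关系,可以用一个简单的映射示意(Python草图,表项与文中示例一致,查询函数名为说明用的假设):

```python
# 与上表对应的示例映射:应用程序的标识 -> (应用程序的类别, 实时性级别)
APP_LEVELS = {
    "Apk1": ("游戏", "J1"),
    "Apk2": ("视频", "J2"),
    "Apk3": ("音频", "J3"),
    "Apk4": ("社交", "J1"),
}

def realtime_level(app_id):
    """根据待播放视频文件对应的应用程序的标识,查询其实时性级别。"""
    return APP_LEVELS[app_id][1]
```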
S403:判断所述实时性级别是否高于指定级别。
其中,若高于指定级别,则判定所述视频类型为指定类型;若低于或等于指定级别,则判定所述视频类型不为指定类型。
则针对待播放视频文件的实时性级别不同的获取方式,而判断实时性级别是否高于指定级别的实施方式也不同。
则作为一种实施方式,在获取到待播放视频文件的实时性级别之后,判断所述待播放视频文件的实时性级别是否高于指定级别。其中,指定级别为预先设定的、需要减少播放延时的类型对应的实时性级别,可以是用户根据需求而设定的。例如,指定级别为J2及以上,则如果待播放视频文件对应的实时性级别为J1,待播放视频文件的实时性级别高于指定级别。其中,J1的级别高于J2,即在上述表中,J后面的数字越小则级别越高。
作为另一种实施方式,待播放视频文件的视频类型包括在线视频和离线视频,则将在线视频的实时性级别设定为级别1,将离线视频的实时性级别设定为级别2,其中,级别1高于级别2。则该指定级别可以是级别2,高于级别2的级别为级别1,即如果该待播放视频文件为在线视频,则判定该实时性级别高于指定级别;否则,判定该实时性级别低于或等于指定级别。
S404:控制所述图形处理器将所述待播放视频文件处理后在所述屏幕上显示。
需要说明的是,上述步骤中未详细描述的部分,可参考前述实施例,在此不再赘述。
另外,考虑到有一些视频文件虽然满足上述指定类型的条件,但是,其图像比较小,则采用中央处理器处理的时候,不会对中央处理器造成太多的负担,则不必一定采用图形处理器来处理,则可以结合图像大小来确定是否采用图形处理器来处理,具体地,请参阅图5,示出了本申请实施例提供的一种视频处理方法,该方法包括:S501至S506。
S501:获取待播放视频文件的视频类型。
S502:判断所述视频类型是否符合指定类型。
S503:获取所述待播放视频文件的图像大小。
其中,待播放视频文件的图像大小可以包括图像数据大小和图像尺寸。其中,图像数据大小记为视频文件的指定帧图像的数据大小,即指定帧图像所占的存储空间的大小,例如,指定帧图像的大小为1M,则图像数据大小为1M。其中,指定帧图像的数据大小可以是待播放视频文件所有帧图像的数据大小的算术值,其中,算术值可以是平均值、最小值或者最大值;也可以是待播放视频文件的第一帧图像的数据大小;还可以是待播放视频文件的所有关键帧图像的数据大小的平均值、最小值或者最大值。另外,考虑到待播放视频文件可能是在线视频文件,则待播放视频文件的指定帧图像的数据大小可以是,获取当前所播放的待播放视频文件的所有帧图像的数据大小的算术值。
则图像尺寸可以是待播放视频文件的物理分辨率,即待播放视频文件的图像分辨率。
S504:判断所述图像大小是否大于阈值。
则如果图像大小为图像数据大小,则判断所述图像大小是否大于阈值的具体实施方式为,判断所述图像数据大小是否大于阈值,如果大于阈值,则判定图像大小满足指定条件,则执行S505步骤,如果小于或等于阈值,则判定图像大小不满足指定条件,则不执行S505步骤,其中,阈值可以根据具体使用需求而设定,在此不再赘述。
则如果图像大小为图像尺寸,则判断所述图像大小是否满足指定条件的具体实施方式为,判断所述图像尺寸是否大于指定图像尺寸。若大于指定图像尺寸,则判定所述图像大小满足指定条件,则执行S505步骤。若小于或等于指定图像尺寸,则判定所述图像大小不满足指定条件,则不执行S505步骤。
其中,指定图像尺寸可以根据实际使用而设定,例如,图像尺寸可以是1280×720的分辨率。则如果待播放视频文件的图像尺寸大于1280×720的分辨率,则判定所述图像大小满足指定条件,则执行S505步骤。而如果待播放视频文件的图像尺寸小于或等于1280×720的分辨率,判定所述图像大小不满足指定条件,则不执行S505步骤。
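上述按图像尺寸判断是否大于指定图像尺寸的逻辑可以示意如下(Python草图,1280×720为文中举例的指定图像尺寸;以像素总数比较两个分辨率属于示意性假设):

```python
SPECIFIED_W, SPECIFIED_H = 1280, 720  # 文中举例的指定图像尺寸

def size_exceeds(width, height):
    """判断待播放视频文件的图像尺寸是否大于指定图像尺寸。"""
    return width * height > SPECIFIED_W * SPECIFIED_H
```

例如,1920×1080的图像尺寸大于指定图像尺寸,应执行S505;而1280×720及以下不大于指定图像尺寸,不执行S505。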
S505:控制所述图形处理器将所述待播放视频文件处理后在所述屏幕上显示。
则图形处理器在处理较大的视频文件的时候,处理优势更加明显,即处理速度相比中央处理器有较明显的提升,则针对图像数据较大的视频文件或者图像尺寸较大的视频文件,采用图形处理器处理更加合理。
S506:控制所述中央处理器将所述待播放视频文件处理后在所述屏幕上显示。
其中,中央处理器将所述待播放视频文件处理后在所述屏幕上显示的具体实施方式可以是,让CPU来对视频进行解码处理,解码之后再调用GPU对视频渲染合并之后在屏幕上显示。则可以采用在屏渲染,也可以采用离屏渲染对CPU解码之后得到的多帧图像数据渲染合成之后在屏幕上显示。
因此,在待播放视频文件的视频类型为指定类型但是图像大小不满足指定条件的情况下,控制所述中央处理器将所述待播放视频文件处理后在所述屏幕上显示,即使用中央处理器将所述待播放视频文件处理后在所述屏幕上显示。
而在待播放视频文件的视频类型不为指定类型时,可以自由选择处理器来处理。而作为一种实施方式,考虑到在待播放视频文件类型不为指定类型的时候,选择图形处理器处理视频文件的迫切性并不太高,则鉴于降低图形处理器所占用内存的考虑,可以在待播放视频文件的视频类型不为指定类型时,控制所述中央处理器将所述待播放视频文件处理后在所述屏幕上显示。具体地,请参阅图6,示出了本申请实施例提供的一种视频处理方法的方法流程图,该方法包括:S601至S604。
S601:获取待播放视频文件的视频类型。
S602:判断所述视频类型是否符合指定类型。
S603:控制所述图形处理器将所述待播放视频文件处理后在所述屏幕上显示。
S604:控制所述中央处理器将所述待播放视频文件处理后在所述屏幕上显示。
则如果视频类型为指定类型,则控制所述图形处理器将所述待播放视频文件处理后在所述屏幕上显示,如果视频类型不为指定类型,则控制所述中央处理器将所述待播放视频文件处理后在所述屏幕上显示,具体地,控制所述图形处理器将所述待播放视频文件处理后在所述屏幕上显示或控制所述中央处理器将所述待播放视频文件处理后在所述屏幕上显示的具体实施方式可参考前述实施例,在此不再赘述。
另外,考虑到在使用中央处理器播放视频文件的时候,由于中央处理器除了需要处理视频文件,还需要执行电子设备的其他操作指令,而各个应用程序都会占用中央处理器的一定资源,即占用CPU的使用率,则可能会导致CPU当前的负载过高,不适于处理视频文件。具体地,如果视频类型不为指定类型,则控制所述中央处理器将所述待播放视频文件处理后在所述屏幕上显示的一种实施方式可以是:若不为指定类型,则获取所述中央处理器的使用率;判断所述中央处理器的使用率是否小于使用率阈值;若小于使用率阈值,则控制所述中央处理器将所述待播放视频文件处理后在所述屏幕上显示;若大于或等于使用率阈值,则控制所述图形处理器将所述待播放视频文件处理后在所述屏幕上显示。
则具体地,中央处理器的使用率可以通过查看电子设备的任务管理器而获取,例如,在安卓系统下,通过adb shell top指令获取CPU的使用率。其中,使用率阈值可以是用户设定的使用率,例如,使用率阈值可以是60%,假设CPU的当前的使用率为40%,则40%小于60%,判定中央处理器的使用率小于使用率阈值,假如CPU的当前的使用率为70%,则70%大于60%,判定中央处理器的使用率大于使用率阈值。
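上述按使用率阈值选择处理器的判断可以用如下Python草图示意(60%为文中举例的使用率阈值):

```python
USAGE_THRESHOLD = 60  # 使用率阈值(%),文中举例为60%

def choose_processor(cpu_usage):
    """使用率小于阈值时用中央处理器处理待播放视频文件,否则交由图形处理器处理。"""
    return "CPU" if cpu_usage < USAGE_THRESHOLD else "GPU"
```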
而如果中央处理器的使用率小于使用率阈值,则表示CPU当前资源比较富裕,则可以使用CPU处理待播放视频文件,而如果中央处理器的使用率大于或等于使用率阈值,则表示CPU当前资源比较匮乏,则可以不使用CPU处理待播放视频文件。
另外,由于CPU的使用率是当前启动的各应用程序的使用率之和,则能够获取到当前每个所开启的应用程序的使用率,判断当前所开启的应用程序中是否存在与预设应用程序匹配的应用程序,其中,预设应用程序为允许系统在用户未授权的情况下将其关闭的应用程序。如果存在,则将与预设应用程序匹配的应用程序关闭,然后再获取CPU当前的使用率作为CPU的使用率,并返回执行判断所述中央处理器的使用率是否小于使用率阈值的操作。
具体地,电子设备内预先存储有预设应用程序的列表,在该预设应用程序的列表内包括多个指定应用程序的标识,其中,指定应用程序为用户授权的允许系统在用户未授权的情况下将应用程序关闭的应用程序,具体地,可以是用户手动输入该指定应用程序的标识。
则扫描当前系统进程中每个进程对应的应用程序以及每个进程的CPU使用率,并获取CPU当前的使用率,在所扫描到的所有应用程序中查找与预设应用程序匹配的应用程序,作为待处理应用程序,将待处理应用程序关闭并将待处理应用程序对应的进程杀死。然后,获取在待处理应用程序对应的进程杀死之后的CPU的使用率,作为更新使用率,将更新使用率作为CPU新的使用率,判断CPU新的使用率是否小于使用率阈值;若小于使用率阈值,则控制所述中央处理器将所述待播放视频文件处理后在所述屏幕上显示;若大于或等于使用率阈值,则控制所述图形处理器将所述待播放视频文件处理后在所述屏幕上显示。
因此,在CPU使用率过高的情况下,将允许系统在用户未授权的情况下将应用程序关闭的应用程序的进程杀死,从而释放一定的CPU资源,如果此时的CPU使用率小于使用率阈值,则使用中央处理器处理视频,如果依然大于或等于使用率阈值,则使用图形处理器处理视频。
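上述"关闭预设应用程序以释放CPU资源后再次判断"的流程,可以抽象为如下Python草图(其中以字典模拟各进程占用的CPU使用率,数据均为示意):

```python
def decide_after_cleanup(processes, preset_apps, threshold=60):
    """processes: {应用名: 该应用占用的CPU使用率(%)};
    使用率过高时先杀掉与预设应用程序匹配的进程,再用更新使用率重新判断。"""
    usage = sum(processes.values())
    if usage < threshold:
        return "CPU"
    # 将允许系统在用户未授权的情况下关闭的应用程序的进程杀死
    for app in list(processes):
        if app in preset_apps:
            del processes[app]
    usage = sum(processes.values())  # 更新使用率
    return "CPU" if usage < threshold else "GPU"
```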
请参阅图7,其示出了本申请实施例提供的一种视频处理装置的结构框图,该视频处理装置700包括:获取单元701、判断单元702和处理单元703。
获取单元701,用于获取待播放视频文件的视频类型。
判断单元702,用于判断所述视频类型是否符合指定类型。
处理单元703,用于若为指定类型,则控制所述图形处理器将所述待播放视频文件处理后在所述屏幕上显示。
所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,上述描述装置和模块的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
请参阅图8,其示出了本申请实施例提供的一种视频处理装置的结构框图,该视频处理装置800包括:获取单元801、判断单元802、处理单元803和显示单元804。
获取单元801,用于获取待播放视频文件的视频类型。
判断单元802,用于判断所述视频类型是否符合指定类型。
具体地,判断单元802还用于根据所述视频类型确定所述待播放视频文件的实时性级别;判断所述实时性级别是否高于指定级别;若高于指定级别,则判定所述视频类型为指定类型;若低于或等于指定级别,则判定所述视频类型不为指定类型。
处理单元803,用于若为指定类型,则控制所述图形处理器将所述待播放视频文件处理后在所述屏幕上显示。
具体地,处理单元803还用于获取所述待播放视频文件的图像大小;判断所述图像大小是否大于阈值;若大于阈值,则控制所述图形处理器将所述待播放视频文件处理后在所述屏幕上显示。
进一步的,处理单元803还用于若不大于阈值,则控制所述中央处理器将所述待播放视频文件处理后在所述屏幕上显示。
进一步的,处理单元803还用于判断所述图像分辨率是否大于指定分辨率;若大于指定分辨率,则判定所述图像大小大于阈值;若小于或等于指定分辨率,则判定所述图像大小小于或等于阈值。
显示单元804,用于若不为指定类型,则控制所述中央处理器将所述待播放视频文件处理后在所述屏幕上显示。
具体地,该显示单元804还用于若不为指定类型,则获取所述中央处理器的使用率;判断所述中央处理器的使用率是否小于指定数值;若小于指定数值,则控制所述中央处理器将所述待播放视频文件处理后在所述屏幕上显示;若大于或等于指定数值,则控制所述图形处理器将所述待播放视频文件处理后在所述屏幕上显示。
所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,上述描述装置和模块的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请所提供的几个实施例中,模块相互之间的耦合可以是电性,机械或其它形式的耦合。
另外,在本申请各个实施例中的各功能模块可以集成在一个处理模块中,也可以是各个模块单独物理存在,也可以两个或两个以上模块集成在一个模块中。上述集成的模块既可以采用硬件的形式实现,也可以采用软件功能模块的形式实现。
请参考图9,其示出了本申请实施例提供的一种电子设备的结构框图。该电子设备100可以是智能手机、平板电脑、电子书等能够运行客户端的电子设备。本申请中的电子设备100可以包括一个或多个如下部件:处理器110、存储器120、屏幕140以及一个或多个客户端,其中一个或多个客户端可以被存储在存储器120中并被配置为由一个或多个处理器110执行,一个或多个程序配置用于执行如前述方法实施例所描述的方法。
处理器110可以包括一个或者多个处理核。处理器110利用各种接口和线路连接整个电子设备100内的各个部分,通过运行或执行存储在存储器120内的指令、程序、代码集或指令集,以及调用存储在存储器120内的数据,执行电子设备100的各种功能和处理数据。可选地,处理器110可以采用数字信号处理(Digital Signal Processing,DSP)、现场可编程门阵列(Field-Programmable Gate Array,FPGA)、可编程逻辑阵列(Programmable Logic Array,PLA)中的至少一种硬件形式来实现。
具体地,处理器110可以包括中央处理器111(Central Processing Unit,CPU)、图像处理器112(Graphics Processing Unit,GPU)和调制解调器等中的一种或几种的组合。其中,CPU主要处理操作系统、用户界面和客户端等;GPU用于负责显示内容的渲染和绘制;调制解调器用于处理无线通信。可以理解的是,上述调制解调器也可以不集成到处理器110中,单独通过一块通信芯片进行实现。
存储器120可以包括随机存储器(Random Access Memory,RAM),也可以包括只读存储器(Read-Only Memory)。存储器120可用于存储指令、程序、代码、代码集或指令集。存储器120可包括存储程序区和存储数据区,其中,存储程序区可存储用于实现操作系统的指令、用于实现至少一个功能的指令(比如触控功能、声音播放功能、图像播放功能等)、用于实现下述各个方法实施例的指令等。存储数据区还可以存储电子设备100在使用中所创建的数据(比如电话本、音视频数据、聊天记录数据)等。
所述屏幕140用于显示由用户输入的信息、提供给用户的信息以及电子设备的各种图形用户接口,这些图形用户接口可以由图形、文本、图标、数字、视频及其任意组合来构成。在一个实例中,触摸屏可设置于所述显示面板上,从而与所述显示面板构成一个整体。
请参考图10,其示出了本申请实施例提供的一种计算机可读存储介质的结构框图。该计算机可读存储介质1000中存储有程序代码,所述程序代码可被处理器调用执行上述方法实施例中所描述的方法。
计算机可读存储介质1000可以是诸如闪存、EEPROM(电可擦除可编程只读存储器)、EPROM、硬盘或者ROM之类的电子存储器。可选地,计算机可读存储介质1000包括非易失性计算机可读介质(non-transitory computer-readable storage medium)。计算机可读存储介质1000具有执行上述方法中的任何方法步骤的程序代码1010的存储空间。这些程序代码可以从一个或者多个计算机程序产品中读出或者写入到这一个或者多个计算机程序产品中。程序代码1010可以例如以适当形式进行压缩。
最后应说明的是:以上实施例仅用以说明本申请的技术方案,而非对其限制;尽管参照前述实施例对本申请进行了详细的说明,本领域的普通技术人员当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分技术特征进行等同替换;而这些修改或者替换,并不驱使相应技术方案的本质脱离本申请各实施例技术方案的精神和范围。

Claims (20)

  1. 一种视频处理方法,其特征在于,应用于电子设备的中央处理器,所述电子设备还包括屏幕和图形处理器,所述方法包括:
    获取待播放视频文件的视频类型;
    判断所述视频类型是否符合指定类型;
    若符合指定类型,则控制所述图形处理器将所述待播放视频文件处理后在所述屏幕上显示。
  2. 根据权利要求1所述的方法,其特征在于,所述判断所述视频类型是否符合指定类型,包括:
    判断所述待播放视频文件的分辨率是否符合指定分辨率;
    若符合指定分辨率,则判定视频类型为指定类型;
    若不符合指定分辨率,则判定视频类型不为指定类型。
  3. 根据权利要求1所述的方法,其特征在于,所述判断所述视频类型是否符合指定类型,包括:
    判断所述待播放视频文件的视频类型是否为在线视频;
    若为在线视频,则判定视频类型为指定类型;
    若不为在线视频,则判定视频类型不为指定类型。
  4. 根据权利要求1所述的方法,其特征在于,所述判断所述视频类型是否为指定类型,包括:
    根据所述视频类型确定所述待播放视频文件的实时性级别;
    判断所述实时性级别是否高于指定级别;
    若高于指定级别,则判定所述视频类型符合指定类型;
    若低于或等于指定级别,则判定所述视频类型不符合指定类型。
  5. 根据权利要求4所述的方法,其特征在于,所述根据所述视频类型确定所述待播放视频文件的实时性级别,包括:
    根据所述待播放视频文件对应的应用程序的类别,确定所述待播放视频文件的实时性级别。
  6. 根据权利要求5所述的方法,其特征在于,所述根据所述待播放视频文件对应的应用程序的类别,确定所述待播放视频文件的实时性级别,包括:
    确定所述待播放视频文件对应的应用程序的标识;
    根据所述应用程序的标识确定所述待播放视频数据的实时性级别。
  7. 根据权利要求5或6所述的方法,其特征在于,所述根据所述待播放视频文件对应的应用程序的类别,包括:
    获取所述待播放视频文件对应的应用程序在预设时间段内的所有用户的操作行为数据,其中,所有用户包括安装过该应用程序的所有用户,所述操作行为数据包括所播放的音频文件的名称和时间、以及所述应用程序所播放的视频文件的名称和时间;
    根据所述操作行为数据,确定在预设时间段内所述应用程序播放的音频文件的数量以及所述音频文件的播放总时长,以及所述应用程序播放的视频文件的数量以及所述视频文件的播放总时长;
    获取所述音频文件的播放总时长在所述预定时间段内的占比作为音频播放占比,获取所述视频文件的播放总时长在所述预设时间段内的占比作为视频播放占比;
    根据所述音频播放占比和所述视频播放占比,确定应用程序的类别。
  8. 根据权利要求7所述的方法,其特征在于,所述获取所述待播放视频文件对应的应用程序在预设时间段内的所有用户的操作行为数据,包括:
    发送针对所述应用程序的操作行为查询请求至所述应用程序对应的服务器;
    获取所述服务器返回的所述应用程序在预设时间段内的所有用户的操作行为数据。
  9. 根据权利要求7所述的方法,其特征在于,所述根据所述音频播放占比和所述视频播放占比,确定应用程序的类别,包括:
    若所述视频播放占比大于所述音频播放占比,则将所述应用程序的类别设定为视频类型;
    若所述音频播放占比大于所述视频播放占比,则将所述应用程序的类别设定为音频类型。
  10. 根据权利要求1至9任一所述的方法,其特征在于,所述若符合指定类型,则控制所述图形处理器将所述待播放视频文件处理后在所述屏幕上显示,包括:
    若符合指定类型,则获取所述待播放视频文件的图像大小;
    判断所述图像大小是否大于阈值;
    若大于阈值,则控制所述图形处理器将所述待播放视频文件处理后在所述屏幕上显示。
  11. 根据权利要求10所述的方法,其特征在于,所述方法还包括:
    若小于或等于阈值,则控制所述中央处理器将所述待播放视频文件处理后在所述屏幕上显示。
  12. 根据权利要求10所述的方法,其特征在于,所述图像大小包括图像数据大小;所述判断所述图像大小是否大于阈值,包括:
    判断所述图像数据大小是否大于阈值;
    若所述图像数据大小大于阈值,则判定所述图像大小大于阈值;
    若所述图像数据大小小于或等于阈值,则判定所述图像大小小于或等于阈值。
  13. 根据权利要求10所述的方法,其特征在于,所述图像大小包括图像分辨率;所述判断所述图像大小是否大于阈值,包括:
    判断所述图像分辨率是否大于指定分辨率;
    若大于指定分辨率,则判定所述图像大小大于阈值;
    若小于或等于指定分辨率,则判定所述图像大小小于或等于阈值。
  14. 根据权利要求1所述的方法,其特征在于,还包括:
    若不为指定类型,则控制所述中央处理器将所述待播放视频文件处理后在所述屏幕上显示。
  15. 根据权利要求14所述的方法,其特征在于,所述若不为指定类型,则控制所述中央处理器将所述待播放视频文件处理后在所述屏幕上显示,包括:
    若不为指定类型,则获取所述中央处理器的使用率;
    判断所述中央处理器的使用率是否小于使用率阈值;
    若小于使用率阈值,则控制所述中央处理器将所述待播放视频文件处理后在所述屏幕上显示;
    若大于或等于使用率阈值,则控制所述图形处理器将所述待播放视频文件处理后在所述屏幕上显示。
  16. 根据权利要求14所述的方法,其特征在于,所述若不为指定类型,则控制所述中央处理器将所述待播放视频文件处理后在所述屏幕上显示,包括:
    若不为指定类型,则获取所述中央处理器的使用率;
    判断所述中央处理器的使用率是否小于使用率阈值;
    若小于使用率阈值,则判断当前所开启的应用程序中是否存在与预设应用程序匹配的应用程序,其中,预设应用程序为允许系统在用户未授权的情况下将应用程序关闭的应用程序;
    若存在,则将与预设应用程序匹配的应用程序关闭;
    获取所述中央处理器当前的使用率作为所述中央处理器的使用率,并返回执行判断所述中央处理器的使用率是否小于使用率阈值。
  17. 根据权利要求11-16任一所述的方法,其特征在于,所述控制所述中央处理器将所述待播放视频文件处理后在所述屏幕上显示,包括:
    控制所述中央处理器对所述待播放视频文件进行解码处理,获得解码后的多个视频帧;
    调用所述图形处理器对所述多个视频帧进行渲染合成后在所述屏幕上显示。
  18. 一种视频处理装置,其特征在于,应用于电子设备的中央处理器,所述电子设备还包括屏幕和图形处理器,所述视频处理装置包括:
    获取单元,用于获取待播放视频文件的视频类型;
    判断单元,用于判断所述视频类型是否符合指定类型;
    处理单元,用于若符合指定类型,则控制所述图形处理器将所述待播放视频文件处理后在所述屏幕上显示。
  19. 一种电子设备,其特征在于,包括:
    中央处理器和图形处理器;
    存储器;
    屏幕;
    一个或多个应用程序,其中所述一个或多个应用程序被存储在所述存储器中并被配置为由所述中央处理器执行,所述一个或多个程序配置用于执行如权利要求1-17任一项所述的方法。
  20. 一种计算机可读介质,其特征在于,所述计算机可读存储介质中存储有程序代码,所述程序代码可被处理器调用执行如权利要求1-17任一项所述的方法。
PCT/CN2019/110000 2018-11-27 2019-10-08 视频处理方法、装置、电子设备和计算机可读介质 WO2020108099A1 (zh)
