CN112954376A - Video playing method and display equipment - Google Patents

Video playing method and display equipment

Info

Publication number
CN112954376A
CN112954376A
Authority
CN
China
Prior art keywords
data
audio
current frame
video
video data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110154742.XA
Other languages
Chinese (zh)
Inventor
汪国健
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vidaa Netherlands International Holdings BV
Original Assignee
Hisense Electronic Technology Wuhan Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Electronic Technology Wuhan Co ltd
Priority to CN202110154742.XA
Publication of CN112954376A
Legal status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21 Server components or server architectures
    • H04N21/218 Source of audio or video content, e.g. local disk arrays
    • H04N21/2187 Live feed
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/439 Processing of audio elementary streams
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/44012 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving rendering scenes according to scene graphs, e.g. MPEG-4 scene graphs
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47202 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for requesting content on demand, e.g. video on demand

Abstract

The invention discloses a video playing method and a display device. When a viewing operation on a film source is received, first audio and video data of the film source are downloaded over a network, the first audio and video data are parsed to obtain second audio and video data, and the second audio and video data are added to a data buffer queue. Before the current frame is played, the audio and video data of the current frame are obtained from the data buffer queue, the current data amount of the data buffer queue is counted, and the target playback speed of the current frame is calculated from the current data amount. The video data and the audio data of the current frame are then played synchronously at the target speed. Because the target speed is acquired adaptively from the current data amount of the data buffer queue before each frame is played, playback and buffering stay in relative balance: first, smooth playback of the film source is guaranteed; second, excessive memory occupation is avoided; third, the video resolution does not need to be adjusted frequently, so the video keeps a uniform high definition and the user is given a better viewing experience.

Description

Video playing method and display equipment
Technical Field
The present invention relates to the field of display devices, and in particular, to a video playing method and a display device.
Background
A display device is generally provided with a multimedia application. A user starts the multimedia application and selects a film source to watch, and the display device downloads and plays the audio and video data of that film source; the fluency of video playback is one of the factors affecting the user's viewing experience. When the network jitters due to instability, video playback stutters. At present, adaptive streaming is mostly used to overcome this problem, i.e., a film source whose resolution matches the current network bandwidth is selected and downloaded adaptively. Under network jitter, however, the resolution of the video is switched frequently, for example a standard-definition mode (resolution 270P) when the network bandwidth is low and an ultra-definition mode (resolution 720P) when the network bandwidth is high, so the video is not played at a consistent resolution and the user's viewing experience suffers.
Disclosure of Invention
The invention provides a video playing method and a display device, which can adaptively adjust the video playback speed according to the network environment, ensure the fluency of video playback, avoid frequent resolution switching, and improve the user's viewing experience.
An embodiment of a first aspect of the present application provides a display device, including:
a display for playing video data of a film source;
a sound player for playing audio data of the film source;
a controller configured to perform:
when receiving a viewing operation on a film source, downloading first audio and video data of the film source through a network, parsing the first audio and video data to obtain second audio and video data, and adding the second audio and video data to a data buffer queue;
before playing the current frame, acquiring the audio and video data of the current frame from the data buffer queue, counting the current data amount of the data buffer queue, and calculating the target playback speed of the current frame according to the current data amount;
and controlling the display to play the video data of the current frame at the target speed, and controlling the sound player to synchronously play the audio data of the current frame at the target speed.
In some embodiments, the controller is configured to calculate the target playback speed of the current frame as follows: when the current data amount T is smaller than a lower threshold Tmin, the target speed S = T/Tmin.
In some embodiments, the controller is configured to calculate the target playback speed of the current frame as follows: when the current data amount T is larger than an upper threshold Tmax, the target speed S = T/Tmax.
In some embodiments, the controller is configured to calculate the target playback speed of the current frame as follows: when the lower threshold Tmin ≤ the current data amount T ≤ the upper threshold Tmax, the target speed S = 1.
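As a minimal illustration of these three branches (not part of the claimed subject matter; the function name and arguments are assumptions chosen for the sketch), the piecewise target-speed rule can be written as:

```python
def target_speed(current_amount: float, t_min: float, t_max: float) -> float:
    """Adaptive playback speed: slow down when the buffer is low,
    speed up when the buffer is high, play at 1x in between."""
    if current_amount < t_min:      # buffer running low -> slow playback (S < 1)
        return current_amount / t_min
    if current_amount > t_max:      # buffer piling up -> fast playback (S > 1)
        return current_amount / t_max
    return 1.0                      # balanced -> normal speed
```

For example, target_speed(1.0, 2.0, 4.0) returns 0.5, i.e. half-speed playback when only one second of content is buffered against a two-second lower threshold.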
In some embodiments, the controller is further configured to perform:
sorting the frames added to the data buffer queue in order of their display timestamps;
and after the audio and video data of the current frame are acquired, removing the audio and video data of the current frame from the data buffer queue.
An embodiment of a second aspect of the present application provides a video playing method, including:
when receiving a viewing operation on a film source, downloading first audio and video data of the film source through a network, parsing the first audio and video data to obtain second audio and video data, and adding the second audio and video data to a data buffer queue;
before playing the current frame, acquiring the audio and video data of the current frame from the data buffer queue, counting the current data amount of the data buffer queue, and calculating the target playback speed of the current frame according to the current data amount;
and synchronously playing the video data and the audio data of the current frame at the target speed.
According to the above technical solution, when a user clicks a film source to watch, the first audio and video data of the film source are downloaded through the network and parsed to obtain the second audio and video data, which contain multiple frames of audio and video data; the second audio and video data are added to a data buffer queue. Then, before each frame is played, the audio and video data of that frame are fetched from the data buffer queue and the current data amount of the queue is counted. The state of the network bandwidth can be inferred from the current data amount: if the network bandwidth is large, data are downloaded quickly and the amount of downloaded data buffered in the queue is large; conversely, if the network bandwidth is small, data are downloaded slowly and the amount of downloaded data buffered in the queue is naturally small. When downloading is slow, if the film source is still played at normal speed, the buffered amount may fail to keep up with the amount played, so the buffered data are exhausted quickly and the video picture freezes. When downloading is fast, if the film source is still played at normal speed, the amount buffered exceeds the amount played, so the cached data accumulate quickly and occupy a large amount of memory. Therefore, if a relative balance between the amount played and the amount buffered cannot be reached, video playback inevitably suffers from these defects. In the present application, before each frame is played, the target speed is acquired adaptively according to the current data amount of the data buffer queue, so that when the audio and video data of each frame are played at the target speed, playback and buffering reach a relative balance. In the first aspect, the fluency of film-source playback is guaranteed; in the second aspect, excessive memory occupation is avoided; in the third aspect, the resolution of video playback does not need to be adjusted frequently, so the video keeps a uniform high definition, providing the user with a better viewing experience and improving user experience and satisfaction.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention, and other drawings can be obtained by those skilled in the art from these drawings without creative effort.
FIG. 1 illustrates a usage scenario of a display device according to some embodiments;
fig. 2 illustrates a hardware configuration block diagram of the control apparatus 100 according to some embodiments;
fig. 3 illustrates a hardware configuration block diagram of the display apparatus 200 according to some embodiments;
FIG. 4 illustrates a software configuration diagram in the display device 200 according to some embodiments;
fig. 5 illustrates a flow chart of a video playing method.
Detailed Description
To make the purpose and embodiments of the present application clearer, the following will clearly and completely describe the exemplary embodiments of the present application with reference to the attached drawings in the exemplary embodiments of the present application, and it is obvious that the described exemplary embodiments are only a part of the embodiments of the present application, and not all of the embodiments.
It should be noted that the brief descriptions of the terms in the present application are only for the convenience of understanding the embodiments described below, and are not intended to limit the embodiments of the present application. These terms should be understood in their ordinary and customary meaning unless otherwise indicated.
The terms "first," "second," "third," and the like in the description and claims of this application and in the above-described drawings are used for distinguishing between similar or analogous objects or entities and not necessarily for describing a particular sequential or chronological order, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
Fig. 1 is a schematic diagram of a usage scenario of a display device according to an embodiment. As shown in fig. 1, the display apparatus 200 is also in data communication with a server 400, and a user can operate the display apparatus 200 through the smart device 300 or the control device 100.
In some embodiments, the control apparatus 100 may be a remote controller. Communication between the remote controller and the display device includes at least one of infrared protocol communication, Bluetooth protocol communication, or other short-distance communication methods, and the display device 200 is controlled in a wireless or wired manner. The user may control the display apparatus 200 by inputting user instructions through at least one of keys on the remote controller, voice input, control panel input, and the like.
In some embodiments, the smart device 300 may include any of a mobile terminal, a tablet, a computer, a laptop, an AR/VR device, and the like.
In some embodiments, the smart device 300 may also be used to control the display device 200. For example, the display device 200 is controlled using an application program running on the smart device.
In some embodiments, the smart device 300 and the display device may also be used for communication of data.
In some embodiments, the display device 200 may also be controlled in a manner other than the control apparatus 100 and the smart device 300, for example, the voice instruction control of the user may be directly received by a module configured inside the display device 200 to obtain a voice instruction, or may be received by a voice control apparatus provided outside the display device 200.
In some embodiments, the display device 200 is also in data communication with the server 400. The display device 200 may be communicatively connected through a Local Area Network (LAN), a Wireless Local Area Network (WLAN), or other networks. The server 400 may provide various contents and interactions to the display apparatus 200. The server 400 may be one cluster or a plurality of clusters, and may include one or more types of servers.
In some embodiments, software steps executed by one step execution agent may be migrated on demand to another step execution agent in data communication therewith for execution. Illustratively, software steps performed by the server may be migrated to be performed on a display device in data communication therewith, and vice versa, as desired.
Fig. 2 exemplarily shows a block diagram of a configuration of the control apparatus 100 according to an exemplary embodiment. As shown in fig. 2, the control device 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory, and a power supply. The control apparatus 100 may receive an input operation instruction from a user and convert the operation instruction into an instruction recognizable and responsive by the display device 200, serving as an interaction intermediary between the user and the display device 200.
In some embodiments, the communication interface 130 is used for external communication, and includes at least one of a WIFI chip, a bluetooth module, NFC, or an alternative module.
In some embodiments, the user input/output interface 140 includes at least one of a microphone, a touchpad, a sensor, a key, or an alternative module.
Fig. 3 shows a hardware configuration block diagram of the display apparatus 200 according to an exemplary embodiment.
In some embodiments, the display apparatus 200 includes at least one of a tuner demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, a user interface.
In some embodiments the controller comprises a central processor, a video processor, an audio processor, a graphics processor, a RAM, a ROM, a first interface to an nth interface for input/output.
In some embodiments, the display 260 includes a display screen component for displaying pictures and a driving component for driving image display; it receives image signals output from the controller and displays video content, image content, a menu manipulation interface, a user manipulation UI interface, and the like.
In some embodiments, the display 260 may be at least one of a liquid crystal display, an OLED display, and a projection display, and may also be a projection device and a projection screen.
In some embodiments, the tuner demodulator 210 receives broadcast television signals in a wired or wireless manner and demodulates audio/video signals and EPG data signals from a plurality of wireless or wired broadcast television signals.
In some embodiments, communicator 220 is a component for communicating with external devices or servers according to various communication protocol types. For example: the communicator may include at least one of a Wifi module, a bluetooth module, a wired ethernet module, and other network communication protocol chips or near field communication protocol chips, and an infrared receiver. The display apparatus 200 may establish transmission and reception of control signals and data signals with the control device 100 or the server 400 through the communicator 220.
In some embodiments, the detector 230 is used to collect signals of the external environment or interaction with the outside. For example, detector 230 includes a light receiver, a sensor for collecting ambient light intensity; alternatively, the detector 230 includes an image collector, such as a camera, which may be used to collect external environment scenes, attributes of the user, or user interaction gestures, or the detector 230 includes a sound collector, such as a microphone, which is used to receive external sounds.
In some embodiments, the external device interface 240 may include, but is not limited to, the following: high Definition Multimedia Interface (HDMI), analog or data high definition component input interface (component), composite video input interface (CVBS), USB input interface (USB), RGB port, and the like. The interface may be a composite input/output interface formed by the plurality of interfaces.
In some embodiments, the controller 250 and the modem 210 may be located in different separate devices, that is, the modem 210 may also be located in an external device of the main device where the controller 250 is located, such as an external set-top box.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored in memory. The controller 250 controls the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object to be displayed on the display 260, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the object may be any one of selectable objects, such as a hyperlink, an icon, or other actionable control. The operations related to the selected object are: displaying an operation connected to a hyperlink page, document, image, or the like, or performing an operation of a program corresponding to the icon.
In some embodiments, the controller comprises at least one of a Central Processing Unit (CPU), a video processor, an audio processor, a Graphics Processing Unit (GPU), a Random Access Memory (RAM), a Read-Only Memory (ROM), first to nth interfaces for input/output, a communication bus (Bus), and the like.
The CPU processor is used for executing operating system and application program instructions stored in the memory, and for executing various application programs, data, and contents according to various interactive instructions received from external input, so as to finally display and play various audio and video contents. The CPU processor may include a plurality of processors, for example, a main processor and one or more sub-processors.
In some embodiments, the graphics processor is used for generating various graphics objects, such as at least one of icons, operation menus, and graphics displayed for user input instructions. The graphics processor includes an arithmetic unit, which performs operations by receiving various interactive instructions input by the user and displays various objects according to their display attributes, and a renderer, which renders the objects obtained by the arithmetic unit so that they can be displayed on the display.
In some embodiments, the video processor is configured to receive an external video signal and perform at least one of video processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to the standard codec protocol of the input signal, so as to obtain a signal that can be displayed or played directly on the display device 200.
In some embodiments, the video processor includes at least one of a demultiplexing module, a video decoding module, an image composition module, a frame rate conversion module, a display formatting module, and the like. The demultiplexing module is used for demultiplexing the input audio and video data stream. And the video decoding module is used for processing the video signal after demultiplexing, including decoding, scaling and the like. And the image synthesis module is used for carrying out superposition mixing processing on the GUI signal input by the user or generated by the user and the video image after the zooming processing by the graphic generator so as to generate an image signal for display. And the frame rate conversion module is used for converting the frame rate of the input video. And the display formatting module is used for converting the received video output signal after the frame rate conversion, and changing the signal to be in accordance with the signal of the display format, such as an output RGB data signal.
In some embodiments, the audio processor is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform at least one of noise reduction, digital-to-analog conversion, and amplification processing to obtain a sound signal that can be played in the speaker.
In some embodiments, a user may enter user commands on a Graphical User Interface (GUI) displayed on display 260, and the user input interface receives the user input commands through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
In some embodiments, a "user interface" is a media interface for interaction and information exchange between an application or operating system and a user that enables conversion between an internal form of information and a form that is acceptable to the user. A commonly used presentation form of the User Interface is a Graphical User Interface (GUI), which refers to a User Interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include at least one of an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc. visual interface elements.
In some embodiments, user interface 280 is an interface that may be used to receive control inputs (e.g., physical buttons on the body of the display device, or the like).
In some embodiments, a system of a display device may include a Kernel (Kernel), a command parser (shell), a file system, and an application program. The kernel, shell, and file system together make up the basic operating system structure that allows users to manage files, run programs, and use the system. After power-on, the kernel is started, kernel space is activated, hardware is abstracted, hardware parameters are initialized, and virtual memory, a scheduler, signals and interprocess communication (IPC) are operated and maintained. And after the kernel is started, loading the Shell and the user application program. The application program is compiled into machine code after being started, and a process is formed.
As shown in fig. 4, the system of the display device is divided into three layers, i.e., an application layer, a middleware layer and a hardware layer from top to bottom.
The application layer mainly includes common applications on the television and an Application Framework, where the common applications are mainly applications developed based on the browser, such as HTML5 apps, as well as native apps;
an Application Framework (Application Framework) is a complete program model, and has all basic functions required by standard Application software, such as: file access, data exchange, and interfaces to use these functions (toolbars, status lists, menus, dialog boxes).
Native apps may support online or offline operation, message push, or local resource access.
The middleware layer comprises various television protocols, multimedia protocols, system components and other middleware. The middleware can use basic service (function) provided by system software to connect each part of an application system or different applications on a network, and can achieve the purposes of resource sharing and function sharing.
The hardware layer mainly comprises a HAL interface, hardware, and drivers. The HAL interface is a unified interface for adapting all television chips, and the specific logic is implemented by each chip. The drivers mainly include: audio driver, display driver, Bluetooth driver, camera driver, Wi-Fi driver, USB driver, HDMI driver, sensor drivers (such as fingerprint sensor, temperature sensor, pressure sensor, etc.), and power driver.
The above embodiments describe the hardware/software architecture and functional implementation of the display device. In some application scenarios, the display device is provided with a multimedia application that offers a large number of film sources for the user to watch. After starting the multimedia application, the user selects a film source to watch; the display device downloads the first audio and video data (not yet parsed) of the film source and then parses them to obtain second audio and video data, which contain the parsed audio and video data of each frame. The second audio and video data are added to a data buffer queue, where they are buffered while waiting for subsequent decoding, rendering, and display.
During parsing, information such as the frame rate (frames/second), the playing duration of each frame, the DTS (Decode Time Stamp), and the PTS (Presentation Time Stamp) can be acquired. When the time specified by the DTS is reached, the audio and video frame data are fetched from the queue and decoded; when the time specified by the PTS is reached, the frame is rendered and displayed, so the video can be shown on the display after decoding and rendering. The audio and video frame data are played continuously in this way according to the frame rate and the playing durations, thereby realizing playback of the film source.
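To make this decode-at-DTS / render-at-PTS timing concrete, the following Python sketch shows one simplified way such scheduling could be driven from a wall clock (the Frame structure and the decode/render callbacks are illustrative assumptions, not the player's actual interfaces):

```python
import time
from dataclasses import dataclass

@dataclass
class Frame:
    dts: float       # decode time stamp, in seconds from the start of playback
    pts: float       # presentation time stamp, in seconds from the start of playback
    duration: float  # playing duration of this frame, in seconds
    payload: bytes   # parsed (second) audio/video data of this frame

def play(frames, decode, render, clock=time.monotonic):
    """Decode each frame when its DTS is reached, render it when its PTS is reached."""
    start = clock()
    for frame in frames:                      # frames are assumed to arrive in DTS order
        while clock() - start < frame.dts:    # wait until the decode time
            time.sleep(0.001)
        decoded = decode(frame.payload)
        while clock() - start < frame.pts:    # wait until the presentation time
            time.sleep(0.001)
        render(decoded)
```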
In the course of research, the applicant found that in practical applications the network may jitter due to instability, i.e., the network bandwidth fluctuates. If the network bandwidth is large, data are downloaded quickly and the amount of downloaded data buffered in the queue is also large; conversely, if the network bandwidth is small, data are downloaded slowly and the amount of downloaded data buffered in the queue is naturally small. When downloading is slow, if the film source is still played at normal speed (i.e., 1x speed), the buffered amount may fail to keep up with the amount played, similar to a 'supply shortage', so the buffered data are exhausted quickly, the video picture freezes, and playback becomes unsmooth. When downloading is fast, if the film source is still played at normal speed, the amount buffered exceeds the amount played, similar to 'supply exceeding demand', so the cached data accumulate quickly and occupy a large amount of memory. When the playing end and the buffer reach a relative balance, the fluency problem of video playback can be better solved; like keeping the inflow and outflow of water in balance, the water level stays stable, neither too high nor too low.
On the basis of the conventional method for playing film-source audio and video, in some embodiments the video playing processing mechanism provided by the present application is as follows: before each frame is played, the current data amount T of the data buffer queue is counted and compared with a preset upper threshold Tmax and lower threshold Tmin in order to gauge the current network state. When the current data amount T is smaller than the lower threshold Tmin, the network bandwidth is low and the network speed is slow; to avoid unsmooth playback caused by the buffered data being exhausted quickly, the video playback speed is adaptively lowered (slow playback), i.e., the playback speed is less than 1. When the current data amount T is larger than the upper threshold Tmax, the network bandwidth is high and the network speed is fast; to avoid excessive memory occupation caused by the cached data accumulating quickly, the video playback speed is adaptively raised (fast playback), i.e., the playback speed is greater than 1. If the current data amount T lies within the interval [Tmin, Tmax], playback and buffering are kept in a relatively good balance, which is the ideal state, and the audio and video can be played at normal speed without slow-play or fast-play speed regulation, i.e., the playback speed equals 1.
In some embodiments, fig. 5 shows a video playback method, the execution subject of which is a controller 250 in a display device, the method comprising the program steps of:
step S10, when receiving the viewing operation on the film source, downloading the first audio/video data of the film source through the network, analyzing the first audio/video data to obtain second audio/video data, and adding the second audio/video data into the data buffer queue.
Step S20, before playing the current frame, obtaining the audio/video data of the current frame from the data buffer queue, counting the current data amount of the data buffer queue, and calculating the target multiple speed of the current frame according to the current data amount.
In some embodiments, ideally the data buffer queue buffers T1 seconds' worth of data, where T1 is an optimal reference duration selected according to the actual situation; such an amount of data can meet the fluency requirement of film-source playback while the memory occupation is not too large. The data amount in T1 seconds is T1 * framerate, where framerate is the frame rate in frames/second; for example, when T1 = 2 seconds and the frame rate is 20 frames/second, the reference duration corresponds to 40 frames of data. Since the frame rate of a film source is generally fixed, the amount of time is linear with the amount of data, and thus in this embodiment the amount of time can be used to measure the amount of data.
In some embodiments, the upper threshold Tmax and the lower threshold Tmin may be set based on the reference duration T1: the lower threshold Tmin = a * T1 and the upper threshold Tmax = b * T1, where a is a lower-limit proportionality coefficient and b is an upper-limit proportionality coefficient; a and b may be the same or different. Once T1, a, and b are selected according to the practical application, the upper threshold Tmax and the lower threshold Tmin are thereby preset.
In some embodiments, before each frame is played, the current data amount T of the data buffer queue is counted as T = Σ duration, where duration is the playing duration of each frame; that is, by cumulatively summing the playing durations of the frames in the data buffer queue, it can be determined for how long the data buffered in the queue can sustain playback of the film source. By comparing the current data amount T with the upper threshold Tmax and the lower threshold Tmin, the network speed condition is gauged, so that the playback speed of the film-source audio and video, i.e., the target speed, is adjusted adaptively. The speed is a multiple of the normal speed; for example, 2x speed means fast playback at twice the normal speed, and 0.5x speed means slow playback at half the normal speed.
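As a small, non-normative illustration of this bookkeeping (the concrete values of T1, a, and b and the queue representation are assumptions chosen for the example), the thresholds and the current data amount could be computed as:

```python
# Reference duration T1 (seconds of buffered content considered ideal) and the
# lower-/upper-limit proportionality coefficients -- example values only.
T1 = 2.0            # e.g. 2 s, i.e. 40 frames at 20 frames/second
a, b = 0.5, 2.0     # assumed coefficients; chosen per application in practice
t_min = a * T1      # lower threshold Tmin
t_max = b * T1      # upper threshold Tmax

def current_data_amount(buffer_queue):
    """Current data amount T = sum of the playing durations (seconds)
    of the frames still held in the data buffer queue."""
    return sum(frame.duration for frame in buffer_queue)
```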
In some embodiments, referring to fig. 5, in step S201, when the current data amount T is smaller than the lower threshold Tmin, the target speed S = T/Tmin. In this case S < 1, and the playback speed of the film-source video and audio is adaptively lowered according to the network and queue buffering state; that is, slow playback brings the playing and buffering of data back into balance, ensuring the fluency of film-source playback while keeping the memory occupation reasonable.
In some embodiments, referring to fig. 5, in step S202, when the lower threshold Tmin ≤ the current data amount T ≤ the upper threshold Tmax, the target speed S = 1. In this case, the playing and buffering of data are kept in a good balance, so the film-source audio and video can be played directly at normal speed without fast-play or slow-play speed regulation.
In some embodiments, referring to fig. 5, in step S203, when the current data amount T is larger than the upper threshold Tmax, the target speed S = T/Tmax. In this case S > 1, and the playback speed of the film-source audio and video is adaptively raised according to the network and queue buffering state; that is, fast playback brings the playing and buffering of data back into relative balance, avoiding excessive memory occupation while ensuring the fluency of film-source playback.
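Putting steps S201-S203 together, a per-frame control step along these lines might look like the sketch below (illustrative only; play_video and play_audio stand in for the display and sound-player interfaces, which are not specified here):

```python
def play_current_frame(frame, buffer_queue, t_min, t_max, play_video, play_audio):
    """Before playing a frame: measure the buffer, pick the target speed,
    then play the frame's video and audio synchronously at that speed."""
    t = sum(f.duration for f in buffer_queue)   # current data amount, in seconds
    if t < t_min:
        speed = t / t_min                       # S < 1: slow playback (step S201)
    elif t > t_max:
        speed = t / t_max                       # S > 1: fast playback (step S203)
    else:
        speed = 1.0                             # balanced: normal speed (step S202)
    play_video(frame, speed)                    # display plays the video data at S
    play_audio(frame, speed)                    # sound player plays the audio data at S
    return speed
```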
Step S30, the display is controlled to play the video data of the current frame at the target speed, and the sound player is controlled to synchronously play the audio data of the current frame at the target speed. Because the video data and the audio data of the current frame are played synchronously at the same target speed, audio-video synchronization is ensured and the user is given the best possible film-source viewing experience.
In some embodiments, when the second audio and video data are added to the data buffer queue, each frame in the second audio and video data has a corresponding display timestamp (i.e., PTS), so the frames added to the data buffer queue may be sorted in chronological order of PTS; for example, the earlier a frame's PTS, the earlier it is placed in the queue, so that the audio and video frame data can be fetched and played in display-time order. The sorting manner of the data buffer queue is not limited here.
In some embodiments, after the current frame to be played is determined according to the time indications of the PTS and the DTS, the audio and video data of the current frame are obtained in advance from the data buffer queue before the frame is played, and are decoded, rendered, and otherwise processed; at the moment specified by the PTS, the audio and video data of the current frame are played synchronously at the same target speed. After the audio and video data of the current frame have been obtained, they can be removed from the data buffer queue, so that frame data that have already been played do not remain in the queue and memory consumption is reduced. The management manner of the data buffer queue may be set according to the practical application and is not limited by the embodiments of the present application.
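One possible, purely hypothetical realization of such a PTS-ordered buffer queue, from which a frame is removed once its audio and video data have been fetched, is sketched below using a heap keyed on the presentation timestamp:

```python
import heapq

class DataBufferQueue:
    """Buffer queue that serves frames in order of presentation timestamp (PTS)."""

    def __init__(self):
        self._heap = []   # (pts, sequence, frame) tuples kept as a min-heap
        self._seq = 0     # tie-breaker so frames with equal PTS keep arrival order

    def push(self, frame):
        """Add a parsed frame; frames with earlier PTS are served first."""
        heapq.heappush(self._heap, (frame.pts, self._seq, frame))
        self._seq += 1

    def pop_current(self):
        """Fetch the next frame to play and remove it from the queue."""
        _, _, frame = heapq.heappop(self._heap)
        return frame

    def data_amount(self):
        """Seconds of content currently buffered (sum of frame durations)."""
        return sum(frame.duration for _, _, frame in self._heap)
```

A frame object is assumed to expose pts and duration attributes, as in the earlier sketches.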
According to the above technical solution, before each frame is played, the target speed is acquired adaptively according to the current data amount of the data buffer queue. When the playback speed of each frame of audio and video data is adjusted according to the target speed, data playing and caching reach a relative balance: in the first aspect, the fluency of film-source playback is guaranteed; in the second aspect, excessive memory occupation is avoided; in the third aspect, the resolution of video playback does not need to be adjusted frequently, so the video keeps a uniform high definition, providing the user with a better viewing experience and improving user experience and satisfaction.
Those skilled in the art will readily appreciate that the techniques of the embodiments of the present invention may be implemented as software plus a necessary general-purpose hardware platform. In a specific implementation, the invention also provides a computer storage medium that can store a program. When the computer storage medium is located in a display device, the program, when executed, may include the program steps of the video playing method that the controller 250 is configured to perform. The computer storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), or a Random Access Memory (RAM).
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (10)

1. A display device, comprising:
a display for playing video data of the film source;
the sound player is used for playing the audio data of the film source;
a controller configured to perform:
when receiving a viewing operation on a film source, downloading first audio and video data of the film source through a network, parsing the first audio and video data to obtain second audio and video data, and adding the second audio and video data to a data buffer queue;
before playing the current frame, acquiring the audio and video data of the current frame from the data buffer queue, counting the current data amount of the data buffer queue, and calculating the target playback speed of the current frame according to the current data amount;
and controlling the display to play the video data of the current frame at the target speed, and controlling the sound player to synchronously play the audio data of the current frame at the target speed.
2. The display device according to claim 1, wherein the controller is configured to calculate the target playback speed of the current frame as follows:
when the current data amount T is smaller than a lower threshold Tmin, the target speed S = T/Tmin.
3. The display device according to claim 1, wherein the controller is configured to calculate the target playback speed of the current frame as follows:
when the current data amount T is larger than an upper threshold Tmax, the target speed S = T/Tmax.
4. The display device according to claim 1, wherein the controller is configured to calculate the target playback speed of the current frame as follows:
when the lower threshold Tmin ≤ the current data amount T ≤ the upper threshold Tmax, the target speed S = 1.
5. The display device according to claim 1, wherein the controller is further configured to perform:
sorting the frames added to the data buffer queue in order of their display timestamps;
and after the audio and video data of the current frame are acquired, removing the audio and video data of the current frame from the data buffer queue.
6. A video playback method in a display device, comprising:
when receiving a viewing operation on a film source, downloading first audio and video data of the film source through a network, parsing the first audio and video data to obtain second audio and video data, and adding the second audio and video data to a data buffer queue;
before playing the current frame, acquiring the audio and video data of the current frame from the data buffer queue, counting the current data amount of the data buffer queue, and calculating the target playback speed of the current frame according to the current data amount;
and synchronously playing the video data and the audio data of the current frame at the target speed.
7. The method of claim 6, wherein the target playback speed of the current frame is calculated as follows:
when the current data amount T is smaller than a lower threshold Tmin, the target speed S = T/Tmin.
8. The method of claim 6, wherein the target playback speed of the current frame is calculated as follows:
when the current data amount T is larger than an upper threshold Tmax, the target speed S = T/Tmax.
9. The method of claim 6, wherein the target playback speed of the current frame is calculated as follows:
when the lower threshold Tmin ≤ the current data amount T ≤ the upper threshold Tmax, the target speed S = 1.
10. The method of claim 6, further comprising:
sorting the frames added to the data buffer queue in order of their display timestamps;
and after the audio and video data of the current frame are acquired, removing the audio and video data of the current frame from the data buffer queue.
CN202110154742.XA 2021-02-04 2021-02-04 Video playing method and display equipment Pending CN112954376A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110154742.XA CN112954376A (en) 2021-02-04 2021-02-04 Video playing method and display equipment

Publications (1)

Publication Number Publication Date
CN112954376A true CN112954376A (en) 2021-06-11

Family

ID=76243831

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110154742.XA Pending CN112954376A (en) 2021-02-04 2021-02-04 Video playing method and display equipment

Country Status (1)

Country Link
CN (1) CN112954376A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102271280A (en) * 2011-07-20 2011-12-07 宝利微电子系统控股公司 Method and apparatus for variable speed playing of digital audio and video
CN109963184A (en) * 2017-12-14 2019-07-02 阿里巴巴集团控股有限公司 A kind of method, apparatus and electronic equipment of audio-video network broadcasting
CN108810656A (en) * 2018-06-12 2018-11-13 深圳国微视安科技有限公司 A kind of the debounce processing method and processing system of real-time live broadcast TS streams
CN110248204A (en) * 2019-07-16 2019-09-17 广州虎牙科技有限公司 A kind of processing method, device, equipment and the storage medium of live streaming caching
CN111918093A (en) * 2020-08-13 2020-11-10 腾讯科技(深圳)有限公司 Live broadcast data processing method and device, computer equipment and storage medium
CN112153446A (en) * 2020-09-27 2020-12-29 海信视像科技股份有限公司 Display equipment and streaming media video audio-video synchronization method

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113453052A (en) * 2021-06-22 2021-09-28 海信视像科技股份有限公司 Sound and picture synchronization method and display device
CN113453052B (en) * 2021-06-22 2022-11-08 Vidaa(荷兰)国际控股有限公司 Sound and picture synchronization method and display device
CN114245180A (en) * 2022-01-04 2022-03-25 海信视像科技股份有限公司 Display device, video data transmission method, and storage medium
CN115022667A (en) * 2022-06-28 2022-09-06 武汉斗鱼鱼乐网络科技有限公司 Live broadcast control method and related equipment

Similar Documents

Publication Publication Date Title
CN114302219B (en) Display equipment and variable frame rate display method
CN112954376A (en) Video playing method and display equipment
CN112153446B (en) Display device and streaming media video audio and video synchronization method
WO2020098504A1 (en) Video switching control method and display device
CN111836104B (en) Display apparatus and display method
CN112153406A (en) Live broadcast data generation method, display equipment and server
CN113542851A (en) Menu refreshing method and display device
WO2023077991A1 (en) Display device and display method thereof
CN113630654A (en) Display device and media resource pushing method
CN113111214A (en) Display method and display equipment for playing records
CN114095769A (en) Live broadcast low-delay processing method of application-level player and display equipment
CN112601109A (en) Audio playing method and display device
CN112203154A (en) Display device
CN112272331A (en) Method for rapidly displaying program channel list and display equipment
CN113453052B (en) Sound and picture synchronization method and display device
CN113453069B (en) Display device and thumbnail generation method
CN113038221B (en) Double-channel video playing method and display equipment
CN113350781B (en) Display device and game mode switching method
CN112135173B (en) Method for improving play-starting code rate of streaming media and display equipment
CN113709557B (en) Audio output control method and display device
CN115119029B (en) Display equipment and display control method
CN114040258B (en) Display method and display equipment for switching digital television program from time shift to recording
CN115174991B (en) Display equipment and video playing method
CN116939295A (en) Display equipment and method for dynamically adjusting utilization rate of controller
CN117119234A (en) Display equipment and media asset playing method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20221024

Address after: 83 Intekte Street, Devon, Netherlands

Applicant after: VIDAA (Netherlands) International Holdings Ltd.

Address before: No.02-2, 13th floor, building B2, phase 4.1, software industry, No.1, East Road, Donghu New Technology Development Zone, Wuhan, Hubei Province

Applicant before: Hisense Electronic Technology (Wuhan) Co.,Ltd.

RJ01 Rejection of invention patent application after publication

Application publication date: 20210611