CN113378092A - Video playing management method and display equipment - Google Patents

Video playing management method and display equipment

Info

Publication number
CN113378092A
Authority
CN
China
Prior art keywords
video
playing
video object
display
label page
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110691835.6A
Other languages
Chinese (zh)
Inventor
温佳乐
罗一龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Electronic Technology Shenzhen Co ltd
Original Assignee
Hisense Electronic Technology Shenzhen Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Electronic Technology Shenzhen Co ltd filed Critical Hisense Electronic Technology Shenzhen Co ltd
Priority to CN202110691835.6A priority Critical patent/CN113378092A/en
Publication of CN113378092A publication Critical patent/CN113378092A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/95 Retrieval from the web
    • G06F16/958 Organisation or management of web site content, e.g. publishing, maintaining pages or automatic linking
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/95 Retrieval from the web
    • G06F16/957 Browsing optimisation, e.g. caching or content distillation

Landscapes

  • Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The application discloses a video playing management method and a display device. In response to a specified tab page being opened, the specified tab page is parsed to obtain each video element present in the specified tab page, a corresponding video object is created for each video element, and the video resources required for video playing are created for each video object. When the specified tab page is switched to the background, the current playing state of the video corresponding to each video object is recorded and the video resources corresponding to each video object are released. In this way, the playing of videos in the specified tab page is managed, the problem of the browser freezing because video playing occupies too many resources is reduced, and the black screen caused by directly releasing the resources occupied by video playing is avoided.

Description

Video playing management method and display equipment
Technical Field
The present application relates to the field of computer technologies, and in particular, to a video playing management method and a display device.
Background
In the related art, after a video in a tab page is opened in a browser, the video that is playing in that tab page is not paused when the user switches to another tab page or moves the browser to the background. As a result, when multiple tab pages are open in the browser, video playing occupies a large amount of resources, which can cause the browser to freeze and stop responding to operations. On the other hand, when the resources occupied by video playing are large, directly releasing those resources when the tab page is switched to another tab page or to the background easily causes a black screen.
Disclosure of Invention
The application discloses a video playing management method and a display device, which are used to avoid the browser freezing because video playing occupies too many resources, and to avoid the black screen caused by directly releasing the resources occupied by video playing.
According to a first aspect of embodiments of the present application, there is provided a display apparatus including:
a display;
a controller coupled with the display and configured to:
in response to a specified tab page being opened, parse the specified tab page to obtain each video element present in the specified tab page, create a corresponding video object for each video element, and create the video resources required for video playing for each video object;
and in response to the specified tab page being switched to the background, record the current playing state of the video corresponding to each video object, and release the video resources corresponding to each video object.
According to a second aspect of the embodiments of the present application, there is provided a video playback management method, including:
in response to a specified tab page being opened, parsing the specified tab page to obtain each video element present in the specified tab page, creating a corresponding video object for each video element, and creating the video resources required for video playing for each video object;
and in response to the specified tab page being switched to the background, recording the current playing state of the video corresponding to each video object, and releasing the video resources corresponding to each video object.
The technical solutions provided by the embodiments of the present application can have the following beneficial effects:
According to the above technical solutions, in the solution provided by the application, in response to a specified tab page being opened, the specified tab page is parsed to obtain each video element present in it, a corresponding video object is created for each video element, and the video resources required for video playing are created for each video object. When the specified tab page is switched to the background, the current playing state of the video corresponding to each video object is recorded and the video resources corresponding to each video object are released. In this way, the playing of videos in the specified tab page is managed, the problem of the browser freezing because video playing occupies too many resources is reduced, and the black screen caused by directly releasing the resources occupied by video playing is avoided.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments consistent with the present application and together with the application, serve to explain the principles of the application.
Fig. 1A is a schematic diagram illustrating an operation scenario between the display device 200 and the control apparatus 100;
fig. 1B is a block diagram schematically illustrating a configuration of the control apparatus 100 in fig. 1A;
fig. 1C is a block diagram schematically illustrating a configuration of the display device 200 in fig. 1A;
FIG. 1D is a block diagram illustrating an architectural configuration of an operating system in memory of display device 200;
FIG. 2 is a schematic diagram of a process for responding to opening a designated tab page according to an embodiment of the present application;
FIG. 3 is a schematic flowchart illustrating a process for switching the designated tab page to the background according to an embodiment of the present application;
FIG. 4 is a schematic flow chart illustrating a process for responding to a switch of a designated tab page from a background to a foreground according to an embodiment of the present application;
FIG. 5 is a page diagram illustrating a single video appearing on a single tab page in accordance with an embodiment of the present application;
FIG. 6 is a schematic diagram of a single tab page with multiple video's appearing therein according to an embodiment of the present application;
fig. 7 is a schematic diagram illustrating a relationship between playing states of video according to an embodiment of the present application.
Detailed Description
To make the objects, embodiments and advantages of the present application clearer, the exemplary embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. It is to be understood that the described exemplary embodiments are only a part of the embodiments of the present application, not all of them.
All other embodiments obtained by a person skilled in the art from the exemplary embodiments described herein without inventive effort fall within the scope of protection of the present application. In addition, while the disclosure herein is presented in terms of one or more exemplary examples, it should be appreciated that each aspect of the disclosure may also constitute a complete embodiment on its own.
It should be noted that the brief descriptions of the terms in the present application are only for the convenience of understanding the embodiments described below, and are not intended to limit the embodiments of the present application. These terms should be understood in their ordinary and customary meaning unless otherwise indicated.
The terms "first," "second," "third," and the like in this application are used for distinguishing between similar or analogous objects or entities and are not necessarily meant to define a particular order or sequence Unless otherwise noted (Unless thermally induced). It is to be understood that the terms so used are interchangeable under appropriate circumstances such that the embodiments described herein are, for example, capable of operation in sequences other than those illustrated or otherwise described herein.
Furthermore, the terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or device that comprises a list of elements is not necessarily limited to those elements explicitly listed, but may include other elements not expressly listed or inherent to such product or device.
In order to facilitate understanding of the embodiments provided in the present application, the following first describes the structure of the display device and the interaction process between the display device and the control device:
fig. 1A is a schematic diagram illustrating an operation scenario between the display device 200 and the control apparatus 100. As shown in fig. 1A, the control apparatus 100 and the display device 200 may communicate with each other in a wired or wireless manner.
The control apparatus 100 is configured to control the display device 200: it receives an operation instruction input by a user and converts the operation instruction into an instruction that the display device 200 can recognize and respond to, serving as an intermediary between the user and the display device 200. For example, the user operates the channel up/down key on the control apparatus 100, and the display device 200 responds to the channel up/down operation.
The control apparatus 100 may be a remote controller 100A, which controls the display device 200 wirelessly or by wire through infrared protocol communication, Bluetooth protocol communication, or other short-distance communication methods. The user may input user instructions through keys on the remote controller, voice input, control panel input, etc., to control the display device 200. For example, the user can input corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, power on/off key, etc. on the remote controller, to control the display device 200.
The control device 100 may also be an intelligent device, such as a mobile terminal 100B, a tablet computer, a notebook computer, and the like. For example, the display device 200 is controlled using an application program running on the smart device. The application program may provide various controls to a user through an intuitive User Interface (UI) on a screen associated with the smart device through configuration.
For example, the mobile terminal 100B may install a software application with the display device 200 to implement connection communication through a network communication protocol for the purpose of one-to-one control operation and data communication. Such as: the mobile terminal 100B may be caused to establish a control instruction protocol with the display device 200 to implement the functions of the physical keys as arranged in the remote control 100A by operating various function keys or virtual buttons of the user interface provided on the mobile terminal 100B. The audio and video content displayed on the mobile terminal 100B may also be transmitted to the display device 200, so as to implement a synchronous display function.
The display device 200 may be implemented as a television, and may provide a smart network television function that combines a broadcast receiving function with computer support functions. Examples of the display device include a digital television, a web television, a smart television, an Internet Protocol Television (IPTV), and the like.
The display device 200 may be a liquid crystal display, an organic light-emitting display, or a projection display device. The specific display device type, size, resolution, etc. are not limited.
The display apparatus 200 also performs data communication with the server 300 through various communication means. Here, the display apparatus 200 may be allowed to be communicatively connected through a Local Area Network (LAN), a Wireless Local Area Network (WLAN), and other networks. The server 300 may provide various contents and interactions to the display apparatus 200. By way of example, the display device 200 may send and receive information such as: receiving Electronic Program Guide (EPG) data, receiving software program updates, or accessing a remotely stored digital media library. The servers 300 may be a group or groups of servers, and may be one or more types of servers. Other web service contents such as a video on demand and an advertisement service are provided through the server 300.
Fig. 1B is a block diagram illustrating the configuration of the control device 100. As shown in fig. 1B, the control device 100 includes a controller 110, a memory 120, a communicator 130, a user input interface 140, an output interface 150, and a power supply 160.
The controller 110 includes a Random Access Memory (RAM)111, a Read Only Memory (ROM)112, a processor 113, a communication interface, and a communication bus. The controller 110 is used to control the operation of the control device 100, as well as the internal components of the communication cooperation, external and internal data processing functions.
Illustratively, when an interaction of a user pressing a key disposed on the remote controller 100A or an interaction of touching a touch panel disposed on the remote controller 100A is detected, the controller 110 may control to generate a signal corresponding to the detected interaction and transmit the signal to the display device 200.
And a memory 120 for storing various operation programs, data and applications for driving and controlling the control apparatus 100 under the control of the controller 110. The memory 120 may store various control signal commands input by a user.
The communicator 130 enables communication of control signals and data signals with the display device 200 under the control of the controller 110. For example, the control apparatus 100 transmits a control signal (e.g., a touch signal or a button signal) to the display device 200 via the communicator 130, and the control apparatus 100 may receive signals transmitted by the display device 200 via the communicator 130. The communicator 130 may include an infrared signal interface 131 and a radio frequency signal interface 132. For example, when the infrared signal interface is used, a user input instruction needs to be converted into an infrared control signal according to the infrared control protocol, which is then sent to the display device 200 through the infrared sending module. As another example, when the radio frequency signal interface is used, a user input instruction needs to be converted into a digital signal, modulated according to the radio frequency control signal modulation protocol, and then transmitted to the display device 200 through the radio frequency transmitting terminal.
The user input interface 140 may include at least one of a microphone 141, a touch pad 142, a sensor 143, a key 144, and the like, so that a user can input a user instruction regarding controlling the display apparatus 200 to the control apparatus 100 through voice, touch, gesture, press, and the like.
The output interface 150 outputs a user instruction received by the user input interface 140 to the display apparatus 200, or outputs an image or voice signal received by the display apparatus 200. Here, the output interface 150 may include an LED interface 151, a vibration interface 152 generating vibration, a sound output interface 153 outputting sound, a display 154 outputting an image, and the like. For example, the remote controller 100A may receive an output signal such as audio, video, or data from the output interface 150, and display the output signal in the form of an image on the display 154, in the form of audio on the sound output interface 153, or in the form of vibration on the vibration interface 152.
And a power supply 160, which provides operating power support for each element of the control apparatus 100 under the control of the controller 110, and may take the form of a battery and associated control circuitry.
A hardware configuration block diagram of the display device 200 is exemplarily illustrated in fig. 1C. As shown in fig. 1C, the display apparatus 200 may further include a tuner demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a memory 260, a user interface 265, a video processor 270, a display 275, an audio processor 280, an audio input interface 285, and a power supply 290.
The tuner demodulator 210 receives the broadcast television signal in a wired or wireless manner, may perform modulation and demodulation processing such as amplification, mixing, and resonance, and is configured to demodulate, from a plurality of wireless or wired broadcast television signals, an audio/video signal carried in a frequency of a television channel selected by a user, and additional information (e.g., EPG data).
The tuner demodulator 210 is responsive to the user selected frequency of the television channel and the television signal carried by the frequency, as selected by the user and controlled by the controller 250.
The tuner demodulator 210 can receive a television signal in various ways according to the broadcasting system of the television signal, such as: terrestrial broadcasting, cable broadcasting, satellite broadcasting, internet broadcasting, or the like; and according to different modulation types, a digital modulation mode or an analog modulation mode can be adopted; and can demodulate the analog signal and the digital signal according to the different kinds of the received television signals.
In other exemplary embodiments, the tuning demodulator 210 may also be in an external device, such as an external set-top box. In this way, the set-top box outputs a television signal after modulation and demodulation, and inputs the television signal into the display apparatus 200 through the external device interface 240.
The communicator 220 is a component for communicating with an external device or an external server according to various communication protocol types. For example, the display apparatus 200 may transmit content data to an external apparatus connected via the communicator 220, or browse and download content data from an external apparatus connected via the communicator 220. The communicator 220 may include a network communication protocol module or a near field communication protocol module, such as a WIFI module 221, a bluetooth communication protocol module 222, and a wired ethernet communication protocol module 223, so that the communicator 220 may receive a control signal of the control device 100 according to the control of the controller 250 and implement the control signal as a WIFI signal, a bluetooth signal, a radio frequency signal, and the like.
The detector 230 is a component of the display apparatus 200 for collecting signals of an external environment or interaction with the outside. The detector 230 may include an image collector 231, such as a camera, a video camera, etc., which may be used to collect external environment scenes to adaptively change the display parameters of the display device 200; and the function of acquiring the attribute of the user or interacting gestures with the user so as to realize the interaction between the display equipment and the user. A light receiver 232 may also be included to collect ambient light intensity to adapt to changes in display parameters of the display device 200, etc.
In some other exemplary embodiments, the detector 230 may further include a temperature sensor, such as by sensing an ambient temperature, and the display device 200 may adaptively adjust a display color temperature of the image. For example, when the temperature is higher, the display apparatus 200 may be adjusted to display a color temperature of an image that is cooler; when the temperature is lower, the display device 200 may be adjusted to display a warmer color temperature of the image.
In some other exemplary embodiments, the detector 230, which may further include a sound collector, such as a microphone, may be configured to receive a sound of a user, such as a voice signal of a control instruction of the user to control the display device 200; alternatively, ambient sounds may be collected that identify the type of ambient scene, enabling the display device 200 to adapt to ambient noise.
The external device interface 240 is a component via which the controller 250 controls data transmission between the display device 200 and external apparatuses. The external device interface 240 may be connected to external apparatuses such as a set-top box, a game device, or a notebook computer in a wired/wireless manner, and may receive data such as a video signal (e.g., moving images), an audio signal (e.g., music), and additional information (e.g., EPG data) from the external apparatus.
The external device interface 240 may include: a High Definition Multimedia Interface (HDMI) terminal 241, a Composite Video Blanking Sync (CVBS) terminal 242, an analog or digital Component terminal 243, a Universal Serial Bus (USB) terminal 244, a Component terminal (not shown), a red, green, blue (RGB) terminal (not shown), and the like.
The controller 250 controls the operation of the display device 200 and responds to the operation of the user by running various software control programs (such as an operating system and various application programs) stored on the memory 260.
As shown in fig. 1C, the controller 250 includes a Random Access Memory (RAM)251, a Read Only Memory (ROM)252, a graphics processor 253, a CPU processor 254, a communication interface 255, and a communication bus 256. The RAM251, the ROM252, the graphic processor 253, and the CPU processor 254 are connected to each other through a communication bus 256 through a communication interface 255.
The ROM252 stores various system boot instructions. When the display apparatus 200 starts power-on upon receiving the power-on signal, the CPU processor 254 executes a system boot instruction in the ROM252, copies the operating system stored in the memory 260 to the RAM251, and starts running the boot operating system. After the start of the operating system is completed, the CPU processor 254 copies the various application programs in the memory 260 to the RAM251 and then starts running and starting the various application programs.
A graphic processor 253 for generating screen images of various graphic objects such as icons, images, and operation menus. The graphic processor 253 may include an operator for performing an operation by receiving various interactive instructions input by a user, and further displaying various objects according to display attributes; and a renderer for generating various objects based on the operator and displaying the rendered result on the display 275.
A CPU processor 254 for executing operating system and application program instructions stored in memory 260. And according to the received user input instruction, processing of various application programs, data and contents is executed so as to finally display and play various audio-video contents.
In some example embodiments, the CPU processor 254 may comprise a plurality of processors. The plurality of processors may include one main processor and a plurality of or one sub-processor. A main processor for performing some initialization operations of the display apparatus 200 in the display apparatus preload mode and/or operations of displaying a screen in the normal mode. A plurality of or one sub-processor for performing an operation in a state of a standby mode or the like of the display apparatus.
The communication interface 255 may include a first interface to an nth interface. These interfaces may be network interfaces that are connected to external devices via a network.
The controller 250 may control the overall operation of the display apparatus 200. For example: in response to receiving a user input command for selecting a GUI object displayed on the display 275, the controller 250 may perform an operation related to the object selected by the user input command.
Where the object may be any one of the selectable objects, such as a hyperlink or an icon. The operation related to the selected object is, for example, an operation of displaying a link to a hyperlink page, document, image, or the like, or an operation of executing a program corresponding to an icon. The user input command for selecting the GUI object may be a command input through various input means (e.g., a mouse, a keyboard, a touch panel, etc.) connected to the display apparatus 200 or a voice command corresponding to a user uttering voice.
A memory 260 for storing various types of data, software programs, or applications for driving and controlling the operation of the display device 200. The memory 260 may include volatile and/or nonvolatile memory. And the term "memory" includes the memory 260, the RAM251 and the ROM252 of the controller 250, or a memory card in the display device 200.
In some embodiments, the memory 260 is specifically used for storing an operating program for driving the controller 250 of the display device 200; storing various application programs built in the display apparatus 200 and downloaded by a user from an external apparatus; data such as visual effect images for configuring various GUIs provided by the display 275, various objects related to the GUIs, and selectors for selecting GUI objects are stored.
In some embodiments, the memory 260 is specifically configured to store drivers and related data for the tuner demodulator 210, the communicator 220, the detector 230, the external device interface 240, the video processor 270, the display 275, the audio processor 280, and the like, external data (e.g., audio-visual data) received from the external device interface, or user data (e.g., key information, voice information, touch information, and the like) received from the user interface.
In some embodiments, memory 260 specifically stores software and/or programs representing an Operating System (OS), which may include, for example: a kernel, middleware, an Application Programming Interface (API), and/or an application program. Illustratively, the kernel may control or manage system resources, as well as functions implemented by other programs (e.g., the middleware, APIs, or applications); at the same time, the kernel may provide an interface to allow middleware, APIs, or applications to access the controller to enable control or management of system resources.
A block diagram of the architectural configuration of the operating system in the memory of the display device 200 is illustrated in fig. 1D. The operating system architecture comprises an application layer, a middleware layer and a kernel layer from top to bottom.
The application layer, the application programs built in the system and the non-system-level application programs belong to the application layer and are responsible for direct interaction with users. The application layer may include a plurality of applications such as NETFLIX applications, setup applications, media center applications, and the like. These applications may be implemented as Web applications that execute based on a WebKit engine, and in particular may be developed and executed based on HTML, Cascading Style Sheets (CSS), and JavaScript.
Here, HTML (HyperText Markup Language) is the standard markup language for creating web pages; it describes web pages by markup tags, where the HTML tags are used to describe characters, graphics, animation, sound, tables, links, etc. A browser reads an HTML document, interprets the content of the tags in the document, and displays it in the form of a web page.
CSS (Cascading Style Sheets) is a computer language used to express the style of HTML documents, and may be used to define style structures such as fonts, colors and positions. A CSS style can be stored directly in the HTML web page or in a separate style file, so that the styles in the web page can be controlled.
JavaScript is a language applied to Web page programming; it can be inserted into an HTML page and is interpreted and executed by the browser. The interaction logic of a Web application is implemented in JavaScript. JavaScript can wrap a JavaScript extension interface through the browser to communicate with the kernel layer.
the middleware layer may provide some standardized interfaces to support the operation of various environments and systems. For example, the middleware layer may be implemented as multimedia and hypermedia information coding experts group (MHEG) middleware related to data broadcasting, DLNA middleware which is middleware related to communication with an external device, middleware which provides a browser environment in which each application program in the display device operates, and the like.
The kernel layer provides core system services, such as: file management, memory management, process management, network management, system security authority management and the like. The kernel layer may be implemented as a kernel based on various operating systems, for example, a kernel based on the Linux operating system.
The kernel layer also provides communication between system software and hardware, and provides device driver services for various hardware, such as: provide display driver for the display, provide camera driver for the camera, provide button driver for the remote controller, provide wiFi driver for the WIFI module, provide audio driver for audio output interface, provide power management drive for Power Management (PM) module etc..
A user interface 265 receives various user interactions. Specifically, it is used to transmit an input signal of a user to the controller 250 or transmit an output signal from the controller 250 to the user. For example, the remote controller 100A may transmit an input signal, such as a power switch signal, a channel selection signal, a volume adjustment signal, etc., input by the user to the user interface 265, and then the input signal is transferred to the controller 250 through the user interface 265; alternatively, the remote controller 100A may receive an output signal such as audio, video, or data output from the user interface 265 via the controller 250, and display the received output signal or output the received output signal in audio or vibration form.
In some embodiments, a user may enter user commands on a Graphical User Interface (GUI) displayed on the display 275, and the user interface 265 receives the user input commands through the GUI. Specifically, the user interface 265 may receive user input commands for controlling the position of a selector in the GUI to select different objects or items.
Alternatively, the user may input a user command by inputting a specific sound or gesture, and the user interface 265 receives the user input command by recognizing the sound or gesture through the sensor.
The video processor 270 is configured to receive an external video signal, and perform video data processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to a standard codec protocol of the input signal, so as to obtain a video signal that is directly displayed or played on the display 275.
Illustratively, the video processor 270 includes a demultiplexing module, a video decoding module, an image synthesizing module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module is configured to demultiplex an input audio/video data stream, where, for example, an input MPEG-2 stream (based on a compression standard of a digital storage media moving image and voice), the demultiplexing module demultiplexes the input audio/video data stream into a video signal and an audio signal.
And the video decoding module is used for processing the video signal after demultiplexing, including decoding, scaling and the like.
And the image synthesis module is used for carrying out superposition mixing processing on the GUI signal input by the user or generated by the user and the video image after the zooming processing by the graphic generator so as to generate an image signal for display.
The frame rate conversion module is configured to convert a frame rate of an input video, for example, convert a frame rate of an input 60Hz video into a frame rate of 120Hz or 240Hz, where a common format is implemented by using, for example, an interpolation frame method.
And a display formatting module for converting the signal output by the frame rate conversion module into a signal conforming to a display format of a display, such as converting the format of the signal output by the frame rate conversion module to output an RGB data signal.
And a display 275 for receiving the image signal output from the video processor 270 and displaying video, images and menu manipulation interfaces. For example, the display may display video from a broadcast signal received by the tuner demodulator 210, video input from the communicator 220 or the external device interface 240, and images stored in the memory 260. The display 275 also displays the user manipulation interface (UI) generated in the display device 200 and used to control the display device 200.
And, the display 275 may include a display screen assembly for presenting a picture and a driving assembly for driving the display of an image. Alternatively, a projection device and projection screen may be included, provided display 275 is a projection display.
The audio processor 280 is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform audio data processing such as noise reduction, digital-to-analog conversion, and amplification processing to obtain an audio signal that can be played by the speaker 286.
Illustratively, audio processor 280 may support various audio formats. Such as MPEG-2, MPEG-4, Advanced Audio Coding (AAC), high efficiency AAC (HE-AAC), and the like.
Audio output interface 285 receives audio signals output from audio processor 280. For example, the audio output interface may output audio in a broadcast signal received via the tuner demodulator 210, audio input via the communicator 220 or the external device interface 240, and audio stored in the memory 260. The audio output interface 285 may include a speaker 286, or an external audio output terminal 287 that outputs to a sound-producing device of an external apparatus, such as an earphone output terminal.
In other exemplary embodiments, video processor 270 may comprise one or more chips. Audio processor 280 may also comprise one or more chips.
And, in other exemplary embodiments, the video processor 270 and the audio processor 280 may be separate chips or may be integrated with the controller 250 in one or more chips.
And a power supply 290 for supplying power supply support to the display apparatus 200 from the power input from the external power source under the control of the controller 250. The power supply 290 may be a built-in power supply circuit installed inside the display apparatus 200 or may be a power supply installed outside the display apparatus 200.
For ease of understanding of the embodiments provided in the present application, the technical solutions in the embodiments of the present application are further described in detail below with reference to the accompanying drawings. As an embodiment, the video playing management method provided by the present application may be divided into at least two flows. Referring to fig. 2, fig. 2 is a flowchart of responding to opening a specified tab page in the video playing management method provided by the embodiment of the present application. As an embodiment, the flow shown in fig. 2 may be applied to the display device described above.
As shown in fig. 2, the process may include the following steps:
step 201, in response to opening the specified tab page, parsing the specified tab page to obtain each video element existing in the specified tab page.
As an embodiment, the specified tab page in step 201 may be any tab page that can be opened in the browser. Upon detecting an instruction to open the specified tab page, each video element present in the specified tab page can be found by parsing its source code.
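Purely as an illustration, a minimal sketch of this parsing step is given below in TypeScript, assuming the browser engine exposes the parsed DOM of the specified tab page; the VideoElementInfo shape and the function name are illustrative assumptions, not part of the disclosed implementation.

```typescript
// Hypothetical sketch: collecting the video elements of a newly opened tab page.
// The DOM API calls are standard; the VideoElementInfo shape is an assumption.
interface VideoElementInfo {
  element: HTMLVideoElement; // the <video> element found in the page
  src: string;               // its playing address (URL)
}

function collectVideoElements(tabDocument: Document): VideoElementInfo[] {
  // Walk the parsed page and pick out every <video> element it contains.
  const videos = Array.from(tabDocument.querySelectorAll("video"));
  return videos.map((element) => ({ element, src: element.currentSrc || element.src }));
}
```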
In the embodiment of the present application, video refers to a video played in the specified tab page. One or more videos may appear in a single tab page at the same time.
Optionally, a page in which a single video appears in a single tab page may be as shown in the browser interface of FIG. 5. As shown in fig. 5, three tab pages, tab1, tab2 and tab3, are open in the browser, and a video playing window exists in each of them; the page of tab1 is currently displayed in the foreground, while tab2 and tab3 have been switched to the background.
Optionally, a page in which multiple videos appear in a single tab page may be as shown in the browser interface of fig. 6. The playing of the multiple videos in tab1 shown in fig. 6 can generally be controlled manually, so that only one video plays at a time in a single tab page even when multiple videos appear in that tab page.
At step 202, a corresponding video object is created for each video element.
Optionally, after a corresponding video object is created for each video element, a corresponding video identifier (denoted as the id of the video object) may be allocated to each video object, so as to record the registration of each video object by its video identifier; the video identifier corresponding to each video is different. Meanwhile, in order to manage the playing of videos in different tab pages, the correspondence between each video identifier and the specified tab page can further be recorded.
In a specific implementation, in order to facilitate management of the video identifier corresponding to each video object, an id Manager may be created in the program. The id Manager acts as a state machine (denoted as the first state machine) that records the life cycle of video objects, that is, their creation and logout. Through the id Manager, when the creation of a video object is detected, an id is automatically allocated to it, and when the logout of a video object is detected, the corresponding id is revoked at the same time.
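A minimal sketch of such an id Manager is given below for illustration only; the class name, the per-tab bookkeeping and the method names are assumptions rather than the actual implementation.

```typescript
// Hypothetical sketch of the id Manager (first state machine): it tracks the
// life cycle of each video object by handing out an id on creation and
// revoking it on logout, and remembers which tab page each id belongs to.
class IdManager {
  private nextId = 0;
  private liveIds = new Map<number, string>(); // video object id -> owning tab page

  // Called when a video object is created: allocate a unique id and record
  // the correspondence between the id and the specified tab page.
  register(tabPageId: string): number {
    const id = this.nextId++;
    this.liveIds.set(id, tabPageId);
    return id;
  }

  // Called when a video object is logged out: revoke its id at the same time.
  unregister(id: number): void {
    this.liveIds.delete(id);
  }

  // Look up every video object id registered under a given tab page.
  idsForTab(tabPageId: string): number[] {
    return [...this.liveIds].filter(([, tab]) => tab === tabPageId).map(([id]) => id);
  }
}
```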
Step 203, a video resource required for the video playing is created for each video object.
Optionally, the video resources may include a video backend (media backend) and a display resource required for video playing. The video backend is used to provide the system resources that realize video playing, and the display resource is used to display the pictures played by the video. The creation of the video backend and the display resource may be implemented in code; refer to the related art for details, which are not repeated here.
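The following sketch illustrates, under assumed names (MediaBackend, DisplayResource, createMediaBackend and createDisplayResource are all hypothetical), how the video resources of step 203 might be created for one video object.

```typescript
// Hypothetical sketch of creating the video resources for one video object.
// The MediaBackend and DisplayResource types stand in for whatever the
// browser engine actually provides; their names are assumptions.
interface MediaBackend { release(): void; }
interface DisplayResource { release(): void; }

interface VideoObject {
  id: number;
  src: string;
  backend?: MediaBackend;    // system resources that drive decoding and playback
  display?: DisplayResource; // surface on which the decoded picture is shown
}

function createVideoResources(
  video: VideoObject,
  engine: { createMediaBackend(src: string): MediaBackend; createDisplayResource(): DisplayResource }
): void {
  video.backend = engine.createMediaBackend(video.src); // media backend for this video
  video.display = engine.createDisplayResource();       // display resource for its picture
}
```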
As an embodiment, the logout of a video object may be triggered when the playing state of the video corresponding to that video object is recognized as the end state: the video object is logged out, the video resources corresponding to it are released, and the correspondence between the video object and its video identifier is deleted.
Thus, the flow shown in fig. 2 is completed. With the embodiment shown in fig. 2, a video object can be created for each video element in the opened tab page, and the video resources required for playing can be created for each video object, so that the playing of videos in the tab page is managed per video object.
Referring to fig. 3, fig. 3 is a flowchart of responding to switching the designated tab to the background in the video playing management method according to the embodiment of the present application. As an embodiment, the flow shown in fig. 3 may be applied to the display apparatus as described above together with the flow shown in fig. 2.
As shown in fig. 3, the process may include the following steps:
step 301, in response to switching the designated tab page to the background, recording the current playing state of the video corresponding to each video object.
As an embodiment, after each video object is created, a state machine (denoted as a second state machine) for recording the video playing status may be created to record the video playing status corresponding to each video object.
The specific playing states of the video stored in the second state machine in this embodiment are described in detail below when the playing states of the video are introduced, and are not repeated here.
Optionally, if an instruction to switch the specified tab page to the background is received, the page of the specified tab page is hidden into the background in response to the instruction. During this process, the current playing state of the video corresponding to each video object in the specified tab page may be recorded in the second state machine, and then the playing of all videos in the specified tab page is paused.
Step 302, releasing the video resource corresponding to each video object.
In this embodiment of the present application, after the playing of all videos in the specified tab page has been paused, the video resources corresponding to each video object in the specified tab page may be released, for example the media backend and the display resource that were created for the video object when it was created, so as to reduce the resources occupied by the playing of each video in the specified tab page.
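A combined sketch of steps 301 and 302 is given below for illustration; the ManagedVideo interface and the map used as the second state machine are assumptions about how an implementation might be organised, not the disclosed one.

```typescript
// Hypothetical sketch of the background-switch handling in steps 301 and 302:
// record each video's current playing state in the second state machine,
// pause it, then release its media backend and display resource.
type PlayState = "Idle" | "Initialized" | "Loaded" | "Playing" | "Paused" | "Finished" | "Stopped" | "Error";

interface ManagedVideo {
  id: number;
  pause(): void;             // pause playback of this video
  releaseResources(): void;  // free its media backend and display resource
  currentState(): PlayState;
}

const secondStateMachine = new Map<number, PlayState>(); // video object id -> recorded state

function onTabSwitchedToBackground(videos: ManagedVideo[]): void {
  for (const video of videos) {
    secondStateMachine.set(video.id, video.currentState()); // step 301: record the current playing state
    video.pause();                                           // pause playing before freeing anything
    video.releaseResources();                                // step 302: release backend and display resource
  }
}
```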
It should be noted that switching the specified tab page to the background may mean switching the page displayed in the browser from the specified tab page to another tab page, or switching the browser itself to the background. When the specified tab page is switched to another tab page and that tab page is a new one, the new tab page is parsed, that is, the process shown in fig. 2 is repeated to create a video object for each video element in the new tab page and play the videos in the new tab page.
The flow shown in fig. 3 is completed.
It should be noted that the creation of the video backend (media backend) occupies system resources. In the related art, when the specified tab page is switched to the background, the media backend corresponding to video playing is not released, so those resources remain occupied; and if the resources were simply released directly when the tab page is switched to the background, playing problems, such as a black screen in the playing window, would easily be caused.
Therefore, with the embodiment shown in fig. 3, in response to the specified tab page being switched to the background, the video resources corresponding to each video object in the specified tab page are released, which reduces the system resources occupied in the background by the videos in the specified tab page. Video playing in the specified tab page is thereby managed, the problem of the browser freezing because video playing occupies too many resources is reduced, and the black screen caused by directly releasing the resources occupied by video playing is avoided.
Referring to fig. 4, fig. 4 is a flowchart of responding to switching the specified tab page from the background to the foreground in the video playing management method. As an embodiment, the flow shown in fig. 4 may be applied to the display device described above together with the flows shown in fig. 2 and fig. 3.
Step 401, in response to switching the designated tab page from the background to the foreground, acquiring the recorded playing state of the video corresponding to each video object.
As an embodiment, if the designated tab page is switched to the background based on the method shown in fig. 3, when an instruction to switch the designated tab page from the background to the foreground is received, the playing state of the video corresponding to each video object may be obtained from the created second state machine.
Step 402, creating corresponding video resources for each video object according to the playing state of the video corresponding to each video object, so as to restore each video in the specified tab page to the playing state it was in before the page was switched to the background.
Optionally, after the playing state of the video corresponding to each video object is obtained, a media backend and a display resource may be created again for each video object, and the obtained playing state of the video corresponding to the video object is then synchronized to the media backend and the display resource, so that each video in the specified tab page is restored to the playing state it was in before the page was switched to the background.
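For illustration, a minimal sketch of steps 401 and 402, under the same assumptions as the earlier sketches (the recreateResources and restoreState helpers are hypothetical), might look as follows.

```typescript
// Hypothetical sketch of the foreground-switch handling in steps 401 and 402:
// read back each video's recorded state from the second state machine,
// re-create its media backend and display resource, and synchronise the
// recorded state to them.
function onTabSwitchedToForeground(
  videos: Array<{ id: number; recreateResources(): void; restoreState(state: string): void }>,
  secondStateMachine: Map<number, string>
): void {
  for (const video of videos) {
    const recorded = secondStateMachine.get(video.id);
    if (recorded === undefined) continue; // nothing was recorded for this video object
    video.recreateResources();            // step 402: create the media backend and display resource again
    video.restoreState(recorded);         // synchronise the recorded playing state to the new resources
  }
}
```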
The flow shown in fig. 4 is thus completed. With the embodiment shown in fig. 4, when the specified tab page is switched from the background to the foreground, each video in the specified tab page can be restored to the playing state recorded in the second state machine before the switch to the background, based on the playing state corresponding to each video object. This avoids the situation in which the resources occupied by video playing are released directly when the tab page is switched to another tab page or to the background, so that the video playing progress cannot be recovered.
The playing states of a video provided in the embodiment of the present application are described below. As shown in fig. 7, a video may have at least the following 8 playing states, which are saved in the second state machine:
1. Idle state: after the video elements in the specified tab page are parsed and one video object is created for each video element, the video is in the Idle state.
2. Initialized state: after the playing address of the video, i.e., its URL (Uniform Resource Locator), is passed to the player, the video transitions from the Idle state to the Initialized state.
3. Loaded state: before the video is played, the playing data needs to be preloaded. During preloading the video object may briefly be in a preparing state, in which a Seek operation may also be executed; after the Seek operation is executed, the preparing state is entered again. Once the playing data has been loaded, the video enters the Loaded state.
It should be noted that the playing data acquired by the player at this point is obtained from the playing address (URL) of the video mentioned above, and the Seek operation here may be used to jump to a specified position of the program.
4. Playing state: after the playing data has been loaded, if the AutoPlay (automatic playing) function is set, the video enters the Playing state immediately; if AutoPlay is not set, this state is entered by manually triggering playing, and a Seek operation can also be performed before entering this state.
5. Paused state: when a pause instruction is received during video playing, the video enters this state; when a playing instruction is received again, the video returns to the Playing state.
6. Finished state: when playing of the whole video is finished, the video enters the Finished state, at which point the logout of the video object may be triggered; a Seek operation may also be performed after the video enters this state.
7. Stopped state: in the Loaded, Playing, Paused and Finished states, if an instruction to stop playing the video is received, the video enters the Stopped state. If a video in the Stopped state needs to be played again, a resume operation must be executed to return to the Loaded state first and reload the playing data.
8. Error state: video playing may fail for various reasons, for example the current device does not support the audio/video format of the video, or the request to load the playing data times out. The Error state may occur at any point before the video object is released.
Further, the embodiment of the present application also provides states describing the life cycle of a video object created for a video element in the specified tab page. These states are stored in the first state machine and include a Created state and an Ended state. When an instruction to create a video object is received and the video object has been created, the video object enters the Created state; when an instruction to log out the video object is received, the video object enters the Ended state.
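The two sets of states can be summarised, for illustration only, as the following enumerations; the enum names are assumptions and the actual implementation is not limited to this representation.

```typescript
// Hypothetical sketch of the eight playing states kept in the second state
// machine and the two life-cycle states kept in the first state machine.
enum PlayingState {
  Idle,         // video object created, nothing loaded yet
  Initialized,  // playing address (URL) handed to the player
  Loaded,       // playing data preloaded
  Playing,      // playback in progress (AutoPlay or manual start)
  Paused,       // pause instruction received during playback
  Finished,     // playback reached the end
  Stopped,      // stop instruction received; reload required before replay
  Error,        // playback failed (unsupported format, load timeout, etc.)
}

enum LifecycleState {
  Created,  // video object has been created and registered
  Ended,    // video object has been logged out
}
```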
It should be noted that the above states are only provided for ease of understanding, and the present application does not limit the playing states that a video may have.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (10)

1. A display device, comprising:
a display;
a controller coupled with the display and configured to:
in response to a specified tab page being opened, parse the specified tab page to obtain each video element present in the specified tab page, create a corresponding video object for each video element, and create the video resources required for video playing for each video object;
and in response to the specified tab page being switched to the background, record the current playing state of the video corresponding to each video object, and release the video resources corresponding to each video object.
2. The display device of claim 1, wherein the controller is further configured to:
and in response to switching the designated label page from the background to the foreground, acquiring the recorded playing state of the video corresponding to each video object, and creating a corresponding video resource for each video object according to the playing state of the video corresponding to each video object, so as to recover the playing state of each video in the designated label page before being switched to the background.
3. The display device of claim 1, wherein the controller is further configured to:
and allocating a corresponding video identifier for each video object so as to record the registration of the video object through the video identifier, wherein the video identifier corresponding to each video is different.
4. The display device of claim 1, wherein the controller is further configured to:
and when the playing state of the video corresponding to any video object is identified to be the end state, releasing the video resource corresponding to the video object, and deleting the corresponding relation between the video object and the video identifier to log out the video object.
5. The display device according to any one of claims 1 to 4, wherein the video resources comprise a video backend (media backend) and a display resource;
the video backend is used to provide the system resources that realize video playing, and the display resource is used to display the pictures played by the video.
6. A video playing management method is characterized by comprising the following steps:
in response to a specified tab page being opened, parsing the specified tab page to obtain each video element present in the specified tab page, creating a corresponding video object for each video element, and creating the video resources required for video playing for each video object;
and in response to the specified tab page being switched to the background, recording the current playing state of the video corresponding to each video object, and releasing the video resources corresponding to each video object.
7. The method of claim 6, further comprising:
and in response to switching the designated label page from the background to the foreground, acquiring the recorded playing state of the video corresponding to each video object, and creating a corresponding video resource for each video object according to the playing state of the video corresponding to each video object, so as to recover the playing state of each video in the designated label page before being switched to the background.
8. The method of claim 6, further comprising:
and allocating a corresponding video identifier to each video object, so that the registration of the video object is recorded through the video identifier, wherein the video identifier corresponding to each video object is different.
9. The method of claim 8, further comprising:
and when the playing state of the video corresponding to any video object is identified as the end state, releasing the video resource corresponding to that video object, and deleting the correspondence between the video object and its video identifier, so as to deregister the video object.
10. The method according to any one of claims 6 to 9, wherein the video resources comprise a video back end (media backend) and a display resource;
wherein the video back end is used for providing the system resources for video playing, and the display resource is used for displaying the pictures played by the video.
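Claims 5 and 10 split a video resource into a video back end (the system resources that realize playback) and a display resource (the surface that shows the pictures). A purely illustrative TypeScript shape for that split, with interfaces the patent does not itself define:

```typescript
// Purely illustrative split of a video resource into its two parts.
interface VideoBackend {
  // system resources that realize playback: demuxer, decoder, audio sink, ...
  start(url: string): void;
  stop(): void;
}

interface DisplayResource {
  // the surface onto which the decoded pictures are displayed
  attach(backend: VideoBackend): void;
  detach(): void;
}

interface VideoResource {
  backend: VideoBackend;
  display: DisplayResource;
}
```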
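In a web runtime, the background/foreground transitions that drive the method of claims 6 and 7 could be observed through the standard Page Visibility API, as sketched below; the hook names are hypothetical and the patent does not mandate this detection mechanism.

```typescript
// Illustrative wiring of the two tab-page transitions to visibilitychange.
function wireTabLifecycle(hooks: {
  recordAndRelease: () => void;   // record playing state, release resources
  recreateAndRestore: () => void; // recreate resources, restore playing state
}): void {
  document.addEventListener("visibilitychange", () => {
    if (document.visibilityState === "hidden") {
      hooks.recordAndRelease();   // the tab page was switched to the background
    } else {
      hooks.recreateAndRestore(); // the tab page returned to the foreground
    }
  });
}
```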
CN202110691835.6A 2021-06-22 2021-06-22 Video playing management method and display equipment Pending CN113378092A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110691835.6A CN113378092A (en) 2021-06-22 2021-06-22 Video playing management method and display equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110691835.6A CN113378092A (en) 2021-06-22 2021-06-22 Video playing management method and display equipment

Publications (1)

Publication Number Publication Date
CN113378092A true CN113378092A (en) 2021-09-10

Family

ID=77578246

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110691835.6A Pending CN113378092A (en) 2021-06-22 2021-06-22 Video playing management method and display equipment

Country Status (1)

Country Link
CN (1) CN113378092A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104572950A (en) * 2014-12-30 2015-04-29 乐视致新电子科技(天津)有限公司 Memory release method for browser client and browser client
CN105828195A (en) * 2016-03-10 2016-08-03 乐视云计算有限公司 Suspending resuming method and apparatus of player
CN106528735A (en) * 2016-10-27 2017-03-22 北京小米移动软件有限公司 Method and device for controlling browser to play media resources
CN108429930A (en) * 2018-04-13 2018-08-21 小草数语(北京)科技有限公司 The control method and device of video playing in browser
CN110730384A (en) * 2018-07-17 2020-01-24 腾讯科技(北京)有限公司 Webpage control method and device, terminal equipment and computer storage medium
CN112948003A (en) * 2019-11-26 2021-06-11 成都鼎桥通信技术有限公司 Method and equipment for switching operating systems of dual-android terminal
CN112004125A (en) * 2020-08-20 2020-11-27 海信视像科技股份有限公司 Media resource playing method and display equipment

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113810777A (en) * 2021-09-16 2021-12-17 上海哔哩哔哩科技有限公司 Playing method and device
CN113810777B (en) * 2021-09-16 2024-03-01 上海哔哩哔哩科技有限公司 Playing method and device
CN114071212A (en) * 2021-11-15 2022-02-18 北京字节跳动网络技术有限公司 Information display processing method and device and electronic equipment

Similar Documents

Publication Publication Date Title
CN111447498A (en) Awakening method of display equipment and display equipment
CN111294643A (en) Method for displaying audio track language in display device and display device
WO2021169168A1 (en) Video file preview method and display device
CN111601142B (en) Subtitle display method and display equipment
CN111726673B (en) Channel switching method and display device
CN111654743B (en) Audio playing method and display device
CN111343492B (en) Display method and display device of browser in different layers
CN113378092A (en) Video playing management method and display equipment
CN112272373B (en) Bluetooth device type switching method and display device
CN111526401B (en) Video playing control method and display equipment
CN112040308A (en) HDMI channel switching method and display device
CN111726674A (en) HbbTV application starting method and display equipment
CN112040285B (en) Interface display method and display equipment
CN111885415B (en) Audio data rapid output method and display device
CN111405329B (en) Display device and control method for EPG user interface display
CN113010074A (en) Webpage Video control bar display method and display equipment
CN112004127A (en) Signal state display method and display equipment
CN111614995A (en) Menu display method and display equipment
CN111757160A (en) Method for starting sports mode and display equipment
CN113382291A (en) Display device and streaming media playing method
CN112040317B (en) Event response method and display device
CN111246282B (en) Program information acquisition method in display equipment and display equipment
CN111901686B (en) Method for keeping normal display of user interface stack and display equipment
CN111405332B (en) Display device and control method for EPG user interface display
CN111562887B (en) Display device and partition capacity expansion method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210910