CN117812362A - Display equipment and playing dynamic memory recycling method - Google Patents

Display equipment and playing dynamic memory recycling method

Info

Publication number
CN117812362A
CN117812362A (application CN202310532841.6A)
Authority
CN
China
Prior art keywords
data
buffer queue
frame
time point
media
Prior art date
Legal status
Pending
Application number
CN202310532841.6A
Other languages
Chinese (zh)
Inventor
汤雯
廖院松
白向军
Current Assignee
Vidaa Netherlands International Holdings BV
Original Assignee
Vidaa Netherlands International Holdings BV
Priority date
Filing date
Publication date
Application filed by Vidaa Netherlands International Holdings BV filed Critical Vidaa Netherlands International Holdings BV
Priority claimed from CN202310532841.6A
Publication of CN117812362A
Legal status: Pending

Abstract

Some embodiments of the present application provide a display device and a playback dynamic memory reclamation method. In response to an instruction to play media data, the method reads the media data from a buffer queue and records the target time point at which target data is read, where the buffer queue holds cached media data and the target data is the media data with the most recent reading time point. Whenever data is added to the buffer queue, the data frame type of the target data is detected. If the frame is an intra-coded frame, the media data in the buffer queue earlier than the target time point is released; if it is a forward-predictive coded frame, the buffer queue is searched for an intra-coded frame earlier than the target time point, and the media data earlier than that intra-coded frame's time point is released. By determining the release position from the currently read media data and reclaiming the buffered media data accordingly, the method reduces memory usage during playback and improves the efficiency of media data playback.

Description

Display equipment and playing dynamic memory recycling method
Technical Field
The present disclosure relates to the field of display devices, and in particular, to a display device and a method for recovering a playing dynamic memory.
Background
A display device is a terminal device capable of outputting a specific display picture, such as a smart television, a communication terminal, a smart advertising screen, or a projector. Taking the smart television as an example: built on Internet application technology, it has an open operating system and chip as well as an open application platform, supports bidirectional human-machine interaction, and integrates functions such as audio-visual content, entertainment, and data services to meet the diversified and personalized needs of users.
The display device may download playback data through its browser to play the corresponding media assets. For example, when the browser plays media assets in Media Source Extensions (MSE) mode, it downloads the playback data and then actively pushes it via JavaScript. As a result, the display device continuously adds playback data to a buffer queue while the media is playing.
However, as playback data accumulates in the buffer queue, the memory occupied by the queue grows. Excessive memory usage increases the consumption of system resources on the display device, slowing its response when playing media data and reducing playback efficiency.
Disclosure of Invention
The present application provides a display device and a playback dynamic memory reclamation method, which address the reduced efficiency of the display device when playing media data.
In a first aspect, some embodiments of the present application provide a display device including a display and a controller. Wherein the display is configured to display a user interface and the controller is configured to perform the following program steps:
responding to an instruction to play media data by reading the media data of the buffer queue and recording a target time point at which target data is read; the buffer queue holds cached media data, and the target data is the media data with the most recent reading time point;
detecting the data frame type of the target data when data is added to the buffer queue;
if the data frame type is an intra-coded frame, releasing the media data in the buffer queue that is earlier than the target time point;
and if the data frame type is a forward-predictive coded frame, querying the buffer queue for an intra-coded frame earlier than the target time point, and releasing the media data in the buffer queue that is earlier than the time point corresponding to that intra-coded frame.
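To illustrate the two branches above, the following is a minimal sketch (in Python; the frame structure and function names are illustrative, not taken from the patent) of how a release position might be derived from the frame type of the most recently read data:

```python
from dataclasses import dataclass

# Hypothetical frame types: the patent distinguishes intra-coded (I) frames
# from forward-predictive coded (P) frames.
I_FRAME = "I"
P_FRAME = "P"

@dataclass
class MediaFrame:
    time_point: float   # presentation time point of the frame
    frame_type: str     # I_FRAME or P_FRAME

def reclaim(buffer_queue, target):
    """Release frames earlier than the computed release position.

    `buffer_queue` is a list of MediaFrame ordered by time point;
    `target` is the most recently read frame.
    """
    if target.frame_type == I_FRAME:
        # I-frame: everything before the target time point can be freed,
        # because an I-frame decodes without reference to earlier frames.
        release_point = target.time_point
    else:
        # P-frame: retain data back to the nearest earlier I-frame, since
        # the P-frame depends on it for decoding.
        earlier_i = [f for f in buffer_queue
                     if f.frame_type == I_FRAME and f.time_point < target.time_point]
        if not earlier_i:
            return buffer_queue  # no safe release position found
        release_point = max(f.time_point for f in earlier_i)
    return [f for f in buffer_queue if f.time_point >= release_point]
```

Note that in the P-frame branch the nearest preceding I-frame is itself retained, so the remaining buffered data stays decodable.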
In some embodiments, when reading the media asset data of the buffer queue, the controller is further configured to download the media asset data according to its network transport protocol; cache the media asset data into the buffer queue; write the media asset data of the buffer queue into a sharing module through the rendering process of a browser, where the sharing module is a storage module with a preset memory capacity; and read the media asset data from the sharing module through the main process of the browser.
In some embodiments, the controller is further configured to transfer the media asset data read by the main process to a middleware layer so as to write it into a playback platform of the browser; monitor the amount of media asset data in the playback platform; and when the amount of data is greater than or equal to a data amount threshold, send the media asset data to a decoding queue and decode the media asset data in the decoding queue.
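The threshold check described above can be sketched as follows (Python; the class, queue, and threshold names are illustrative, not from the patent):

```python
from collections import deque

DATA_AMOUNT_THRESHOLD = 4096  # illustrative threshold, in bytes

class PlaybackPlatform:
    """Accumulates written media data and forwards it to a decoding
    queue once the buffered amount reaches the threshold."""

    def __init__(self, threshold=DATA_AMOUNT_THRESHOLD):
        self.threshold = threshold
        self.pending = deque()       # data written but not yet sent to decode
        self.decode_queue = deque()  # data waiting to be decoded

    def write(self, chunk: bytes):
        self.pending.append(chunk)
        # Monitor the buffered amount; flush to the decode queue when it
        # reaches or exceeds the threshold.
        if sum(len(c) for c in self.pending) >= self.threshold:
            while self.pending:
                self.decode_queue.append(self.pending.popleft())
```

Gating the decoder behind a data-amount threshold in this way avoids starting decode work on fragments too small to decode smoothly.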
In some embodiments, the controller is configured to monitor the playback progress of the media asset data and a buffering time point, where the buffering time point is the time point corresponding to the most recently buffered media asset data in the buffer queue; obtain the playback time point of the media asset data from the playback progress; and output a buffering time range from the playback time point and the buffering time point.
In some embodiments, when outputting the buffering time range from the playback time point and the buffering time point, the controller is configured to set the playback time point as the left value of the buffering time range and the buffering time point as the right value; and return the buffering time range.
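In interval terms, the returned buffering time range is simply [playback time point, buffering time point]. A sketch (the function name is ours):

```python
def buffer_time_range(play_time_point: float, buffer_time_point: float) -> tuple:
    """Return the buffering time range reported to the player.

    The left value is the current playback time point (data earlier than
    this may already have been reclaimed), and the right value is the
    time point of the most recently buffered media data.
    """
    if play_time_point > buffer_time_point:
        raise ValueError("playback has overtaken the buffered data")
    return (play_time_point, buffer_time_point)
```

After reclamation, the left value tracks the playback position rather than the start of the stream, reflecting that earlier data is no longer held in memory.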
In some embodiments, when releasing the media data in the buffer queue earlier than the target time point, the controller is configured to query the target position of the target time point in the buffer queue, where the media data in the buffer queue is ordered by its data time points such that positions toward the rear of the queue hold earlier data time points than positions toward the front; and release the media data located behind the target position in the buffer queue.
In some embodiments, when querying the buffer queue for an intra-coded frame earlier than the target time point, the controller is configured to query the target position of the target time point in the buffer queue, where the media data in the buffer queue is ordered by its data time points such that positions toward the rear of the queue hold earlier data time points than positions toward the front; query the read data located behind the target position in the buffer queue; detect the coded-frame type of that read data; and mark the intra-coded frame closest to the target position within the read data as a reclaim frame, where the reclaim frame is used to determine the release position of the read data.
In some embodiments, when releasing the media asset data in the buffer queue earlier than the time point corresponding to the intra-coded frame, the controller is configured to obtain the reclaim-frame position of the reclaim frame in the buffer queue, and release the read data located behind the reclaim-frame position in the buffer queue.
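Under the ordering described above (positions toward the rear hold earlier time points), both release branches reduce to truncating the queue after a position. A sketch, with illustrative data structures of our own choosing:

```python
def find_reclaim_position(queue, target_pos):
    """Scan the read data behind `target_pos` (rear = earlier time points)
    and return the index of the intra-coded frame closest to the target
    position, or None if there is none.

    `queue` is a list of (time_point, frame_type) tuples, front to rear.
    """
    for i in range(target_pos + 1, len(queue)):
        if queue[i][1] == "I":
            return i  # first I-frame behind the target is the closest one
    return None

def release_behind(queue, pos):
    """Release everything behind `pos`, keeping queue[0..pos]."""
    return queue[:pos + 1]
```

For example, with the target at the front of the queue, the reclaim frame is the first I-frame encountered while walking toward the rear, and everything behind it is released.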
In some embodiments, the controller is further configured to receive a data reclamation instruction for deleting data; detect the data frame type of the target data in response to the data reclamation instruction; if the data frame type is an intra-coded frame, release the media data in the buffer queue earlier than the target time point; and if the data frame type is a forward-predictive coded frame, query the buffer queue for an intra-coded frame earlier than the target time point and release the media data earlier than the time point corresponding to that intra-coded frame.
In a second aspect, some embodiments of the present application further provide a playback dynamic memory reclamation method, including:
responding to an instruction to play media data by reading the media data of the buffer queue and recording a target time point at which target data is read; the buffer queue holds cached media data, and the target data is the media data with the most recent reading time point;
detecting the data frame type of the target data when data is added to the buffer queue;
if the data frame type is an intra-coded frame, releasing the media data in the buffer queue that is earlier than the target time point;
and if the data frame type is a forward-predictive coded frame, querying the buffer queue for an intra-coded frame earlier than the target time point, and releasing the media data in the buffer queue that is earlier than the time point corresponding to that intra-coded frame.
According to the above technical solutions, the display device and the playback dynamic memory reclamation method provided by some embodiments of the present application can, in response to an instruction to play media data, read the media data from the buffer queue and record the target time point at which the target data is read, where the buffer queue holds cached media data and the target data is the media data with the most recent reading time point. Whenever data is added to the buffer queue, the data frame type of the target data is detected. If the frame is an intra-coded frame, the media data in the buffer queue earlier than the target time point is released; if it is a forward-predictive coded frame, the buffer queue is searched for an intra-coded frame earlier than the target time point, and the media data earlier than that frame's time point is released. By determining the release position from the currently read media data and reclaiming the buffered media data accordingly, the method reduces memory usage during playback and improves the efficiency with which the display device plays media data.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic diagram of an operation scenario between a display device and a control device according to some embodiments of the present application;
fig. 2 is a schematic hardware configuration diagram of a display device according to some embodiments of the present application;
fig. 3 is a schematic hardware configuration diagram of a control device according to some embodiments of the present application;
fig. 4 is a schematic software configuration diagram of a display device according to some embodiments of the present application;
FIG. 5 is a schematic diagram of an application program of a display device according to some embodiments of the present application;
FIG. 6 is a schematic diagram illustrating a data storage effect of a buffer queue when memory reclamation is not performed according to some embodiments of the present disclosure;
FIG. 7 is a schematic diagram illustrating a data storage effect of a buffer queue after performing memory reclamation according to some embodiments of the present disclosure;
FIG. 8 is a flowchart illustrating a playback dynamic memory recycling method according to some embodiments of the present disclosure;
FIG. 9 is a flow chart of downloading, reading and decoding media asset data according to some embodiments of the present application;
FIG. 10 is a flow chart of MSE data provided in some embodiments of the present application;
FIG. 11 is a flow chart of memory reclamation when the data frame type provided in some embodiments of the present application is an intra-frame encoded frame;
FIG. 12 is a flowchart illustrating memory reclamation when the data frame type is an intra-coded frame according to some embodiments of the present application;
FIG. 13 is a flow chart of memory reclamation when the data frame type provided in some embodiments of the present application is a forward predictive coding frame;
FIG. 14 is a flowchart illustrating memory reclamation when the data frame type is a forward predictive encoded frame according to some embodiments of the present application;
FIG. 15 is a schematic diagram of data storage in a buffer queue after performing a playback dynamic memory reclamation method according to some embodiments of the present application;
FIG. 16 is a diagram illustrating the effect of buffering time ranges during non-execution of memory reclamation according to some embodiments of the present disclosure;
FIG. 17 is a schematic diagram illustrating the effect of the buffer time range after the playback dynamic memory reclamation method according to some embodiments of the present application;
Fig. 18 is a flowchart illustrating a method for releasing data from a buffer queue according to some embodiments of the present application.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the exemplary embodiments of the present application more apparent, the technical solutions in the exemplary embodiments of the present application will be clearly and completely described below with reference to the drawings in the exemplary embodiments of the present application, and it is apparent that the described exemplary embodiments are only some embodiments of the present application, but not all embodiments.
All other embodiments obtainable by one of ordinary skill in the art without undue burden, based on the exemplary embodiments shown in the present application, are intended to fall within the scope of the present application. Furthermore, while the disclosure is presented in terms of one or more exemplary embodiments, it should be understood that individual aspects of the disclosure may also constitute a complete technical solution on their own.
It should be understood that the terms "first," "second," "third," and the like in the description, in the claims, and in the above figures are used to distinguish between similar objects and not necessarily to describe a particular sequential or chronological order. It is to be understood that data so used may be interchanged where appropriate, so that, for example, the embodiments of the present application can be implemented in orders other than those illustrated or described herein.
Furthermore, the terms "comprise" and "have," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to those elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The display device provided in the embodiment of the application may have various implementation forms, for example, may be a television, an intelligent television, a laser projection device, a display (monitor), an electronic whiteboard (electronic bulletin board), an electronic desktop (electronic table), and the like. Fig. 1 and 2 are specific embodiments of a display device of the present application.
Fig. 1 is a schematic diagram of an operation scenario between a display device and a control apparatus according to an embodiment. As shown in fig. 1, a user may operate the display device 200 through the smart device 300 or the control apparatus 100.
In some embodiments, the control apparatus 100 may be a remote controller, which communicates with the display device through infrared protocol communication, Bluetooth protocol communication, or other short-range communication modes, and controls the display device 200 wirelessly or by wire. The user may control the display device 200 by inputting user instructions through keys on the remote control, voice input, control panel input, and the like.
In some embodiments, a smart device 300 (e.g., mobile terminal, tablet, computer, notebook, etc.) may also be used to control the display device 200. For example, the display device 200 is controlled using an application running on a smart device.
In some embodiments, the display device may receive instructions without using the smart device or control apparatus described above, and may instead be controlled by the user through touch, gestures, or the like.
In some embodiments, the display device 200 may also be controlled in a manner other than through the control apparatus 100 and the smart device 300. For example, the user's voice commands may be received directly through a module configured inside the display device 200 for acquiring voice commands, or through a voice control apparatus configured outside the display device 200.
In some embodiments, the display device 200 is also in data communication with a server 400. The display device 200 may establish a communication connection via a local area network (LAN), a wireless local area network (WLAN), or other networks. The server 400 may provide various contents and interactions to the display device 200. The server 400 may be one cluster or multiple clusters, and may include one or more types of servers.
Fig. 2 exemplarily shows a block diagram of a configuration of the control apparatus 100 in accordance with an exemplary embodiment. As shown in fig. 2, the control device 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory, and a power supply. The control apparatus 100 may receive an input operation instruction of a user and convert the operation instruction into an instruction recognizable and responsive to the display device 200, and function as an interaction between the user and the display device 200.
As shown in fig. 3, the display apparatus 200 includes at least one of a modem 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, and a user interface.
In some embodiments the controller includes a processor, a video processor, an audio processor, a graphics processor, RAM, ROM, a first interface for input/output to an nth interface.
The display 260 includes a display screen component for presenting pictures and a driving component for driving image display, and is used to receive image signals output from the controller and display video content, image content, menu manipulation interfaces, and user manipulation UI interfaces.
The display 260 may be a liquid crystal display, an OLED display, a projection device, or a projection screen.
The communicator 220 is a component for communicating with external devices or servers according to various communication protocol types. For example: the communicator may include at least one of a Wifi module, a bluetooth module, a wired ethernet module, or other network communication protocol chip or a near field communication protocol chip, and an infrared receiver. The display apparatus 200 may establish transmission and reception of control signals and data signals with the control device 100 or the server 400 through the communicator 220.
A user interface, which may be used to receive control signals from the control device 100 (e.g., an infrared remote control, etc.).
The detector 230 is used to collect signals of the external environment or interaction with the outside. For example, detector 230 includes a light receiver, a sensor for capturing the intensity of ambient light; alternatively, the detector 230 includes an image collector such as a camera, which may be used to collect external environmental scenes, user attributes, or user interaction gestures, or alternatively, the detector 230 includes a sound collector such as a microphone, or the like, which is used to receive external sounds.
The external device interface 240 may include, but is not limited to, the following: high Definition Multimedia Interface (HDMI), analog or data high definition component input interface (component), composite video input interface (CVBS), USB input interface (USB), RGB port, etc. The input/output interface may be a composite input/output interface formed by a plurality of interfaces.
The modem 210 receives broadcast television signals in a wired or wireless manner and demodulates audio-video signals, as well as EPG data signals and the like, from the plurality of wireless or wired broadcast television signals.
In some embodiments, the controller 250 and the modem 210 may be located in separate devices, i.e., the modem 210 may also be located in an external device to the main device in which the controller 250 is located, such as an external set-top box or the like.
The controller 250 controls the operation of the display device and responds to the user's operations through various software control programs stored on the memory. The controller 250 controls the overall operation of the display apparatus 200. For example: in response to receiving a user command to select a UI object to be displayed on the display 260, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments the controller includes at least one of a central processing unit (Central Processing Unit, CPU), video processor, audio processor, graphics processor (Graphics Processing Unit, GPU), RAM (Random Access Memory, RAM), ROM (Read-Only Memory, ROM), first to nth interfaces for input/output, a communication Bus (Bus), etc.
The user may input a user command through a Graphical User Interface (GUI) displayed on the display 260, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface recognizes the sound or gesture through the sensor to receive the user input command.
A "user interface" is a media interface for interaction and exchange of information between an application or operating system and a user, which enables conversion between an internal form of information and a user-acceptable form. A commonly used presentation form of the user interface is a graphical user interface (Graphic User Interface, GUI), which refers to a user interface related to computer operations that is displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in a display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
Fig. 4 illustrates a software configuration diagram of a display device 200 according to some embodiments. In some embodiments, as shown in fig. 4, the system of the display device may include a kernel, a command parser (shell), a file system, and application programs. The kernel, shell, and file system together form the basic operating system architecture, allowing users to manage files, run programs, and use the system. After power-up, the kernel starts, activates kernel space, abstracts the hardware, initializes hardware parameters, and operates and maintains virtual memory, the scheduler, signals, and inter-process communication (IPC). After the kernel has started, the shell and user applications are loaded. An application is compiled into machine code after being started and forms a process.
As shown in fig. 4, the system of the display device is divided into three layers, an application layer, a middleware layer, and a hardware layer, from top to bottom. In some embodiments, the system of the display device further includes a UI layer (not shown in the figure) located above the application layer, the UI layer receiving data transmissions of the application layer to enable a visual presentation of the display 260.
The application layer mainly includes common applications on the television, together with an application framework (Application Framework). The common applications are mainly applications developed based on the Browser, such as HTML5 apps, as well as native applications (Native APPs).
The application framework (Application Framework) is a complete program model with all the basic functions required by standard application software, such as file access and data exchange, together with the interfaces for using these functions (toolbar, status bar, menu, dialog box).
Native applications (Native APPs) may support online or offline, message pushing, or local resource access.
The middleware layer includes middleware such as various television protocols, multimedia protocols, and system components. The middleware can use basic services (functions) provided by the system software to connect various parts of the application system or different applications on the network, so that the purposes of resource sharing and function sharing can be achieved.
The hardware layer mainly comprises a HAL interface, hardware and a driver, wherein the HAL interface is a unified interface for all the television chips to be docked, and specific logic is realized by each chip. The driving mainly comprises: audio drive, display drive, bluetooth drive, camera drive, WIFI drive, USB drive, HDMI drive, sensor drive (e.g., fingerprint sensor, temperature sensor, pressure sensor, etc.), and power supply drive, etc.
Fig. 5 is a schematic diagram of an application program of a display device according to some embodiments of the present application, where, as shown in fig. 5, an application program layer includes at least one icon control that an application program may display in a display, for example: a live television application icon control, an online play application icon control, a media center application icon control, an application center icon control, a game application icon control, and the like.
In some embodiments, the online playing application may acquire media asset data from different storage sources through the Browser on which the application is based, and then decode the acquired media asset data to play it online through the online playing application. For example, the browser of an online playing application may play media asset data using MSE (Media Source Extensions): the browser downloads the media asset data and then actively pushes it through JS (JavaScript) to realize playback of the media asset data.
In some embodiments, the live television application may provide live television via different signal sources. For example, a live television application may provide television signals using inputs from cable television, radio broadcast, satellite services, or other types of live television services. And, the live television application may display media data of the live television signal on the display device 200.
In some embodiments, the media center application may provide playback of various kinds of multimedia content. For example, the media center may provide services other than live television or video on demand, and the user may access various images or audio through the media center application.
In some embodiments, an application center may be provided to store various applications. The application may be a game, an application, or some application that is associated with a computer system or other device but that may be run in a smart television. The application center may obtain these applications from different sources, store them in local storage, and then run on display device 200.
In some embodiments, the Browser's processes in the application include a Browser process, a Renderer process, a plug-in process, and the like. The Browser process is the main process of the browser, used for coordination and control, and is internally multithreaded. For example, the Browser process implements the browser's interface display, interaction with the user, address-bar entry, and forward and backward navigation; it also manages the various interfaces, creates and destroys processes, and handles file storage. The Renderer process is the browser kernel and is internally multithreaded; it is used for interface rendering, script execution, event handling, and so on. Plug-in processes are created only when plug-ins are used, one process per plug-in type.
In some embodiments, the browser's processes in the application program also include a GPU process, a web process, and an audio process. The GPU process is used for 3D drawing, and the rendering of elements of the 3D drawing can be turned to the GPU by the CPU, namely the GPU is started to accelerate. The network process is used for loading network resources of the interface, can be operated as a module in the browser, and can also be independently a single process. The audio process is used for audio management of the browser.
In some embodiments, the display device 200 may communicate with the server 400 according to a control instruction input by a user, to implement interaction of media asset data. The user may send control instructions to the display device 200 to launch a media playback application of the display device 200, such as an online playback application, or the like. After the media asset playing application is started, the display device 200 displays an application interface of the media asset playing application. The application interface of the media resource playing application can comprise a plurality of media resource items, each media resource item corresponds to a network link, and the network link is a uniform resource locator of media resource data corresponding to the media resource item.
Obviously, the display apparatus 200 may request the media asset data from the server 400 through the network link corresponding to the media asset item. That is, in some embodiments, during the process of the display apparatus 200 communicating with the server 400, the display apparatus 200 may transmit an acquisition request including a network link to the server 400 to acquire media asset data corresponding to the media asset item. After receiving the acquisition request sent by the display device 200, the server 400 extracts the corresponding media asset data according to the acquisition request, and returns the extracted media asset data to the display device 200.
To enable data interaction between the display device 200 and the server 400, in some embodiments, the display device 200 establishes a communication connection with the server 400 through a browser of an application to establish a transmission channel for media data. For example, both the display device 200 and the server 400 may be connected to the internet and the interaction data may be transferred between the display device 200 and the server 400 according to an internet transmission protocol.
It should be noted that, the display device 200 and the server 400 may also use other connection methods to establish a communication connection relationship. Such as wired broadband, wireless local area network, cellular network, bluetooth, infrared, radio frequency communications, etc.
In some embodiments, the display device 200 and the server 400 are in a "many-to-one" connection, i.e., multiple display devices 200 may establish a communication connection with the same server 400, such that the server 400 may provide services to multiple display devices 200. Alternatively, a "many-to-many" connection relationship may be provided between the display device 200 and the server 400, that is, multiple display devices 200 may establish communication connections with multiple servers 400, so that the multiple servers 400 may respectively provide different services for the display devices 200. Obviously, in some application scenarios, there may also be a "one-to-one" connection between the display device 200 and the server 400, i.e., one server 400 is dedicated to serving one display device 200.
Based on the above-described display apparatus 200, the display apparatus 200 can acquire the media asset data from the server 400 in real time through the browser, and continuously form a play screen of the media asset data through processing such as decoding. Also, in some embodiments, when the playing process of the media asset data is interrupted, the display apparatus 200 may automatically record the playing progress of the media asset data to generate a history. The history record can enable the user to directly continue playing according to the recorded playing progress when playing the media data again so as to meet the requirement of the user for continuing to watch.
In some embodiments, the data frames of the media data include I frames (intra-coded frames) and P frames (forward predictive coded frames). Wherein I frames appear in different forms in different codecs, e.g., IDR, CRA, or BLA. An I frame consists only of intra-predicted macroblocks; each macroblock in the frame can only be matched to other macroblocks within the same frame, and can only be compressed through intra-frame spatial redundancy, i.e., through the similarity between pixels of a single frame. Different types of I frames are essentially identical, and involve no temporal prediction. Thus, I frames can be encoded and decoded independently. A P frame represents a predicted frame, which may be compressed by temporal prediction in addition to spatial prediction. A P frame requires motion estimation with reference to the previous frame, and each macroblock in a P frame can be temporally predicted, spatially predicted, or skipped. Wherein skip instructs the decoder to copy the co-located macroblock from the previous frame, i.e., with zero motion.
In some embodiments, the insertion of an I-frame indicates the end of a GOP (Group of Pictures) or media asset segment. Compression of the I-frame does not rely on previous frame coding, so the video quality can be refreshed. Thus, I frames can be used to preserve the quality of the media asset data. After encoding a high-quality I-frame, the encoder may then use the I-frame as a reference picture and compress P frames based on that reference picture. Thus, if a P frame is corrupted, all other frames that depend on it cannot be fully decoded, directly resulting in a failure of the media asset data. However, when the corrupted media asset data stream arrives at an I-frame, the media asset data can be recovered from the I-frame, since the I-frame is independently encoded and decoded.
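The dependency chain described above can be sketched with a simplified model (an illustration only, not the embodiment's decoder: real codecs track reference pictures explicitly, whereas here each P frame is assumed to depend only on its immediately preceding frame):

```python
def decodable(frames, corrupted):
    """frames: list of frame types ("I" or "P") in stream order.
    corrupted: set of indices whose data is damaged.
    Returns, per frame, whether it can be fully decoded: an I frame
    depends only on itself, while a P frame also requires the
    previous frame to have decoded successfully."""
    ok = []
    prev_ok = False
    for i, kind in enumerate(frames):
        if kind == "I":
            prev_ok = i not in corrupted
        else:
            prev_ok = prev_ok and i not in corrupted
        ok.append(prev_ok)
    return ok

# Frame 2 is damaged: frames 2 and 3 fail, but the independently
# decodable I frame at index 4 recovers the stream.
print(decodable(["I", "P", "P", "P", "I", "P"], {2}))
# [True, True, False, False, True, True]
```

This mirrors the recovery behavior described above: corruption propagates through dependent P frames and stops at the next I frame.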
Also, in some embodiments, the display device 200 also monitors the status of the media asset items. When the media asset item is in the selected state, the display device 200 generates a play command of the media asset data corresponding to the media asset item, and sends an acquisition request of the media asset data to the server 400 through the browser of the application program in response to the play command, and establishes a transmission channel of the media asset data to load the media asset data corresponding to the media asset item.
In order to play the media asset data in the application program, in some embodiments, after the display device 200 downloads the media asset data through the browser, the media asset data is cached in the buffer queue, and the media asset data is then read through the Browser process of the browser, so as to decode and play the media asset data. Therefore, in order to continuously play the media asset data through the browser, the display device 200 continuously adds media asset data to the buffer queue while playing the media asset data. Therefore, the data volume of the media asset data in the buffer queue keeps increasing.
As shown in fig. 6, with the increase of the media data in the buffer queue and the progress of playing the media data, more read data and unread data exist in the buffer queue, so that the memory occupied by the buffer queue also increases. Wherein the read data includes played data and unplayed data. The played data is the media data that has been played by the current display device 200, and the unplayed data is the media data that has been read by the display device 200 but not yet played. Accordingly, to reduce the memory usage of the buffer queue in the display device 200, in some embodiments, the display device 200 also monitors the data volume of the media asset data in the buffer queue. When the data volume of the media data in the buffer queue is greater than or equal to a critical value, GC (Garbage Collection) is performed on the buffer queue to release the played data in the buffer queue, so as to reduce the memory occupation of the buffer queue.
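The threshold-triggered reclamation described here can be sketched as follows (a minimal illustration; representing frames as bare time points and the particular critical value are assumptions, not values from the embodiment):

```python
def threshold_gc(queue, played_upto, critical_value):
    """Conventional GC for the buffer queue: runs only when the number
    of cached frames reaches the critical value, and then frees only
    frames whose time point is at or before the playback position."""
    if len(queue) < critical_value:
        return queue  # threshold not reached: nothing is reclaimed
    return [pts for pts in queue if pts > played_upto]

# Five cached frame time points, playback at 2.0 s, critical value 5:
queue = [1.0, 1.5, 2.0, 2.5, 3.0]
print(threshold_gc(queue, played_upto=2.0, critical_value=5))
# [2.5, 3.0]
print(threshold_gc(queue[:4], played_upto=2.0, critical_value=5))
# [1.0, 1.5, 2.0, 2.5]  (below the critical value, unchanged)
```

The sketch makes the drawback visible: until the critical value is hit, played frames stay cached and keep occupying memory.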
However, since the memory reclamation of the buffer queue can be triggered only when the data amount of the buffer queue reaches the critical value, more played media data is still cached in the buffer queue during the playing process of the media data by the display device 200, so that a larger memory occupation is generated during the playing process of the media data.
In addition, when the buffer queue triggers memory reclamation, only the played data can be released; as shown in fig. 7, the read data and the unread data still remain in the buffer queue when memory reclamation is performed. And since the read data has been read by the display device 200, it is also stored in other memory blocks of the display device 200. Therefore, the read data in the buffer queue is useless data that is repeatedly cached, which increases the memory occupation of the media data in the buffer queue. The increase of the memory occupation of the buffer queue affects the response speed of the media data, increases the hardware memory requirement of the display device 200, and affects the playing efficiency and the platform applicability of the media data in the display device 200.
Based on the above application scenario, in order to improve the problem of efficiency degradation when the display device 200 plays media data, some embodiments of the present application provide a playing dynamic memory recycling method, as shown in fig. 8, including the following steps:
S100: and responding to the playing instruction of the media data, reading the media data of the buffer queue, and recording the target time point of reading the target data.
After receiving the play command of the media asset data, the display device 200 responds to the play command, downloads the media asset data through the browser and caches the media asset data into the buffer queue before playing the media asset data. After playing the media data, the media data in the buffer queue is read, and the time point of the current reading target data is recorded in real time to serve as the target time point. The buffer queue comprises cached media data, and the target data is media data with the latest reading time point.
In some embodiments, the display device 200 downloads the media asset data according to the network transport protocol of the media asset data as it reads the media asset data of the buffer queue. For example, the display device 200 may establish a communication connection with the server 400 through a browser of the media asset playing application based on the internet, and the browser requests the server 400 to download the media asset data according to an internet transmission protocol of the media asset data. As shown in fig. 9, after the display device 200 downloads the media asset data, the media asset data is buffered in a buffer queue. And writing the media data of the buffer queue into the sharing module based on the rendering process of the browser. The sharing module is a storage module with preset memory capacity and is used for data interaction of two processes in the browser. After the display device 200 writes the media asset data into the sharing module, the media asset data of the sharing module is read through the main process of the browser, so as to perform decoding and playing on the media asset data.
For example: as shown in fig. 10, fig. 10 is a schematic flow chart of MSE data provided in some embodiments of the present application, where BufferQueue is the buffer queue and ShareMem is the sharing module in fig. 10. After the media asset playing application selects the media asset item of the media asset data, the display device 200 generates a playing instruction of the media asset data, downloads the media asset data in response to the playing instruction, and caches the media asset data in the BufferQueue. And the media data of the BufferQueue is written into ShareMem through the Render process. The memory size of ShareMem is able to store 7.2 s of data frames, and is fixed during the playing process. After writing the media data of the BufferQueue into ShareMem, the media data of ShareMem is read through the Browser process to execute decoding and playing of the media data.
In addition, in order to ensure the smoothness of playing the media data, when decoding and playing the media data, the display device 200 needs to buffer a certain amount of media data on the playing platform before decoding. Therefore, in some embodiments, after the display device 200 reads the media data of the sharing module through the main process, the media data read by the main process is also transferred to the middleware layer, so as to write the media data into the playing platform of the browser. And the data volume of the media data in the playing platform is monitored. When the data volume is greater than or equal to a data volume threshold, the media asset data is sent to a decoding queue, and the media asset data of the decoding queue is decoded.
For example: as shown in fig. 10, the DecodeQueue in fig. 10 is the decode queue. After writing the media asset data to ShareMem through the Render process, the display device 200 reads the media asset data into the DecodeQueue through the Browser process. The DecodeQueue is a memory area with a fixed playing length, and can store 3 s of data frames. After buffering the 3 s of data frames, the display device 200 starts decoding the DecodeQueue through the decoder, so as to continuously display the playing pictures of the media asset data in the display 260.
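The gating of decoding on a minimum buffered duration can be sketched as follows (the 3 s threshold mirrors the example above; representing the decode queue as an ascending list of time points is an assumption for illustration):

```python
DECODE_THRESHOLD_S = 3.0  # assumed threshold, mirroring the 3 s example


def ready_to_decode(decode_queue):
    """decode_queue: frame time points in ascending order.
    Decoding starts only once the buffered span reaches the threshold."""
    if not decode_queue:
        return False
    return decode_queue[-1] - decode_queue[0] >= DECODE_THRESHOLD_S


print(ready_to_decode([10.0, 11.0, 12.0]))  # False: only 2 s buffered
print(ready_to_decode([10.0, 11.5, 13.0]))  # True: 3 s buffered
```

Only when this predicate becomes true would the decoder be started, keeping playback smooth at the cost of a small startup delay.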
S200: when the buffer queue adds data, the data frame type of the target data is detected.
When the display device 200 reads the media data in the buffer queue, the media data is added to the buffer queue continuously along with the playing process of the media data. Therefore, in order to reduce the memory occupation of the buffer queue, when the buffer queue adds data, the display device 200 detects whether there is recyclable media data in the buffer queue. For example, as can be seen from fig. 10, a portion of the data in the buffer queue has been read by the display device 200 and buffered in the sharing module and the decoding queue. Therefore, the media data cached in the sharing module and the decoding queue in the buffer queue can be released, and the memory occupation of the buffer queue is reduced. However, since the data frame of the media asset data includes a plurality of types, it is necessary to detect the data frame type of the data frame. The data frame types include intra-frame coded frames (I frames) and forward predictive coded frames (P frames).
Since the intra-frame encoded frames can be independently decoded and used for recovering the failure in the media data, in order to ensure the quality of the media data, when the display device 200 releases the read data in the buffer queue, it is also necessary to determine the release position of the media data according to the position of the intra-frame encoded frame of the target data. Therefore, when the buffer queue adds data, the display device 200 detects the data frame type of the target data, and searches the release position of the memory reclamation according to the data frame type of the target data, so as to determine whether the current buffer queue has recyclable data.
S300: and if the data frame type is an intra-frame coding frame, releasing the media data earlier than the target time point in the buffer queue.
After detecting the data frame type of the target data, the display device 200 determines the release position of the media data according to the data frame type of the target data. If the data frame type is an intra-frame coded frame, it indicates that the read data in the buffer queue can be completely released, and the media data in the buffer queue earlier than the target time point is released.
Thus, when the data frame type of the target data is an intra-frame coded frame, as shown in fig. 11, in some embodiments, the display device 200 queries the target position of the target time point in the buffer queue when releasing the media data earlier than the target time point in the buffer queue. The media data of the buffer queue are arranged according to their data time points, and the data time point at the rear position of the buffer queue is earlier than the data time point at the front position of the buffer queue. Since the data frame corresponding to the target time point is the media data currently being read by the display device 200, the media data located at the rear of the data frame corresponding to the target time point in the buffer queue is the read data. Accordingly, when the data frame type is an intra-frame coded frame, the display device 200 releases the media data located at the rear of the target position in the buffer queue.
For example: as shown in fig. 12, the data frames stored in the unfilled memory blocks in fig. 12 are P frames, and the data frames stored in the filled memory blocks are I frames. As shown in fig. 12, when the display device 200 adds media data to the buffer queue according to the playing progress of the media asset playing application, it detects that the target data currently read by the Render process is the 32.60 s data frame. Because the 32.60 s data frame is an I frame, the data frames earlier than 32.60 s can be completely released, that is, the media data at the rear of the 32.60 s data frame is completely released, so as to reduce the memory occupation of the buffer queue.
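Step S300 can be sketched as follows (an illustrative model in which each cached frame carries its time point and an intra-frame flag; the names `Frame` and `release_if_intra` are hypothetical, not from the embodiment):

```python
from dataclasses import dataclass


@dataclass
class Frame:
    pts: float       # data time point in seconds
    is_intra: bool   # True for an intra-frame coded (I) frame


def release_if_intra(queue, target):
    """If the frame being read is an I frame, every cached frame
    earlier than the target time point can be freed outright."""
    if not target.is_intra:
        return queue
    return [f for f in queue if f.pts >= target.pts]


# The Render process is reading the 32.60 s I frame (as in fig. 12):
queue = [Frame(32.52, False), Frame(32.56, False), Frame(32.60, True),
         Frame(32.64, False), Frame(32.68, False)]
queue = release_if_intra(queue, Frame(32.60, True))
print([f.pts for f in queue])  # [32.6, 32.64, 32.68]
```

Everything at the rear of the target position (the two frames earlier than 32.60 s) is dropped in one pass.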
S400: if the data frame type is the forward predictive coding frame, inquiring the intra-frame coding frame earlier than the target time point in the buffer queue, and releasing the media data earlier than the corresponding time point of the intra-frame coding frame in the buffer queue.
After the display device 200 detects the data frame type of the target data, if the data frame type is the forward predictive coding frame, it indicates that the read data in the buffer queue cannot be completely released. If the media data earlier than the corresponding time point of the forward predictive encoded frame is released entirely, an error may occur in the decoding performed by the display device 200. Therefore, the display device 200 queries the buffer queue for the intra-frame encoded frame earlier than the target time point, and releases the media data earlier than the corresponding time point of the intra-frame encoded frame in the buffer queue.
Thus, when the data frame type of the target data is a forward predictive coded frame, as shown in fig. 13, in some embodiments, the display device 200 queries the target position of the target time point in the buffer queue when it queries the intra-frame coded frame earlier than the target time point. The media data in the buffer queue are arranged according to their data time points, and the data time point at the rear position of the buffer queue is earlier than the data time point at the front position of the buffer queue. Then the read data located at the rear of the target position in the buffer queue is queried, and the coded frame type of the read data is detected. The intra-frame coded frame closest to the target position in the read data is marked as the recovery frame, that is, the coded frame types at the rear of the target data are detected in the order of the buffer queue, and the first intra-frame coded frame detected is marked as the recovery frame. Wherein the recovery frame is used to determine the release position of the read data.
After determining the recovery frame, the display device 200 may then release the media asset data at the corresponding position of the buffer queue based on the recovery frame. Thus, in some embodiments, when the data frame type of the target data is a forward predictive coded frame and media data in the buffer queue earlier than the time point corresponding to the intra-frame coded frame is released, the display device 200 acquires the recovery frame position of the recovery frame in the buffer queue, and releases the read data located at the rear of the recovery frame position in the buffer queue.
For example: as shown in fig. 14, the data frames stored in the unfilled memory blocks in fig. 14 are P frames, and the data frames stored in the filled memory blocks are I frames. As shown in fig. 14, when the display device 200 adds media data to the buffer queue according to the playing progress of the media asset playing application, it detects that the target data currently read by the Render process is the 32.68 s data frame. Since the 32.68 s data frame is a P frame, the data frames earlier than the 32.68 s data frame cannot all be released, and the media data at the rear of the 32.68 s data frame in the buffer queue is queried to determine the release position. As can be seen from fig. 14, the 32.60 s data frame is the I frame nearest to the 32.68 s data frame, so the display device 200 marks the 32.60 s data frame as the recovery frame, and releases the media data at the rear of the 32.60 s data frame, thereby reducing the memory occupation of the buffer queue.
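Step S400 can be sketched as follows (same illustrative frame model as before; `find_reclaim_frame` locates the recovery frame, i.e., the nearest I frame at or before the target time point, and all names are hypothetical):

```python
from dataclasses import dataclass


@dataclass
class Frame:
    pts: float       # data time point in seconds
    is_intra: bool   # True for an intra-frame coded (I) frame


def find_reclaim_frame(queue, target_pts):
    """For a P-frame target, the release position is given by the
    recovery frame: the nearest I frame at or before the target."""
    intra = [f for f in queue if f.is_intra and f.pts <= target_pts]
    return max(intra, key=lambda f: f.pts) if intra else None


def release_before(queue, cutoff_pts):
    """Free every cached frame strictly earlier than the cutoff."""
    return [f for f in queue if f.pts >= cutoff_pts]


# The Render process is reading the 32.68 s P frame (as in fig. 14):
queue = [Frame(32.52, False), Frame(32.60, True), Frame(32.64, False),
         Frame(32.68, False), Frame(32.72, False)]
rec = find_reclaim_frame(queue, 32.68)
print(rec.pts)  # 32.6
print([f.pts for f in release_before(queue, rec.pts)])
# [32.6, 32.64, 32.68, 32.72]
```

Releasing only up to the recovery frame preserves the I frame that the target P frame's decode chain depends on.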
As can be seen from the above embodiments, in the playback dynamic memory reclamation method provided in some embodiments of the present application, when data is added to the buffer queue of the display device 200, the release position of the media data is determined by detecting the data frame type of the media data in the buffer queue, so as to release the media data that has been read in the buffer queue, and reduce the memory occupation of the buffer queue. As shown in fig. 15, after the read data of the buffer queue is released by the reclamation method provided in the embodiment of the present application, only the unread data portion of the display device 200 is reserved for the media data buffered in the buffer queue, and compared with the memory reclamation method shown in fig. 7, the reclamation method provided in the embodiment of the present application occupies less memory, and has better playing effect.
In addition, the display device 200 needs to return the buffered buffering time range of the media asset data to the media asset playing application while playing the media asset data. Therefore, the time range of the media data buffered in the buffer queue also affects the playing process of the media data. As shown in fig. 16, when the display device 200 does not release the media data in the buffer queue, the returned buffer time range is the time range in fig. 16, and the time range includes the entire time ranges of the played data, the read data, and the unread data.
Obviously, after the display device 200 releases the media data in the buffer queue, the returned buffer time range is an invalid value, which may cause the application program of the display device 200 to report errors. Therefore, in order to improve the problem of the application program error, in some embodiments, the display device 200 further monitors the playing progress and the buffering time point of the media data. The buffer time point is a time point corresponding to media data with the latest buffer time in the buffer queue. And obtaining the playing time point of the media data according to the playing progress, and outputting the buffering time range according to the playing time point and the buffering time point.
That is, after releasing the media data in the buffer queue, the display device 200 determines the buffer time range according to the time point corresponding to the currently played data frame and the time point corresponding to the media data with the latest buffer time in the buffer queue, and adjusts the returned buffer time range in real time. Thus, the buffer time range output by the display device 200 is always kept as a valid value, so as to improve the problem of error reporting of the application program.
In some embodiments, when the display apparatus 200 outputs the buffer time range according to the play time point and the buffer time point, the play time point is set to the left value of the buffer time range, and the buffer time point is set to the right value of the buffer time range, and the buffer time range is returned.
For example: after releasing the media asset data, the display device 200 acquires the position of the current playing time currentTime of the media asset data. The time point value at the currentTime position is taken as the left value of the buffering time range, and the time point value corresponding to the last data frame in the buffer queue is taken as the right value of the buffering time range. As shown in fig. 17, the buffering time range returned by the display device 200 to the application program is the time range shown in fig. 17; the left value of the buffering time range is currentTime, and the right value is the latest time point in the unread data.
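The adjusted buffering time range can be sketched as follows (an illustrative helper; returning the range as a `(left, right)` tuple is an assumption about how the range is reported to the application):

```python
def buffered_range(current_time, queue_pts):
    """Buffering time range reported to the application after
    reclamation: left value = currentTime (playback position),
    right value = latest time point still cached in the queue."""
    if not queue_pts:
        return None
    return (current_time, max(queue_pts))


print(buffered_range(32.68, [32.60, 32.64, 32.68, 32.72, 32.76]))
# (32.68, 32.76)
```

Because both edges track live values, the returned range stays valid even after earlier frames have been released.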
Furthermore, when the buffer queue no longer supplements data, the display device 200 also no longer releases the media data of the buffer queue. Since the application of the display device 200 may also actively perform memory reclamation on the media data in the buffer queue, in some embodiments, the display device 200 further receives a data reclamation instruction for deleting data, and detects a data frame type of the target data in response to the data reclamation instruction. That is, when the media playback application actively deletes the media data in the buffer queue, the display device 200 may actively release the read data in the buffer queue in response to the data reclamation instruction.
Similarly, if the data frame type is an intra-frame coding frame, which indicates that the read data in the buffer queue can be completely released, the media data in the buffer queue earlier than the target time point is released. If the data frame type is a forward predictive coded frame, it is indicated that the read data in the buffer queue may not be fully released. If the media data earlier than the corresponding time point of the forward predictive encoded frame is released entirely, an error may occur in the decoding performed by the display device 200. Accordingly, the display device 200 queries the buffer queue for the intra-coded frame earlier than the target time point, and releases the media data in the buffer queue earlier than the corresponding time point of the intra-coded frame.
Based on the above playing dynamic memory recycling method, some embodiments of the application further provide a display device 200, as shown in fig. 18, including a display 260 and a controller 250. Wherein the display 260 is configured to display a user interface; as shown in fig. 8, the controller 250 is configured to perform the following program steps:
S100: responding to a playing instruction of the media data, reading the media data of the buffer queue, and recording a target time point of reading target data; the buffer queue comprises cached media data, and the target data is media data with latest reading time point;
S200: detecting the data frame type of the target data when the buffer queue adds the data;
S300: if the data frame type is an intra-frame coding frame, releasing media data which is earlier than the target time point in the buffer queue;
S400: and if the data frame type is a forward predictive coding frame, inquiring an intra-frame coding frame earlier than the target time point in the buffer queue, and releasing media data earlier than the corresponding time point of the intra-frame coding frame in the buffer queue.
According to the technical scheme, the display device and the playing dynamic memory recycling method provided by some embodiments of the present application can respond to the playing instruction of the media data, read the media data of the buffer queue, and record the target time point of reading the target data. The buffer queue comprises cached media data, and the target data is media data with the latest reading time point. When the buffer queue adds data, the data frame type of the target data is detected. If the data frame type is an intra-frame coding frame, releasing media data earlier than a target time point in a buffer queue; if the data frame type is the forward predictive coding frame, inquiring the intra-frame coding frame earlier than the target time point in the buffer queue, and releasing the media data earlier than the corresponding time point of the intra-frame coding frame in the buffer queue. According to the method, the release position is determined through the currently read media data, the media data of the buffer queue is recovered according to the release position, the memory occupation in the playing process can be reduced, and the efficiency of playing the media data by the display equipment is improved.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present application, and are not limited thereto; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the corresponding technical solutions from the scope of the technical solutions of the embodiments of the present application.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (10)

1. A display device, characterized by comprising:
a display configured to display a user interface;
A controller configured to:
responding to a playing instruction of the media data, reading the media data of the buffer queue, and recording a target time point of reading target data; the buffer queue comprises cached media data, and the target data is media data with latest reading time point;
detecting the data frame type of the target data when the buffer queue adds the data;
if the data frame type is an intra-frame coding frame, releasing media data which is earlier than the target time point in the buffer queue;
and if the data frame type is a forward predictive coding frame, inquiring an intra-frame coding frame earlier than the target time point in the buffer queue, and releasing media data earlier than the corresponding time point of the intra-frame coding frame in the buffer queue.
2. The display device of claim 1, wherein the controller executing the read buffer queue of media data is further configured to:
downloading the media asset data according to a network transmission protocol of the media asset data;
caching the media asset data into the buffer queue;
writing the media data of the buffer queue into a sharing module based on a rendering process of a browser, wherein the sharing module is a storage module with preset memory capacity;
And reading the media data of the sharing module through a main process of the browser.
3. The display device of claim 2, wherein the controller is further configured to:
transferring the media data read by the main process to a middleware layer to write the media data into a playing platform of a browser;
monitoring the data volume of media data in the playing platform;
and when the data quantity is greater than or equal to the data quantity threshold, sending the media asset data to a decoding queue, and decoding the media asset data of the decoding queue.
4. The display device of claim 1, wherein the controller is configured to:
monitoring the playing progress of the media asset data and a buffering time point, wherein the buffering time point is a time point corresponding to media asset data with the latest buffering time in the buffering queue;
acquiring a playing time point of the media asset data according to the playing progress;
and outputting a buffer time range according to the play time point and the buffer time point.
5. The display device of claim 4, wherein, when executing the outputting of the buffering time range according to the playing time point and the buffering time point, the controller is configured to:
setting the playing time point as the left value of the buffering time range, and setting the buffering time point as the right value of the buffering time range;
and returning the buffering time range.
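Claims 4–5 amount to reporting the interval from the current play position (left value) to the most recently buffered time point (right value). A minimal sketch; the function name is an assumption:

```python
def buffering_time_range(play_time_point: float, buffering_time_point: float):
    """Left value = current play position; right value = time point of
    the most recently buffered media data in the buffer queue."""
    return (play_time_point, buffering_time_point)

# E.g. playback is at 12.5 s and data is buffered up to 30.0 s.
print(buffering_time_range(12.5, 30.0))  # (12.5, 30.0)
```

The width of this range is what a player UI typically renders as the buffered portion of the progress bar ahead of the playhead.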
6. The display device of claim 1, wherein, when executing the releasing of media data in the buffer queue earlier than the target time point, the controller is configured to:
querying the target position of the target time point in the buffer queue, wherein the media data in the buffer queue are arranged according to their data time points, and data time points toward the rear of the buffer queue are earlier than data time points toward the front;
and releasing the media data located behind the target position in the buffer queue.
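Claim 6 can be sketched with a list ordered newest-first (front = latest time point, rear = earliest), releasing everything behind the target position. The function name and the `(time_point, frame)` pair representation are illustrative assumptions:

```python
def release_after_target(buffer_queue, target_time_point):
    """buffer_queue: list of (time_point, frame) pairs ordered newest-first.
    Releases and returns every entry strictly earlier than the target."""
    # The target position is the first index whose time point is
    # earlier than (i.e., behind) the target time point.
    target_pos = len(buffer_queue)
    for i, (tp, _frame) in enumerate(buffer_queue):
        if tp < target_time_point:
            target_pos = i
            break
    released = buffer_queue[target_pos:]
    del buffer_queue[target_pos:]   # free the already-played rear portion
    return released

q = [(40, "f40"), (30, "f30"), (20, "f20"), (10, "f10")]
released = release_after_target(q, 30)
print(q)         # [(40, 'f40'), (30, 'f30')]
print(released)  # [(20, 'f20'), (10, 'f10')]
```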
7. The display device of claim 1, wherein, when executing the querying of the intra-frame coding frame in the buffer queue earlier than the target time point, the controller is configured to:
querying the target position of the target time point in the buffer queue, wherein the media data in the buffer queue are arranged according to their data time points, and data time points toward the rear of the buffer queue are earlier than data time points toward the front;
querying the read data located behind the target position in the buffer queue;
detecting the coded frame type of the read data;
and marking the intra-frame coding frame closest to the target position in the read data as a recovery frame, wherein the recovery frame is used to determine the release position of the read data.
8. The display device of claim 7, wherein, when executing the releasing of media data in the buffer queue earlier than the time point corresponding to the intra-frame coding frame, the controller is configured to:
acquiring the position of the recovery frame in the buffer queue;
and releasing the read data located behind the recovery frame position in the buffer queue.
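Claims 7–8 combined: scan the already-read data behind the target position, mark the intra-frame coding frame (I-frame) nearest the target position as the recovery frame, then release everything behind it. A sketch under the same newest-first ordering assumption; keeping the recovery frame itself (since forward predictive frames may still reference it) is my reading of claim 8:

```python
def find_and_release_by_recovery_frame(buffer_queue, target_pos):
    """buffer_queue: newest-first list of (time_point, frame_type) pairs,
    frame_type 'I' (intra-frame coding) or 'P' (forward predictive).
    Marks the recovery frame, releases the read data behind it, and
    returns the recovery-frame position (None if no I-frame is found)."""
    recovery_pos = None
    # Walk the read data behind the target position toward the rear;
    # the first I-frame encountered is the one closest to the target.
    for i in range(target_pos, len(buffer_queue)):
        if buffer_queue[i][1] == "I":
            recovery_pos = i
            break
    if recovery_pos is not None:
        # Keep the recovery frame itself; release everything behind it.
        del buffer_queue[recovery_pos + 1:]
    return recovery_pos

q = [(50, "P"), (40, "P"), (30, "I"), (20, "P"), (10, "I")]
pos = find_and_release_by_recovery_frame(q, target_pos=1)
print(pos)  # 2
print(q)    # [(50, 'P'), (40, 'P'), (30, 'I')]
```

Releasing only back to the nearest I-frame keeps the reference frame that later P-frames in the group of pictures depend on.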
9. The display device of claim 1, wherein the controller is further configured to:
receiving a data reclamation instruction for deleting data;
detecting the data frame type of the target data in response to the data reclamation instruction;
if the data frame type is an intra-frame coding frame, releasing media data in the buffer queue earlier than the target time point;
and if the data frame type is a forward predictive coding frame, querying the intra-frame coding frame in the buffer queue earlier than the target time point, and releasing media data in the buffer queue earlier than the time point corresponding to the intra-frame coding frame.
10. A playing dynamic memory reclamation method, comprising:
responding to a playing instruction of media data, reading the media data of a buffer queue, and recording a target time point of reading target data, wherein the buffer queue comprises cached media data, and the target data is the media data with the latest reading time point;
detecting the data frame type of the target data when data is added to the buffer queue;
if the data frame type is an intra-frame coding frame, releasing media data in the buffer queue earlier than the target time point;
and if the data frame type is a forward predictive coding frame, querying the intra-frame coding frame in the buffer queue earlier than the target time point, and releasing media data in the buffer queue earlier than the time point corresponding to the intra-frame coding frame.
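Putting the independent method claim together: each time data is added to the buffer queue, the type of the most recently read frame decides the release position. A sketch under the same newest-first `(time_point, frame_type)` representation as above (an assumption, not the patent's data layout):

```python
def reclaim_on_add(buffer_queue, target_time_point, target_frame_type):
    """buffer_queue: newest-first list of (time_point, frame_type) pairs.
    target_*: time point and frame type of the most recently read data."""
    if target_frame_type == "I":
        # Intra-frame coding frame: it decodes without references, so
        # everything earlier than the target time point can be released.
        buffer_queue[:] = [e for e in buffer_queue if e[0] >= target_time_point]
    else:
        # Forward predictive coding frame: keep back to the nearest
        # I-frame earlier than the target, which it may reference.
        cut = None
        for i, (tp, ft) in enumerate(buffer_queue):
            if tp < target_time_point and ft == "I":
                cut = i
                break
        if cut is not None:
            del buffer_queue[cut + 1:]

q = [(50, "P"), (40, "I"), (30, "P"), (20, "I"), (10, "P")]
reclaim_on_add(q, target_time_point=30, target_frame_type="P")
print(q)  # [(50, 'P'), (40, 'I'), (30, 'P'), (20, 'I')]
```

The asymmetry is the point of the method: an I-frame lets the queue be trimmed right up to the play position, while a P-frame forces the trim point back to its reference I-frame, trading a little retained memory for decodability.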
CN202310532841.6A 2023-05-11 2023-05-11 Display equipment and playing dynamic memory recycling method Pending CN117812362A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310532841.6A CN117812362A (en) 2023-05-11 2023-05-11 Display equipment and playing dynamic memory recycling method

Publications (1)

Publication Number Publication Date
CN117812362A true CN117812362A (en) 2024-04-02

Family

ID=90428634

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310532841.6A Pending CN117812362A (en) 2023-05-11 2023-05-11 Display equipment and playing dynamic memory recycling method

Country Status (1)

Country Link
CN (1) CN117812362A (en)

Similar Documents

Publication Publication Date Title
EP1769318B1 (en) Client-Server Architectures and Methods for a Zoomable User Interface
US9100716B2 (en) Augmenting client-server architectures and methods with personal computers to support media applications
WO2010008230A2 (en) Apparatus and method for providing user interface service in a multimedia system
KR20150096440A (en) Distributed cross-platform user interface and application projection
WO2022257699A1 (en) Image picture display method and apparatus, device, storage medium and program product
WO2019164753A1 (en) Efficient streaming video for static video content
US11528523B2 (en) Method and system to share a snapshot extracted from a video transmission
WO2021233123A1 (en) Video processing method and apparatus, and computer device and storage medium
US9055272B2 (en) Moving image reproduction apparatus, information processing apparatus, and moving image reproduction method
US9226003B2 (en) Method for transmitting video signals from an application on a server over an IP network to a client device
CN117812362A (en) Display equipment and playing dynamic memory recycling method
KR20140117889A (en) Client apparatus, server apparatus, multimedia redirection system and the method thereof
CN113709574B (en) Video screenshot method and device, electronic equipment and computer readable storage medium
CN115278323A (en) Display device, intelligent device and data processing method
CN112836158A (en) Resource loading method on display equipment and display equipment
CN113453069A (en) Display device and thumbnail generation method
CN115174991B (en) Display equipment and video playing method
CN115589450B (en) Video recording method and device
JP2006339980A (en) Image reproducer
CN112601107B (en) Method for synchronizing historical records in abnormal scene and display device
CN117834983A (en) Playing buffer determining method and display device
CN117119234A (en) Display equipment and media asset playing method
CN116347091A (en) Content display method and display equipment
CN117812341A (en) Display equipment and media asset playing method
CN116939295A (en) Display equipment and method for dynamically adjusting utilization rate of controller

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination