CN111966646B - File caching method and display device - Google Patents

File caching method and display device

Info

Publication number
CN111966646B
CN111966646B (granted from application CN202010825754.6A)
Authority
CN
China
Prior art keywords: picture, buffer, area, cached, buffered
Prior art date
Legal status: Active (assumption, not a legal conclusion)
Application number
CN202010825754.6A
Other languages
Chinese (zh)
Other versions
CN111966646A
Inventor
张小涛
苏童
武兵
Current Assignee
Vidaa Netherlands International Holdings BV
Original Assignee
Vidaa Netherlands International Holdings BV
Priority date
Filing date
Publication date
Application filed by Vidaa Netherlands International Holdings BV
Priority: CN202010825754.6A
Publication of application: CN111966646A
Application granted; publication of grant: CN111966646B

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 — Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/10 — File systems; File servers
    • G06F16/17 — Details of further file system functions
    • G06F16/172 — Caching, prefetching or hoarding of files

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The application provides a file caching method and a display device that reclaim occupied cache space hierarchically, according to the caching priority of cached pictures. The display device includes: a display; and a controller coupled to the display, the controller configured to: when a picture in a user interface displayed through the browser needs to be cached, determine the caching priority value corresponding to the picture to be cached; if the free space in the cache region corresponding to that caching priority value is greater than or equal to the storage space the picture requires, cache the picture into the free space of that region; and if the free space in that cache region is smaller than the required storage space, reclaim occupied space from the region according to the caching priority levels of the pictures already cached there, so that the picture to be cached can be cached in the reclaimed space.

Description

File caching method and display device
Technical Field
The present application relates to caching technology, and in particular to a file caching method and a display device.
Background
Currently, when the pictures cached in a cache region fill the region, the occupied cache space must be reclaimed. The common reclamation strategy reclaims the cache space occupied by each picture according to how long that picture has been cached.
However, reclaiming cache space based only on cache duration may evict pictures with a lower update frequency or a larger file size ahead of pictures with a higher update frequency or a smaller file size, even though the former are typically more valuable to keep cached.
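The duration-based strategy described above can be sketched in a few lines of Python. This is an illustrative model, not the patent's implementation; the class and file names are hypothetical. It shows the stated drawback: when the cache is full, the longest-cached picture is evicted first, regardless of its size or update frequency.

```python
from collections import OrderedDict

class DurationOnlyCache:
    """Sketch of duration-based reclamation: evict the picture that
    has been cached the longest, ignoring size and update frequency.
    Names are illustrative, not taken from the patent."""

    def __init__(self, capacity_bytes):
        self.capacity = capacity_bytes
        self.used = 0
        self.entries = OrderedDict()  # url -> size, oldest entry first

    def put(self, url, size):
        # Reclaim the oldest entries until the new picture fits.
        while self.used + size > self.capacity and self.entries:
            _, freed = self.entries.popitem(last=False)  # pop oldest
            self.used -= freed
        self.entries[url] = size
        self.used += size
```

With a 100-byte cache, caching a large, rarely updated `background.jpg` (60 bytes) and then two small icons (30 bytes each) evicts `background.jpg` first, even though the small icons are cheaper to refetch.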
Disclosure of Invention
The application provides a file caching method and a display device that reclaim occupied cache space hierarchically, according to the caching priority of cached pictures.
The technical solutions provided by the application are as follows:
According to a first aspect of the present application, there is provided a display device comprising:
A display;
A controller coupled with the display, the controller configured to perform:
when a picture in a user interface displayed through the browser needs to be cached, determining the caching priority value corresponding to the picture to be cached;
if the free space in the cache region corresponding to the caching priority value is greater than or equal to the storage space required by the picture to be cached, caching the picture into that free space;
if the free space in the cache region corresponding to the caching priority value is smaller than the storage space required by the picture to be cached, reclaiming occupied space from the cache region according to the caching priority levels of the pictures already cached there, so that the picture to be cached can be cached in the reclaimed space; wherein a caching priority level is determined from the caching priority value corresponding to each cached picture.
According to a second aspect of the present application, there is provided a file caching method, applied to a display device, including:
when a picture in a user interface displayed through the browser needs to be cached, determining the caching priority value corresponding to the picture to be cached;
if the free space in the cache region corresponding to the caching priority value is greater than or equal to the storage space required by the picture to be cached, caching the picture into that free space;
if the free space in the cache region corresponding to the caching priority value is smaller than the storage space required by the picture to be cached, reclaiming occupied space from the cache region according to the caching priority levels of the pictures already cached there, so that the picture to be cached can be cached in the reclaimed space; wherein a caching priority level is determined from the caching priority value corresponding to each cached picture.
With the above solutions, pictures that share a caching priority value are cached together in the cache region corresponding to that value. When occupied space in a cache region must be reclaimed, reclamation no longer depends on how long each picture has been cached; instead, cached pictures are reclaimed hierarchically according to the caching priority levels they belong to, which prevents pictures that should remain cached from being evicted before pictures that can be reclaimed promptly.
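The scheme summarized above can be sketched as follows. This is a hedged illustration under stated assumptions, not the patent's implementation: it assumes one fixed-size cache region per caching priority value, and that within a region reclamation proceeds level by level, evicting the lowest priority level first (oldest-first within a level) until the new picture fits. All names are hypothetical.

```python
from collections import defaultdict

class TieredPictureCache:
    """Sketch of hierarchical, priority-based reclamation.
    Assumptions (not from the patent): one region per priority value;
    eviction order is lowest priority level first, oldest first on ties."""

    def __init__(self, region_capacities):
        # region_capacities: {priority_value: capacity_in_bytes}
        self.capacity = dict(region_capacities)
        self.used = defaultdict(int)
        # region -> list of (priority_level, url, size), in insertion order
        self.entries = defaultdict(list)

    def put(self, url, size, priority_value, priority_level):
        region = self.entries[priority_value]
        cap = self.capacity[priority_value]
        if size > cap:
            return False  # the picture can never fit in this region
        # Hierarchical reclamation: evict lowest levels first.
        while self.used[priority_value] + size > cap:
            # min() returns the first minimum, so ties evict oldest first.
            victim = min(region, key=lambda e: e[0])
            region.remove(victim)
            self.used[priority_value] -= victim[2]
        region.append((priority_level, url, size))
        self.used[priority_value] += size
        return True
```

For example, in a 100-byte region, caching three 40-byte pictures at levels 2, 1, and 3 evicts the level-1 picture when the third arrives, even though it was not the oldest, matching the hierarchical behavior described above.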
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
A schematic diagram of an operational scenario between a display device and a control apparatus according to some embodiments is schematically shown in fig. 1;
a hardware configuration block diagram of a display device 200 according to some embodiments is exemplarily shown in fig. 2;
A hardware configuration block diagram of the control device 100 according to some embodiments is exemplarily shown in fig. 3;
a schematic diagram of the software configuration in a display device 200 according to some embodiments is exemplarily shown in fig. 4;
An icon control interface display schematic of an application in a display device 200 according to some embodiments is illustrated in fig. 5;
a flow chart of a file caching method according to some embodiments is exemplarily shown in fig. 6;
a flowchart of an implementation of step 104 according to some embodiments is shown schematically in fig. 7.
Detailed Description
For the purposes of making the objects, embodiments and advantages of the present application more apparent, exemplary embodiments of the application will be described more fully hereinafter with reference to the accompanying drawings in which those exemplary embodiments are shown. It should be understood that the described exemplary embodiments are merely some, but not all, of the examples of the application.
Based on the exemplary embodiments described herein, all other embodiments that may be obtained by one of ordinary skill in the art without inventive effort fall within the scope of the appended claims. Furthermore, while the present disclosure has been described in terms of one or more exemplary embodiments, it should be understood that each aspect of the disclosure can be practiced separately from the others.
It should be noted that the brief description of the terminology in the present application is for the purpose of facilitating understanding of the embodiments described below only and is not intended to limit the embodiments of the present application. Unless otherwise indicated, these terms should be construed in their ordinary and customary meaning.
The terms "first", "second", "third" and the like in the description, in the claims, and in the figures above are used to distinguish between similar objects or entities, and not necessarily to describe a particular sequential or chronological order, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances, such that the embodiments of the application are, for example, capable of operation in sequences other than those illustrated or described herein.
Furthermore, the terms "comprise" and "have," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to those elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" as used in this disclosure refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the function associated with that element.
The term "remote control" as used herein refers to a component of an electronic device (such as the display device disclosed herein) that can control that device wirelessly, typically over a relatively short distance. The remote control typically connects to the electronic device using infrared and/or radio frequency (RF) signals and/or Bluetooth, and may also include functional modules such as WiFi, wireless USB, Bluetooth, and motion sensors. For example, a hand-held touch remote control replaces most of the physical built-in hard keys of a general remote control device with a touch-screen user interface.
The term "gesture" as used herein refers to a user behavior by which a user expresses an intended idea, action, purpose, and/or result through a change in hand shape or movement of a hand, etc.
A schematic diagram of an operation scenario between a display device and a control apparatus according to an embodiment is exemplarily shown in fig. 1. As shown in fig. 1, a user may operate the display apparatus 200 through the mobile terminal 300 and the control device 100.
In some embodiments, the control apparatus 100 may be a remote controller. Communication between the remote controller and the display device includes infrared protocol communication, Bluetooth protocol communication, and other short-range communication modes, and the display device 200 is controlled wirelessly or by other wired means. The user may control the display device 200 by inputting user instructions through keys on the remote control, voice input, control panel input, etc. For example, the user can input corresponding control instructions through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, and power key on the remote controller to control the functions of the display device 200.
In some embodiments, mobile terminals, tablet computers, notebook computers, and other smart devices may also be used to control the display device 200. For example, the display device 200 is controlled using an application running on a smart device. The application program, by configuration, can provide various controls to the user in an intuitive User Interface (UI) on a screen associated with the smart device.
In some embodiments, the mobile terminal 300 and the display device 200 may each install a software application, implementing connection and communication through a network communication protocol to achieve one-to-one control operation and data communication. For example, a control command protocol may be established between the mobile terminal 300 and the display device 200, the remote-control keyboard may be synchronized to the mobile terminal 300, and the display device 200 may be controlled by operating a user interface on the mobile terminal 300. The audio/video content displayed on the mobile terminal 300 can also be transmitted to the display device 200 to realize a synchronized display function.
As also shown in fig. 1, the display device 200 is in data communication with the server 400 via various communication means. The display device 200 may communicate over a local area network (LAN), a wireless local area network (WLAN), and other networks. The server 400 may provide various content and interactions to the display device 200. For example, the display device 200 receives software program updates, or accesses a remotely stored digital media library, by sending and receiving information and through electronic program guide (EPG) interactions. The server 400 may be one cluster or multiple clusters, and may include one or more types of servers. Other web service content, such as video on demand and advertising services, is also provided through the server 400.
The display device 200 may be a liquid crystal display, an OLED display, a projection display device. The particular display device type, size, resolution, etc. are not limited, and those skilled in the art will appreciate that the display device 200 may be modified in performance and configuration as desired.
The display apparatus 200 may additionally provide a smart network television function of a computer support function, including, but not limited to, a network television, a smart television, an Internet Protocol Television (IPTV), etc., in addition to the broadcast receiving television function.
A hardware configuration block diagram of the display device 200 according to an exemplary embodiment is illustrated in fig. 2.
In some embodiments, at least one of the controller 250, the modem 210, the communicator 220, the detector 230, the input/output interface 255, the display 275, the audio output interface 285, the memory 260, the power supply 290, the user interface 265, and the external device interface 240 is included in the display apparatus 200.
In some embodiments, the display 275 is configured to receive image signals from the first processor output, and to display video content and images and components of the menu manipulation interface.
In some embodiments, display 275 includes a display screen assembly for presenting pictures, and a drive assembly for driving the display of images.
In some embodiments, the displayed video content may come from broadcast television content, from various broadcast signals received via wired or wireless communication protocols, or from various image content received from a network server via a network communication protocol.
In some embodiments, the display 275 is used to present a user-manipulated UI interface generated in the display device 200 and used to control the display device 200.
In some embodiments, depending on the type of display 275, a drive assembly for driving the display is also included.
In some embodiments, display 275 is a projection display and may further include a projection device and a projection screen.
In some embodiments, communicator 220 is a component for communicating with external devices or external servers according to various communication protocol types. For example: the communicator may include at least one of a Wifi chip, a bluetooth communication protocol chip, a wired ethernet communication protocol chip, or other network communication protocol chip or a near field communication protocol chip, and an infrared receiver.
In some embodiments, the display device 200 may establish control signal and data signal transmission and reception between the communicator 220 and the external control device 100 or the content providing device.
In some embodiments, the user interface 265 may be used to receive infrared control signals from the control device 100 (e.g., an infrared remote control, etc.).
In some embodiments, the detector 230 is a component the display device 200 uses to collect signals from the external environment or to interact with it.
In some embodiments, the detector 230 includes an optical receiver, a sensor for capturing the intensity of ambient light, so that display parameters can be adapted to changes in ambient light, etc.
In some embodiments, the detector 230 may further include an image collector, such as a camera, a video camera, etc., which may be used to collect external environmental scenes, collect attributes of a user or interact with a user, adaptively change display parameters, and recognize a user gesture to realize an interaction function with the user.
In some embodiments, the detector 230 may also include a temperature sensor or the like, such as by sensing ambient temperature.
In some embodiments, the display device 200 may adaptively adjust the display color temperature of the image. For example, when the ambient temperature is relatively high, the display device 200 can be adjusted to display images with a cooler color temperature; when the temperature is relatively low, it can be adjusted to display images with a warmer color tone.
In some embodiments, the detector 230 may also include a sound collector, such as a microphone, used to receive the user's voice. For example, it may collect a voice signal carrying a control instruction with which the user controls the display device 200, or collect ambient sound to identify the type of environmental scene so that the display device 200 can adapt to ambient noise.
In some embodiments, as shown in fig. 2, the input/output interface 255 is configured to enable data transfer between the controller 250 and external other devices or other controllers 250. Such as receiving video signal data and audio signal data of an external device, command instruction data, or the like.
In some embodiments, the external device interface 240 may include, but is not limited to, any one or more of the following: a high definition multimedia interface (HDMI), an analog or data high-definition component input interface, a composite video input interface, a USB input interface, an RGB port, and the like. Multiple such interfaces may form a composite input/output interface.
In some embodiments, as shown in fig. 2, the modem 210 is configured to receive broadcast television signals in a wired or wireless manner, perform modulation and demodulation processing such as amplification, mixing, and resonance, and demodulate, from the received broadcast television signals, the audio/video signal carried in the television channel frequency selected by the user, together with the EPG data signal.
In some embodiments, the frequency point demodulated by the modem 210 is controlled by the controller 250: the controller 250 may send a control signal according to the user's selection, so that the modem responds to the television signal frequency selected by the user and demodulates the television signal carried on that frequency.
In some embodiments, the broadcast television signal may be classified into a terrestrial broadcast signal, a cable broadcast signal, a satellite broadcast signal, an internet broadcast signal, or the like according to a broadcasting system of the television signal. Or may be differentiated into digital modulation signals, analog modulation signals, etc., depending on the type of modulation. Or it may be classified into digital signals, analog signals, etc. according to the kind of signals.
In some embodiments, the controller 250 and the modem 210 may be located in separate devices, i.e., the modem 210 may also be located in an external device to the main device in which the controller 250 is located, such as an external set-top box or the like. In this way, the set-top box outputs the television audio and video signals modulated and demodulated by the received broadcast television signals to the main body equipment, and the main body equipment receives the audio and video signals through the first input/output interface.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored on the memory. The controller 250 may control the overall operation of the display apparatus 200. For example: in response to receiving a user command to select to display a UI object on the display 275, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the object may be any one of selectable objects, such as a hyperlink or an icon. Operations related to the selected object, such as: operations to connect to a hyperlink page, document, image, etc., or operations to execute a program corresponding to an icon are displayed. The user command for selecting the UI object may be an input command through various input means (e.g., mouse, keyboard, touch pad, etc.) connected to the display device 200 or a voice command corresponding to a voice uttered by the user.
As shown in fig. 2, the controller 250 includes at least one of a random access memory (RAM) 251, a read-only memory (ROM) 252, a video processor 270, an audio processor 280, other processors 253 (e.g., a graphics processing unit (GPU)), a central processing unit (CPU) 254, a communication interface, and a communication bus 256 that connects these components.
In some embodiments, RAM 251 is used to store temporary data for the operating system or other running programs.
In some embodiments, ROM 252 is used to store instructions for various system boots.
In some embodiments, ROM 252 is used to store a Basic Input Output System (BIOS), which comprises driver programs and a boot operating system used to complete the power-on self-test of the system, the initialization of each functional module in the system, and the system's basic input/output.
In some embodiments, upon receipt of a power-on signal, the display device 200 starts up: the CPU runs the system boot instructions in ROM 252 and copies the temporary data of the operating system stored in memory into RAM 251 in order to start the operating system. After the operating system starts, the CPU copies the temporary data of the various applications in memory into RAM 251 so that the applications can be started and run.
In some embodiments, CPU processor 254 is used to execute operating system and application program instructions stored in memory. And executing various application programs, data and contents according to various interactive instructions received from the outside, so as to finally display and play various audio and video contents.
In some exemplary embodiments, the CPU processor 254 may comprise a plurality of processors, including one main processor and one or more sub-processors. The main processor performs some operations of the display apparatus 200 in the pre-power-up mode and/or displays pictures in normal mode; the one or more sub-processors perform operations in standby mode and the like.
In some embodiments, the graphics processor 253 is configured to generate various graphical objects, such as icons, operation menus, and graphics for displaying user input instructions. It comprises an arithmetic unit, which performs operations on the various interaction instructions input by the user and displays the various objects according to their display attributes, and a renderer, which renders the objects produced by the arithmetic unit for display on the display.
In some embodiments, the video processor 270 is configured to receive an external video signal and perform video processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image composition according to the standard codec protocol of the input signal, producing a signal that can be displayed or played directly on the display device 200.
In some embodiments, video processor 270 includes a demultiplexing module, a video decoding module, an image compositing module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module is used for demultiplexing the input audio/video data stream, such as the input MPEG-2, and demultiplexes the input audio/video data stream into video signals, audio signals and the like.
And the video decoding module is used for processing the demultiplexed video signals, including decoding, scaling and the like.
The image synthesis module, such as an image synthesizer, superimposes the GUI signal, input by the user or generated by the graphics generator, onto the scaled video image to generate an image signal for display.
The frame rate conversion module converts the frame rate of the input video, for example converting a 60 Hz frame rate into a 120 Hz or 240 Hz frame rate, commonly by means of frame interpolation.
The display format module is used for converting the received frame rate into a video output signal, and changing the video output signal to a signal conforming to the display format, such as outputting an RGB data signal.
In some embodiments, the graphics processor 253 may be integrated with the video processor or configured separately. An integrated configuration can process the graphics signals output to the display, while a separate configuration can perform different functions, such as a GPU + FRC (Frame Rate Conversion) architecture.
In some embodiments, the audio processor 280 is configured to receive an external audio signal, decompress and decode the audio signal according to a standard codec protocol of an input signal, and perform noise reduction, digital-to-analog conversion, and amplification processing, so as to obtain a sound signal that can be played in a speaker.
In some embodiments, video processor 270 may include one or more chips. The audio processor may also comprise one or more chips.
In some embodiments, video processor 270 and audio processor 280 may be separate chips or may be integrated together with the controller in one or more chips.
In some embodiments, the audio output, under the control of the controller 250, receives the sound signal output by the audio processor 280. Besides the speaker 286 carried by the display device 200 itself, sound may be output to a sound-producing external device through an external sound interface, an earphone interface, or the like. The communication interface may also include a near-field communication module, for example a Bluetooth module for outputting sound to a Bluetooth speaker.
The power supply 290, under the control of the controller 250, supplies the display device 200 with power input from an external power source. The power supply 290 may be a built-in power circuit installed inside the display device 200, or an external power supply connected through a power interface that the display device 200 provides.
The user interface 265 is used to receive an input signal from a user and then transmit the received user input signal to the controller 250. The user input signal may be a remote control signal received through an infrared receiver, and various user control signals may be received through a network communication module.
In some embodiments, a user inputs a user command through the control apparatus 100 or the mobile terminal 300; the user input interface passes the input to the controller 250, and the display device 200 then responds to the user input.
In some embodiments, a user may input a user command through a Graphical User Interface (GUI) displayed on the display 275, and the user input interface receives the user input command through the Graphical User Interface (GUI). Or the user may input the user command by inputting a specific sound or gesture, the user input interface recognizes the sound or gesture through the sensor, and receives the user input command.
In some embodiments, a "user interface" is a media interface for interaction and exchange of information between an application or operating system and a user that enables conversion between an internal form of information and a form acceptable to the user. A commonly used presentation form of a user interface is a graphical user interface (Graphic User Interface, GUI), which refers to a graphically displayed user interface that is related to computer operations. It may be an interface element such as an icon, a window, a control, etc. displayed in a display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
The memory 260 includes memory storing various software modules for driving the display device 200. Such as: various software modules stored in the first memory, including: at least one of a base module, a detection module, a communication module, a display control module, a browser module, various service modules, and the like.
The base module is a bottom software module for signal communication between the various hardware in the display device 200 and for sending processing and control signals to the upper modules. The detection module is used for collecting various information from various sensors or user input interfaces and carrying out digital-to-analog conversion and analysis management.
For example, the voice recognition module includes a voice analysis module and a voice instruction database module. The display control module is used for controlling the display to present image content and can be used for playing multimedia image content, UI interfaces, and other information. The communication module is used for control and data communication with external devices. The browser module is used for data communication with browsing servers. The service module is used for providing various services and application programs. Meanwhile, the memory 260 also stores received external data and user data, images of various items in various user interfaces, visual effect maps of focus objects, and the like.
Fig. 3 exemplarily shows a block diagram of a configuration of the control apparatus 100 in accordance with an exemplary embodiment. As shown in fig. 3, the control device 100 includes a controller 110, a communication interface 130, a user input/output interface, a memory, and a power supply.
The control device 100 is configured to control the display device 200; it may receive an input operation instruction from a user and convert the operation instruction into an instruction that the display device 200 can recognize and respond to, serving as an intermediary for interaction between the user and the display device 200. For example, the user operates the channel up/down keys on the control apparatus 100, and the display apparatus 200 responds to the channel up/down operation.
In some embodiments, the control device 100 may be a smart device. Such as: the control apparatus 100 may install various applications for controlling the display apparatus 200 according to user's needs.
In some embodiments, as shown in fig. 1, a mobile terminal 300 or other intelligent electronic device may perform a function similar to that of the control device 100 after installing an application that manipulates the display device 200. For example, the user may implement the functions of the physical keys of the control device 100 through various function keys or virtual buttons of a graphical user interface that can be installed on the mobile terminal 300 or other intelligent electronic device.
The controller 110 includes a processor 112, RAM 113, ROM 114, a communication interface 130, and a communication bus. The controller is used to control the running and operation of the control device 100, the communication cooperation among the internal components, and the external and internal data processing functions.
The communication interface 130 enables communication of control signals and data signals with the display device 200 under the control of the controller 110. Such as: the received user input signal is transmitted to the display device 200. The communication interface 130 may include at least one of a WiFi chip 131, a bluetooth module 132, an NFC module 133, and other near field communication modules.
A user input/output interface 140, wherein the input interface includes at least one of a microphone 141, a touchpad 142, a sensor 143, keys 144, and other input interfaces. Such as: the user can implement a user instruction input function through actions such as voice, touch, gesture, press, and the like, and the input interface converts a received analog signal into a digital signal and converts the digital signal into a corresponding instruction signal, and sends the corresponding instruction signal to the display device 200.
The output interface includes an interface that transmits the received user instruction to the display device 200. In some embodiments, it may be an infrared interface or a radio frequency interface. For example, when the infrared signal interface is used, the user input instruction needs to be converted into an infrared control signal according to an infrared control protocol and sent to the display device 200 through the infrared sending module. For another example, when the radio frequency signal interface is used, the user input instruction is converted into a digital signal, modulated according to a radio frequency control signal modulation protocol, and then transmitted to the display device 200 through the radio frequency transmitting terminal.
In some embodiments, the control device 100 includes at least one of the communication interface 130 and the input-output interface 140. The control device 100 is provided with the communication interface 130, such as WiFi, Bluetooth, or NFC modules, which may encode the user input instruction and send it to the display device 200 through a WiFi protocol, a Bluetooth protocol, or an NFC protocol.
A memory 190 is used for storing various operation programs, data, and applications for driving and controlling the control device 100 under the control of the controller. The memory 190 may store various control signal instructions input by a user.
A power supply 180 is used for providing operating power support for the various elements of the control device 100 under the control of the controller; it may be a battery and associated control circuitry.
In some embodiments, the system may include a kernel (Kernel), a command parser (shell), a file system, and applications. The kernel, shell, and file system together form the basic operating system structure that allows users to manage files, run programs, and use the system. After power-up, the kernel is started; it activates the kernel space, abstracts the hardware, initializes hardware parameters, and runs and maintains virtual memory, the scheduler, signals, and inter-process communication (IPC). After the kernel is started, the shell and user application programs are then loaded. An application program, after being started, is compiled into machine code to form a process.
Referring to FIG. 4, in some embodiments, the system is divided into four layers, which are, from top to bottom, an application layer (referred to as the "application layer"), an application framework (Application Framework) layer (referred to as the "framework layer"), an Android runtime (Android runtime) and system library layer (referred to as the "system runtime layer"), and a kernel layer.
In some embodiments, at least one application program is running in the application program layer, and these application programs may be a Window (Window) program of an operating system, a system setting program, a clock program, a camera application, and the like; and may be an application program developed by a third party developer, such as a hi-see program, a K-song program, a magic mirror program, etc. In particular implementations, the application packages in the application layer are not limited to the above examples, and may actually include other application packages, which the embodiments of the present application do not limit.
The framework layer provides an application programming interface (application programming interface, API) and programming framework for the application programs of the application layer. The application framework layer includes a number of predefined functions. The application framework layer acts as a processing center that decides the actions of the applications in the application layer. Through the API interface, an application program in execution can access system resources and obtain system services.
As shown in fig. 4, the application framework layer in the embodiment of the present application includes Managers, a Content Provider, and the like, where the managers include at least one of the following modules: an Activity Manager, used to interact with all activities running in the system; a Location Manager, used to provide system services or applications with access to the system location service; a Package Manager, used to retrieve various information about the application packages currently installed on the device; a Notification Manager, used to control the display and clearing of notification messages; and a Window Manager, used to manage icons, windows, toolbars, wallpaper, and desktop components on the user interface.
In some embodiments, the activity manager is used to manage the lifecycle of each application program as well as common navigation rollback functions, such as controlling the exit of an application program (including switching the user interface currently displayed in the display window to the system desktop), opening, and backing (including switching the user interface currently displayed in the display window to the previous user interface of the currently displayed one).
In some embodiments, the window manager is configured to manage all window procedures, such as obtaining the display screen size, determining whether there is a status bar, locking the screen, taking screenshots, and controlling display window changes (e.g., scaling the display window down, shaking, distorting, etc.).
In some embodiments, the system runtime layer provides support for the upper framework layer; when the framework layer is used, the Android operating system runs the C/C++ libraries contained in the system runtime layer to implement the functions the framework layer needs to implement.
In some embodiments, the kernel layer is a layer between hardware and software. As shown in fig. 4, the kernel layer contains at least one of the following drivers: audio driver, display driver, Bluetooth driver, camera driver, WiFi driver, USB driver, HDMI driver, sensor drivers (e.g., fingerprint sensor, temperature sensor, touch sensor, pressure sensor, etc.), and the like.
In some embodiments, the kernel layer further includes a power driver module for power management.
In some embodiments, the software programs and/or modules corresponding to the software architecture in fig. 4 are stored in the first memory or the second memory shown in fig. 2 or fig. 3.
In some embodiments, taking a magic mirror application (a photographing application) as an example, when the remote control receiving device receives an input operation of the remote control, a corresponding hardware interrupt is sent to the kernel layer. The kernel layer processes the input operation into an original input event (including the value of the input operation, the timestamp of the input operation, etc.). The original input event is stored at the kernel layer. The application framework layer obtains the original input event from the kernel layer, identifies the control corresponding to the input event according to the current position of the focus, and identifies the input operation as a confirmation operation. The control corresponding to the confirmation operation is the control of the magic mirror application icon; the magic mirror application calls an interface of the application framework layer to start the magic mirror application, and further starts the camera driver by calling the kernel layer, so that a still image or video is captured through the camera.
In some embodiments, for a display device with a touch function, taking a split-screen operation as an example, the display device receives an input operation (such as a split-screen operation) applied by a user to the display screen, and the kernel layer may generate a corresponding input event according to the input operation and report the event to the application framework layer. The window mode (e.g., multi-window mode) and the window position and size corresponding to the input operation are set by the activity manager of the application framework layer. The window manager of the application framework layer draws a window according to the settings of the activity manager, then sends the drawn window data to the display driver of the kernel layer, and the display driver displays the application interfaces corresponding to the window data in different display areas of the display screen.
In some embodiments, as shown in fig. 5, the application layer contains at least one icon control that the application can display in the display, such as: a live television application icon control, a video on demand application icon control, a media center application icon control, an application center icon control, a game application icon control, and the like.
In some embodiments, the live television application may provide live television via different signal sources. For example, a live television application may provide television signals using inputs from cable television, radio broadcast, satellite services, or other types of live television services. And, the live television application may display video of the live television signal on the display device 200.
In some embodiments, the video-on-demand application may provide video from different storage sources. Unlike live television applications, video on demand provides video playback from storage sources. For example, video on demand may come from the server side of cloud storage or from local hard disk storage containing stored video programs.
In some embodiments, the media center application may provide various multimedia content playing applications. For example, a media center may be a different service than live television or video on demand, and a user may access various images or audio through a media center application.
In some embodiments, an application center may be provided to store various applications. An application may be a game, an application program, or some other application that is associated with a computer system or other device but can be run on a smart television. The application center may obtain these applications from different sources, store them in local storage, and then run them on the display device 200.
As described above, a display device includes a display and a controller coupled to the display. In an embodiment of the application, the controller is configured to perform: when a picture in a user interface displayed through the browser needs to be cached, determining a cache priority value corresponding to the picture to be cached; if the free area in the cache region corresponding to the cache priority value is greater than or equal to the storage area required by the picture to be cached, caching the picture to be cached into the free area in the cache region; if the free area in the cache region corresponding to the cache priority value is smaller than the storage area required by the picture to be cached, reclaiming occupied area from the cache region according to the cache priority to which the cached pictures in the cache region belong, so as to cache the picture to be cached in the reclaimed free area; the cache priority is determined based on the cache priority value corresponding to the cached picture.
Correspondingly, the embodiment also provides a method for realizing the corresponding operation of the controller. Fig. 6 is a schematic flow chart of a file caching method according to an embodiment of the present application.
As shown in fig. 6, the process may include the steps of:
Step 101, when buffering is required for a picture in a user interface displayed through a browser, determining a buffering priority value corresponding to the picture to be buffered.
In an example, before step 101 is executed, a corresponding cache priority value may be set in advance for the picture to be cached according to factors such as the file size and update frequency of the picture to be cached in the user interface. Here, the cache priority value may be used to indicate that the corresponding picture is one that needs to be cached in the cache region. In addition, the cache priority value can also be used to represent the cache priority corresponding to the picture to be cached.
Alternatively, each buffer priority value may be set to correspond to a buffer priority. For example, the buffer priority corresponding to the buffer priority value-1 is a first priority, and the buffer priority corresponding to the buffer priority value 0 is a second priority.
Alternatively, a buffer priority value interval may be set to correspond to a buffer priority. For example, the buffer priority corresponding to the buffer priority value interval 0-10 is the first priority, and the buffer priority corresponding to the buffer priority value interval 11-20 is the second priority. As to what operation is performed according to the cache priority and the specific meaning of the cache priority, the following step 102 will be described in detail, which is omitted here.
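The interval mapping described above can be sketched as follows (the value ranges are the examples from this section; the function name and the numeric encoding of the priorities are hypothetical):

```python
def priority_for_value(cache_priority_value):
    """Map a cache priority value to a cache priority level.

    Uses the example intervals above: values 0-10 map to the first
    priority and values 11-20 to the second. The mapping itself is
    configurable in practice; these numbers are only illustrative.
    """
    if 0 <= cache_priority_value <= 10:
        return 1  # first priority
    if 11 <= cache_priority_value <= 20:
        return 2  # second priority
    raise ValueError(f"unmapped cache priority value: {cache_priority_value}")
```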
It should be noted that, the corresponding relationship and the specific numerical value between the buffer priority value and the buffer priority may be set according to the actual situation, and the present application is not limited in particular.
As an example, the user interface may be a web page displayed by a browser. Based on this, determining the cache priority value corresponding to the picture to be cached may be implemented by parsing the web page code, for example: parsing the cache priority value from the object code corresponding to the picture to be cached, where the object code, when executed by the browser, is used to display the picture to be cached in the user interface.
In one example, the object code may be written in HyperText Markup Language (Hyper Text Markup Language, HTML). For example, when the picture to be cached is a picture embedded in the web page (rather than the background picture of a certain element), the object code corresponding to the picture to be cached may be: <img src="/img/test.jpg" alt="test picture" cache-id="img-test" cache-priority="-1">. Here, img is the HTML tag for embedding a picture into a web page. src is an HTML attribute whose value, /img/test.jpg, is the file path of the embedded picture, i.e., the file path of the picture to be cached. alt is also an HTML attribute; its value specifies the alternate text of the embedded picture. cache-priority is an HTML attribute additionally constructed on the basis of the existing HTML attributes; its value is the cache priority value of the embedded picture. cache-id is likewise an HTML attribute additionally constructed on the basis of the existing HTML attributes and is used to uniquely identify the picture to be cached.
For another example, when the picture to be cached is the background picture of an element (denoted as the target element) in the web page, the object code corresponding to the picture to be cached may be: <div style="position:absolute;background-image:url(input_icon.webp)" cache-id="img-test" cache-priority="-1">Input</div>. Here, div is an HTML tag and style is an attribute of the div tag. The background-image property in the style attribute is used to set a background image for the element; its value is a URL address, and the URL address input_icon.webp is the file path of the background picture. The background picture occupies the entire size of the element, including the padding and the border, but not the margin. cache-priority is an HTML attribute additionally constructed on the basis of the existing HTML attributes; its value is the cache priority value of the background picture. cache-id is likewise an HTML attribute additionally constructed on the basis of the existing HTML attributes and is used to uniquely identify the picture to be cached.
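As a rough illustration of how the cache-priority and cache-id values might be pulled out of such markup (the function name is hypothetical, and a real browser implementation would read the attributes from the parsed DOM rather than use regular expressions):

```python
import re

def parse_cache_attributes(object_code):
    """Extract the cache-priority value and the cache-id, if present,
    from a fragment of web-page code. Returns (priority, cache_id),
    with None for any attribute that is absent."""
    priority = re.search(r'cache-priority\s*=\s*"(-?\d+)"', object_code)
    cache_id = re.search(r'cache-id\s*=\s*"([^"]*)"', object_code)
    return (int(priority.group(1)) if priority else None,
            cache_id.group(1) if cache_id else None)
```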
In another example, the object code described above may also be written in the Flutter language. When the object code is written in Flutter, as an example, for a picture to be cached that is displayed in a page, the object code corresponding to the picture to be cached may be:
In the object code written in Flutter language, the meaning of the cache-priority and the cache-id is the same as the meaning of the cache-priority and the cache-id in the HTML language, and will not be described in detail here.
It should be noted that, the above-mentioned cache-id is not an attribute necessary for the code implementation process, and whether the code contains the cache-id does not affect the execution of the above-mentioned step 101.
It should be further noted that the above object code may be written in a suitable programming language according to the actual situation, for example, HyperText Markup Language (Hyper Text Markup Language, HTML), Cascading Style Sheets (Cascading Style Sheets, CSS), Extensible Markup Language (Extensible Markup Language, XML), QML, or Flutter. The cache priority value of the picture to be cached may be written in different programming languages in the same or different ways; it is only necessary to ensure that the compiler built into the browser can parse the cache priority value from the corresponding program code.
Step 102, checking whether an idle area in a buffer area corresponding to the buffer priority value is larger than or equal to a storage area required by a picture to be buffered; if yes, go to step 103; if not, go to step 104.
In one example, different cache priority values may correspond to different cache regions, i.e., different cache regions are used for caching the pictures to be cached corresponding to different cache priority values. In another example, different cache priority values may correspond to the same cache region; that is, the pictures to be cached corresponding to different cache priority values are uniformly stored in one cache region. Unless explicitly stated otherwise, the embodiments of the present application are described below in terms of the implementation in which different cache priority values correspond to the same cache region.
When the picture to be cached is to be stored, whether it can be stored is determined by checking whether the free area in the cache region corresponding to the cache priority value is greater than or equal to the storage area required by the picture to be cached.
In one example, when checking whether the free area in the cache region corresponding to the cache priority value is greater than or equal to the storage area required by the picture to be cached, the total of all free areas in the cache region may be taken as the free area for comparison. Of course, in another example, the longest contiguous free area in the cache region may instead be taken as the free area for comparison.
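Both comparison policies can be sketched as follows (the function and parameter names are hypothetical; a real cache would track its free regions internally):

```python
def fits_in_cache(free_block_sizes, required_size, contiguous=False):
    """Check whether the cache region can hold a picture that needs
    required_size bytes. By default the total of all free areas is
    compared; with contiguous=True, only the longest contiguous free
    area is compared, as described above."""
    if contiguous:
        available = max(free_block_sizes, default=0)
    else:
        available = sum(free_block_sizes)
    return available >= required_size
```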
And step 103, caching the picture to be cached in the idle area in the cache area.
The step 103 is executed on the premise of checking that the free area in the buffer area corresponding to the buffer priority value is greater than or equal to the storage area required by the picture to be buffered.
If the free area in the cache region corresponding to the cache priority value is greater than or equal to the storage area required by the picture to be cached, the cache region can hold the picture to be cached even without reclaiming any occupied area. Based on this, the picture to be cached can be cached directly into the free area of the cache region.
Step 104, recovering occupied areas from the buffer area according to the buffer priority to which the buffered pictures in the buffer area belong, so as to buffer the pictures to be buffered in the idle areas obtained by recovery; the buffer priority is determined based on a buffer priority value corresponding to the buffered picture.
The step 104 is performed on the premise that it is checked that the free area in the buffer area corresponding to the buffer priority value is smaller than the storage area required by the picture to be buffered.
If the free area in the cache region corresponding to the cache priority value is smaller than the storage area required by the picture to be cached, it indicates that the free area of the cache region cannot hold the picture to be cached, and the occupied area in the cache region needs to be reclaimed.
When reclaiming the occupied area in the cache region, the area occupied by each cached picture can be reclaimed according to the cache priority to which each cached picture in the cache region belongs.
In some embodiments of the application, the cache priority may be used to represent the reclamation order of cached pictures. For example, the lower the cache priority, the earlier the corresponding cached picture is reclaimed when the cache region is reclaimed; the higher the cache priority, the later the corresponding cached picture is reclaimed. How to reclaim the occupied area from the cache region according to the cache priority to which the cached pictures belong will be described in detail below with reference to the flow shown in fig. 7 and is not repeated here.
Thus, the flow shown in fig. 6 is completed.
According to the above technical solution, in the embodiment of the application, the pictures to be cached corresponding to the cache priority values are uniformly cached in the cache region corresponding to those values. Therefore, when the occupied area in the cache region needs to be reclaimed, reclamation does not depend on how long each cached picture has been cached; instead, it proceeds by class according to the cache priority to which each cached picture belongs. This avoids the problem of a cached picture that needs long-term caching being reclaimed before cached pictures that should be reclaimed promptly.
The following describes how to recycle the occupied area from the buffer according to the buffer priority to which the buffered pictures in the buffer belong in the step 104:
referring to fig. 7, a flowchart of an implementation of step 104 according to some embodiments is shown schematically in fig. 7.
As shown in fig. 7, the process may include the steps of:
Step 201, determining, from the recorded correspondence between cached pictures and cache priorities, the lowest cache priority (the target cache priority) among the cache priorities to which the pictures currently cached in the cache region belong.
As an example, the recorded correspondence between cached pictures and cache priorities may be obtained as follows: when any picture to be cached is cached into a free area of the above cache region, the cache priority of the picture to be cached is determined according to its cache priority value, and the correspondence between the picture to be cached and the cache priority to which it belongs is recorded.
In one example, a correspondence between a cache priority value and a cache priority is preconfigured in the display device. Based on this, when determining the buffer priority of the picture to be buffered according to the buffer priority value of the picture to be buffered, the buffer priority of the picture to be buffered may be determined according to the corresponding relationship between the buffer priority value and the buffer priority.
After the cache priority of the picture to be cached is determined, in order to conveniently obtain the cache priority of each cached picture when the occupied area in the cache region is later reclaimed, the correspondence between the picture to be cached and its cache priority can be recorded. After the picture is cached into the cache region, it becomes a cached picture, so the recorded correspondence becomes a correspondence between a cached picture and a cache priority. In practical application, the correspondence between the name of the picture to be cached and the cache priority to which it belongs can be recorded.
In this step 201, a target buffer priority with the lowest current buffer priority may be determined based on the corresponding relationship between the recorded buffered picture and the buffer priority.
For example, assume that cached picture A1, cached picture A2, and cached picture A3 are already cached in the cache region. The cache priority of cached picture A1 is the first priority, the cache priority of cached picture A2 is the first priority, and the cache priority of cached picture A3 is the second priority; it can then be determined that the target cache priority is the second priority.
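The recorded correspondence and the selection of the target cache priority can be sketched as follows (all names are hypothetical; the first priority is encoded as 1 and lower priorities as larger numbers):

```python
# Hypothetical in-memory record of the cached picture -> cache priority
# correspondence, keyed by picture name as described above.
cache_priority_record = {}

def record_cached_picture(name, cache_priority):
    """Record the name -> cache priority correspondence when a picture
    is cached into the cache region."""
    cache_priority_record[name] = cache_priority

def target_cache_priority():
    """Step 201: return the lowest cache priority currently present
    (the largest encoded value), or None when nothing is cached."""
    return max(cache_priority_record.values(), default=None)
```

With the example above, recording A1 and A2 at the first priority and A3 at the second priority makes the second priority the target cache priority.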
Step 202, checking whether the target cache priority is a preset highest cache priority; if not, go to step 203; if yes, go to step 204.
In some embodiments of the present application, the highest buffer priority may be distinguished from other levels of buffer priority, where the highest buffer priority is used to indicate that its corresponding picture to be buffered cannot be reclaimed. Based on this, by checking whether the target buffer priority is the preset highest buffer priority in step 202, it can be determined whether the current buffer has buffered pictures that can be reclaimed.
For example, assuming that the target priority is the second priority and the preset highest cache priority is the first priority, since the second priority is lower than the first priority, it may be determined that the target cache priority is not the preset highest cache priority.
Step 203, reclaiming the storage area occupied by at least one cached picture under the target cache priority, and returning to execute the foregoing step 102.
This step 203 is performed on the premise that the target cache priority is not the preset highest cache priority.
Because the target buffer priority is not the preset highest buffer priority, the buffered picture corresponding to the target buffer priority can be recovered.
Optionally, when the cached pictures under the target cache priority are reclaimed, they may be reclaimed according to a preset reclamation number. Specifically, when the number of cached picture files under the target cache priority is greater than or equal to the preset number, the preset number of cached pictures can be reclaimed. When the number of cached picture files under the target cache priority is smaller than the preset number, the preset number of cached pictures cannot be reclaimed, so all cached pictures under the target cache priority can be reclaimed.
As an example, when cached pictures are reclaimed according to the preset reclamation number, the cached pictures to be reclaimed may be selected as follows: the cached pictures under the target cache priority are sorted by cache duration from longest to shortest to obtain a duration-sorted sequence; then, in the duration-sorted sequence, all cached pictures whose sequence numbers are not greater than the reclamation number are selected for reclamation.
For example, assume that the cached picture A3 and the cached picture A4 are included under the target cache priority. The buffer time length of the buffered picture A3 is 5 minutes 20 seconds, the buffer time length of the buffered picture A4 is 3 minutes 40 seconds, and the buffer time length of the buffered picture A3 is longer than the buffer time length of the buffered picture A4, so that the obtained time length sorting sequence comprises the buffered picture A3 and the buffered picture A4, the sequence number of the buffered picture A3 is 1, and the sequence number of the buffered picture A4 is 2. Here, assuming that the preset reclamation amount is 1, the cached picture A3 with the sequence number of 1 may be selected from the obtained time-length ordered sequence to be reclaimed.
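The selection by cache duration can be sketched as follows (the function name is hypothetical; durations are in seconds):

```python
def select_for_reclaim(cached_pictures, reclaim_count):
    """cached_pictures: (name, cached_seconds) pairs at the target
    cache priority. Sort longest-cached first and pick up to
    reclaim_count entries; when fewer pictures exist than the preset
    reclamation number, all of them are selected."""
    ordered = sorted(cached_pictures, key=lambda p: p[1], reverse=True)
    return [name for name, _ in ordered[:reclaim_count]]
```

With the example above, A3 cached for 5 minutes 20 seconds (320 s) sorts before A4 cached for 3 minutes 40 seconds (220 s), so with a reclamation number of 1, A3 is selected.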
After at least one cached picture under the target cache priority has been reclaimed, it is not yet known whether the free area in the cache is greater than or equal to the storage area required by the picture to be cached, so the flow returns to step 102 to check whether the reclaimed cache can now hold the picture to be cached.
Step 204, prohibit reclaiming the occupied area from the cache.
This step 204 is performed on the premise that the target cache priority is the preset highest cache priority.
Because the target cache priority is the preset highest cache priority, the cached pictures under it cannot be reclaimed. That is, all cached pictures at the target cache priority must remain cached in the cache area.
When the target cache priority is the preset highest cache priority, and the highest cache priority indicates that its cached pictures are never reclaimed, there are no reclaimable cached pictures left in the cache. In this case, reclamation of the occupied area in the cache can be stopped, so that all cached pictures under the target cache priority remain cached.
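The overall reclamation flow of steps 102, 203, and 204 can be sketched as a loop: reclaim from the lowest cache priority until the picture fits, and stop once only the preset highest priority remains. The function name, data shapes, and the convention that a smaller number means a higher priority are all assumptions of this sketch.

```python
HIGHEST_PRIORITY = 0  # assumed convention: smaller value = higher priority

def try_make_room(cache, capacity, needed):
    """cache: list of dicts with "priority" and "size". Reclaims cached
    pictures at the lowest priority until the free area fits `needed`.
    Returns False when reclamation must stop (step 204)."""
    def free():
        return capacity - sum(p["size"] for p in cache)

    while free() < needed:          # the step-102 re-check
        if not cache:
            return False            # nothing left to reclaim
        # Lowest target priority among the currently cached pictures.
        target = max(p["priority"] for p in cache)
        if target == HIGHEST_PRIORITY:
            return False            # step 204: prohibit reclaiming
        # Step 203: reclaim one cached picture at the target priority.
        victim = next(p for p in cache if p["priority"] == target)
        cache.remove(victim)
    return True
```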
It should be noted that cached pictures under the highest cache priority are not kept in the cache unconditionally: for example, when the display device is restored to factory settings, when the browser cache is cleared manually, or when the pictures are deleted manually, cached pictures under the highest cache priority are removed from the cache area.
It should be further noted that when a cached picture has the highest cache priority, it is essentially equivalent to a local file, and the file can be identified by its cache-id; later uses of the cached picture can therefore be made through its cache-id.
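A minimal sketch of addressing a highest-priority cached picture by its cache-id. The index class, its method names, and the example cache-id and path are illustrative assumptions; the patent only states that the file is identified and used via a cache-id.

```python
class CacheIndex:
    """Maps a cache-id to the on-disk location of a cached picture."""

    def __init__(self):
        self._by_id = {}

    def put(self, cache_id, path):
        # Record where the cached picture is stored.
        self._by_id[cache_id] = path

    def resolve(self, cache_id):
        # Later uses of the picture go through its cache-id.
        return self._by_id.get(cache_id)

idx = CacheIndex()
idx.put("img-001", "/cache/p0/img-001.png")   # hypothetical id and path
print(idx.resolve("img-001"))                 # /cache/p0/img-001.png
```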
The above describes how step 104 reclaims the occupied area from the cache according to the cache priorities to which the cached pictures in the cache belong.
In addition to the method above for caching and reclaiming a picture to be cached that has a corresponding cache priority value, an embodiment of the present application further provides a method for caching and reclaiming a picture to be cached that has no corresponding cache priority value (denoted a standard picture).
Standard pictures are cached and reclaimed as follows:
When a standard picture (i.e., a picture to be cached that has no corresponding cache priority value) in a user interface displayed through the browser needs to be cached, it is checked whether the free area in the cache area used only for standard pictures (denoted the standard cache) is greater than or equal to the storage area required by the standard picture to be cached.
When the free area in the standard cache is greater than or equal to the storage area required by the standard picture, the standard picture to be cached is cached into the standard cache.
When the free area in the standard cache is smaller than the storage area required by the standard picture, the occupied area in the standard cache is reclaimed according to the cache information of the pictures already cached in the standard cache.
In an example, a picture to be cached without a corresponding cache priority value may be a picture in the user interface whose update frequency is lower than a preset update frequency threshold. Correspondingly, a picture to be cached with a corresponding cache priority value may be a picture whose update frequency is higher than or equal to the preset update frequency threshold. The preset update frequency threshold may be set according to practical situations, which is not particularly limited in the present application.
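The classification rule above reduces to a single threshold comparison. A short sketch, where the threshold value, its units, and the function name are assumptions chosen only for illustration:

```python
UPDATE_FREQ_THRESHOLD = 0.5  # updates per day; illustrative value only

def has_priority_value(update_freq):
    """True when the picture's update frequency is at or above the
    threshold, i.e. it is cached through the priority-valued cache
    areas; False means it is treated as a standard picture."""
    return update_freq >= UPDATE_FREQ_THRESHOLD
```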
As an example, there are various implementations for reclaiming cached standard pictures according to their cache information.
As one implementation, when the cache information is the cache duration, the standard picture with the longest current cache duration is determined from the cache durations of the standard pictures currently cached in the standard cache. That standard picture is then deleted from the standard cache, and the flow returns to checking whether the free area in the standard cache is greater than or equal to the storage area required by the standard picture to be cached. If it is, reclamation of the standard cache can stop. If it is not, the flow returns to the step of determining the standard picture with the longest current cache duration.
As another implementation, when the cache information is the file size of the standard picture, the standard picture with the smallest current file size is determined from the file sizes of the standard pictures currently cached in the standard cache. That standard picture is then deleted from the standard cache, and the flow returns to checking whether the free area in the standard cache is greater than or equal to the storage area required by the standard picture to be cached. If it is, reclamation of the standard cache can stop. If it is not, the flow returns to the step of determining the standard picture with the smallest current file size.
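The two standard-cache reclamation strategies above differ only in which picture is chosen as the victim, so they can be sketched as one parameterized loop. The field names, the `by` parameter, and the function signature are assumptions of this sketch, not the patent's implementation.

```python
def reclaim_standard(cache, capacity, needed, by="duration"):
    """cache: list of dicts with "duration_s" and "size". Deletes one
    standard picture at a time until free space >= needed or the cache
    is empty. Returns True if the required space was freed."""
    def free():
        return capacity - sum(p["size"] for p in cache)

    while free() < needed and cache:
        if by == "duration":
            # First implementation: longest cache duration first.
            victim = max(cache, key=lambda p: p["duration_s"])
        else:
            # Second implementation: smallest file size first.
            victim = min(cache, key=lambda p: p["size"])
        cache.remove(victim)
    return free() >= needed
```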
It should be noted that the cache information includes at least the cache duration; other information may be set according to actual situations, and the present application does not limit the specific content of the cache information.
It should also be noted that the standard cache is a cache area independent of the cache areas corresponding to cache priority values.
The above describes how standard pictures are cached and reclaimed.
The above embodiments are intended only to illustrate the technical solution of the present application, not to limit it. Although the application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical schemes described in the foregoing embodiments may still be modified, or some or all of their technical features replaced by equivalents; such modifications and substitutions do not depart from the spirit of the application.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. The illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (10)

1. A display device, characterized by comprising:
A display;
A controller coupled with the display, the controller configured to perform:
When the pictures in the user interface displayed through the browser are required to be cached, determining whether the pictures to be cached have corresponding caching priority values according to preset updating frequency; the pictures to be cached without the corresponding caching priority value refer to pictures with the updating frequency lower than the preset updating frequency threshold value in the user interface, and the pictures to be cached with the corresponding caching priority value refer to pictures with the updating frequency not lower than the preset updating frequency threshold value in the user interface;
When determining that the picture to be cached has a corresponding cache priority value, determining the cache priority value corresponding to the picture to be cached;
If the free area in the buffer area corresponding to the buffer priority value is greater than or equal to the storage area required by the picture to be buffered, buffering the picture to be buffered into the free area in the buffer area;
If the free area in the buffer area corresponding to the buffer priority value is smaller than the storage area required by the picture to be buffered, recovering the occupied area from the buffer area according to the buffer priority of the buffered picture in the buffer area until the free area in the buffer area corresponding to the buffer priority value after recovery is larger than or equal to the storage area required by the picture to be buffered, so as to buffer the picture to be buffered through the recovered free area; the buffer priority is determined based on a buffer priority value corresponding to the buffered pictures, and the buffer priority is used for representing the recovery sequence of the buffered pictures; wherein, different buffer priority values correspond to different buffer areas;
when determining that the picture to be cached does not have the corresponding cache priority value, if the free area in the cache area corresponding to the picture to be cached which does not have the corresponding cache priority value is greater than or equal to the storage area required by the picture to be cached which does not have the corresponding cache priority value, caching the picture to be cached which does not have the corresponding cache priority value to the cache area corresponding to the picture to be cached which does not have the corresponding cache priority value;
If the free area in the buffer area corresponding to the picture to be buffered which does not have the corresponding buffer priority value is smaller than the storage area required by the picture to be buffered which does not have the corresponding buffer priority value, recovering the occupied area in the buffer area corresponding to the picture to be buffered which does not have the corresponding buffer priority value according to the buffer information of the picture already buffered in the buffer area corresponding to the picture to be buffered which does not have the corresponding buffer priority value until the free area in the buffer area corresponding to the picture to be buffered which does not have the corresponding buffer priority value after recovery is larger than or equal to the storage area required by the picture to be buffered which does not have the corresponding buffer priority value, so as to buffer the picture to be buffered through the free area;
and the buffer area corresponding to the buffer priority value and the buffer area corresponding to the picture to be buffered, in which the buffer priority value does not exist, are mutually independent buffer areas.
2. The display device of claim 1, wherein the controller is configured to perform:
Analyzing the buffer priority value from the object code corresponding to the picture to be buffered; and the target code is used for displaying the picture to be cached in the user interface when being executed by the browser.
3. The display device of claim 1, wherein the controller is configured to perform:
determining the lowest target cache priority among the cache priorities to which the cached pictures in the current cache region belong from the corresponding relation between the recorded cached pictures and the cache priorities;
And if the target cache priority is not the preset highest cache priority, recovering a storage area occupied by at least one cached picture under the target cache priority, and if the recovered free area is greater than or equal to the storage area required by the picture to be cached, caching the picture to be cached to the free area in the cache area.
4. A display device according to claim 3, wherein the controller is configured to perform:
and if the target cache priority is the preset highest cache priority, prohibiting the recovery of the occupied area from the cache area.
5. A display device according to claim 3, wherein the controller is configured to perform:
When the picture to be cached is cached to the idle area in the cache area, determining the caching priority of the picture to be cached according to the caching priority value of the picture to be cached;
And recording the corresponding relation between the picture to be cached and the caching priority to which the picture to be cached belongs.
6. A method for caching a file, applied to a display device, comprising:
When the pictures in the user interface displayed through the browser are required to be cached, determining whether the pictures to be cached have corresponding caching priority values according to preset updating frequency; the pictures to be cached without the corresponding caching priority value refer to pictures with the updating frequency lower than the preset updating frequency threshold value in the user interface, and the pictures to be cached with the corresponding caching priority value refer to pictures with the updating frequency not lower than the preset updating frequency threshold value in the user interface;
When determining that the picture to be cached has a corresponding cache priority value, determining the cache priority value corresponding to the picture to be cached;
If the free area in the buffer area corresponding to the buffer priority value is greater than or equal to the storage area required by the picture to be buffered, buffering the picture to be buffered into the free area in the buffer area;
If the free area in the buffer area corresponding to the buffer priority value is smaller than the storage area required by the picture to be buffered, recovering the occupied area from the buffer area according to the buffer priority of the buffered picture in the buffer area until the free area in the buffer area corresponding to the buffer priority value after recovery is larger than or equal to the storage area required by the picture to be buffered, so as to buffer the picture to be buffered through the recovered free area; the buffer priority is determined based on a buffer priority value corresponding to the buffered pictures, and the buffer priority is used for representing the recovery sequence of the buffered pictures; wherein, different buffer priority values correspond to different buffer areas;
when determining that the picture to be cached does not have the corresponding cache priority value, if the free area in the cache area corresponding to the picture to be cached which does not have the corresponding cache priority value is greater than or equal to the storage area required by the picture to be cached which does not have the corresponding cache priority value, caching the picture to be cached which does not have the corresponding cache priority value to the cache area corresponding to the picture to be cached which does not have the corresponding cache priority value;
If the free area in the buffer area corresponding to the picture to be buffered which does not have the corresponding buffer priority value is smaller than the storage area required by the picture to be buffered which does not have the corresponding buffer priority value, recovering the occupied area in the buffer area corresponding to the picture to be buffered which does not have the corresponding buffer priority value according to the buffer information of the picture already buffered in the buffer area corresponding to the picture to be buffered which does not have the corresponding buffer priority value until the free area in the buffer area corresponding to the picture to be buffered which does not have the corresponding buffer priority value after recovery is larger than or equal to the storage area required by the picture to be buffered which does not have the corresponding buffer priority value, so as to buffer the picture to be buffered through the free area;
and the buffer area corresponding to the buffer priority value and the buffer area corresponding to the picture to be buffered, in which the buffer priority value does not exist, are mutually independent buffer areas.
7. The method of claim 6, wherein determining a buffer priority value corresponding to a picture to be buffered comprises:
Analyzing the buffer priority value from the object code corresponding to the picture to be buffered; and the target code is used for displaying the picture to be cached in the user interface when being executed by the browser.
8. The method according to claim 6, wherein the reclaiming the occupied area from the buffer according to the buffer priority to which the buffered picture in the buffer belongs includes:
determining the lowest target cache priority among the cache priorities to which the cached pictures in the current cache region belong from the corresponding relation between the recorded cached pictures and the cache priorities;
And if the target cache priority is not the preset highest cache priority, recovering a storage area occupied by at least one cached picture under the target cache priority, and if the recovered free area is greater than or equal to the storage area required by the picture to be cached, caching the picture to be cached to the free area in the cache area.
9. The method of claim 8, wherein in the case where the target cache priority is a preset highest cache priority, the method further comprises:
and prohibiting the occupied area from being recycled from the cache area.
10. The method of claim 8, wherein the method further comprises:
When the picture to be cached is cached to the idle area in the cache area, determining the caching priority of the picture to be cached according to the caching priority value of the picture to be cached;
And recording the corresponding relation between the picture to be cached and the caching priority to which the picture to be cached belongs.
CN202010825754.6A 2020-08-17 2020-08-17 File caching method and display device Active CN111966646B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010825754.6A CN111966646B (en) 2020-08-17 2020-08-17 File caching method and display device


Publications (2)

Publication Number Publication Date
CN111966646A CN111966646A (en) 2020-11-20
CN111966646B true CN111966646B (en) 2024-05-14

Family

ID=73389455


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105446814A (en) * 2014-09-30 2016-03-30 青岛海信移动通信技术股份有限公司 Cache recovery method and device
CN106844032A (en) * 2017-01-23 2017-06-13 努比亚技术有限公司 The storage processing method and device of a kind of terminal applies
CN109299297A (en) * 2018-07-06 2019-02-01 平安科技(深圳)有限公司 A kind of image cache method for cleaning and terminal device




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20221021

Address after: 83 Intekte Street, Devon, Netherlands

Applicant after: VIDAA (Netherlands) International Holdings Ltd.

Address before: 266061 room 131, 248 Hong Kong East Road, Laoshan District, Qingdao City, Shandong Province

Applicant before: QINGDAO HISENSE MEDIA NETWORKS Ltd.

GR01 Patent grant