CN105677483B - Data caching method and device - Google Patents

Data caching method and device

Info

Publication number
CN105677483B
Authority
CN
China
Prior art keywords
cache
capacity
cache pool
memory
pool
Prior art date
Legal status
Active
Application number
CN201511033840.9A
Other languages
Chinese (zh)
Other versions
CN105677483A (en)
Inventor
赵智宝
Current Assignee
TCL Corp
Original Assignee
TCL Corp
Priority date
Filing date
Publication date
Application filed by TCL Corp
Priority to CN201511033840.9A
Publication of CN105677483A
Application granted
Publication of CN105677483B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46Multiprogramming arrangements
    • G06F9/50Allocation of resources, e.g. of the central processing unit [CPU]
    • G06F9/5005Allocation of resources, e.g. of the central processing unit [CPU] to service a request
    • G06F9/5011Allocation of resources, e.g. of the central processing unit [CPU] to service a request the resources being hardware resources other than CPUs, Servers and Terminals
    • G06F9/5016Allocation of resources, e.g. of the central processing unit [CPU] to service a request the resources being hardware resources other than CPUs, Servers and Terminals the resource being the memory

Abstract

The invention is applicable to the technical field of intelligent devices, and provides a data caching method and a data caching device. The method comprises the following steps: setting a dynamic cache pool for network requests and a default cache capacity of the dynamic cache pool, wherein the dynamic cache pool has a multi-layer cache structure; obtaining the optimal cache capacity of the dynamic cache pool according to the operating condition of the system; and, when the optimal cache capacity is smaller than the default cache capacity, respectively adjusting the cache capacity of each layer of cache pool in the dynamic cache pool to optimize the utilization efficiency of the system. The invention solves the problems of low system efficiency and slow response speed that arise when the resource information obtained through network requests is cached excessively, and effectively improves the system's response speed to user operations.

Description

Data caching method and device
Technical Field
The invention belongs to the technical field of intelligent devices, and in particular relates to a data caching method and device.
Background
With the advent of the mobile internet era, the number of interaction methods and interactions between intelligent devices acting as clients and servers keeps increasing. Movie and television APPs, such as the iQiyi video APP, provide resource information for thousands of films under each category directory; this resource information includes picture resources (such as movie posters) and protocol data (such as movie synopses). An embedded platform cannot download and cache all of this resource information. The prior art mainly caches picture resources rather than protocol data, and when the resource information is unavailable the user cannot scroll the page forward or backward. If the user frequently triggers network requests and the system caches excessive resource information, the system's efficiency and response speed drop sharply.
Disclosure of Invention
In view of this, embodiments of the present invention provide a data caching method and apparatus, so as to solve the prior-art problems of low system efficiency and slow response speed that arise when resource information obtained through network requests is cached excessively.
In a first aspect, a method for caching data is provided, where the method includes:
setting a dynamic cache pool of a network request and default cache capacity of the dynamic cache pool, wherein the dynamic cache pool has a multi-layer cache structure;
acquiring the optimal cache capacity of the dynamic cache pool according to the running condition of the system;
and when the optimal cache capacity is smaller than the default cache capacity, respectively adjusting the cache capacity of each layer of cache pool in the dynamic cache pool to optimize the use efficiency of the system.
In a second aspect, an apparatus for data caching is provided, the apparatus comprising:
the device comprises a setting module, a cache module and a cache module, wherein the setting module is used for setting a dynamic cache pool of a network request and default cache capacity of the dynamic cache pool, and the dynamic cache pool has a multi-layer cache structure;
the acquisition module is used for acquiring the optimal cache capacity of the dynamic cache pool according to the running condition of the system;
and the adjusting module is used for respectively adjusting the cache capacity of each layer of cache pool in the dynamic cache pool when the optimal cache capacity is smaller than the default cache capacity so as to optimize the service efficiency of the system.
Compared with the prior art, the invention sets a dedicated dynamic cache pool for network requests, together with its default cache capacity, wherein the dynamic cache pool has a multi-layer cache structure; obtains the optimal cache capacity of the dynamic cache pool according to the operating condition of the system; and, when the optimal cache capacity is smaller than the default cache capacity, respectively adjusts the cache capacity of each layer of cache pool in the dynamic cache pool to optimize the utilization efficiency of the system. This solves the problems of low system efficiency and slow response speed when the resource information obtained through network requests is cached excessively, and effectively improves the system's response speed to user operations.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present invention, and that those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart of an implementation of a method for caching data according to an embodiment of the present invention;
fig. 2 is a flowchart illustrating a specific implementation of step S102 in the data caching method according to the embodiment of the present invention;
fig. 3 is a structural diagram of a data caching apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The invention sets a dedicated dynamic cache pool for network requests, together with its default cache capacity, wherein the dynamic cache pool has a multi-layer cache structure; obtains the optimal cache capacity of the dynamic cache pool according to the operating condition of the system; and, when the optimal cache capacity is smaller than the default cache capacity, respectively adjusts the cache capacity of each layer of cache pool in the dynamic cache pool to optimize the utilization efficiency of the system. This solves the problems of low system efficiency and slow response speed when the resource information obtained through network requests is cached excessively, and effectively improves the system's response speed to user operations. The embodiments of the invention also provide a corresponding device; both are explained in detail below.
Fig. 1 shows an implementation flow of a data caching method according to an embodiment of the present invention.
In the embodiment of the present invention, the data caching method is applied to intelligent devices, where the intelligent devices include, but are not limited to, smart phones, tablet computers, smart televisions, computers, learning machines, and the like.
Referring to fig. 1, the data caching method includes:
in step S101, a dynamic cache pool requested by a network and a default cache capacity thereof are set, where the dynamic cache pool has a multi-layer cache structure.
In the embodiment of the invention, a dedicated dynamic cache pool is set for network requests, and the default cache capacity of the dynamic cache pool is configured. The dynamic cache pool is used for caching data information downloaded through network requests, such as movie resource information, theme pack resource information, and book resource information. The default cache capacity is the cache capacity of the dynamic cache pool preset by the developer, i.e., the maximum storage capacity of the dynamic cache pool when the memory space and hard disk space of the system are sufficient.
Further, in the embodiment of the present invention, the dynamic cache pool has a multi-layer cache structure, and includes a memory cache pool and a hard disk cache pool, and correspondingly, the default cache capacity includes a default capacity of the memory cache pool and a default capacity of the hard disk cache pool. The memory cache pool is a memory space in the system and comprises a first memory cache pool and a second memory cache pool; the hard disk cache pool is a hard disk space in the system.
Further, the first memory cache pool is used for caching interaction information required by switching between user interface activities.
In the Android system, a large amount of jump information needs to be carried when switching from one user interface Activity to another; however, the amount of data an Activity can pass through an Intent is limited, and the transfer requires multiple cross-process copies, which increases the memory consumption of the system. The embodiment of the invention reduces cross-process transfers by setting up the first memory cache pool to temporarily cache the interaction information between Activities; the pool entry is reclaimed for reuse immediately after the interaction information is taken out by the receiving Activity, thereby effectively reducing the memory consumption of the system.
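For illustration only, the hand-off behaviour of the first memory cache pool described above can be sketched as follows. This is a minimal Python sketch rather than Android code, and the class name `HandoffCache` and its methods are hypothetical; the point is that an entry is reclaimed the moment the receiving side takes it out.

```python
class HandoffCache:
    """Sketch of the first memory cache pool: a temporary in-process
    store for Activity-to-Activity interaction info, so that large
    payloads need not be copied across processes via Intent extras."""

    def __init__(self):
        self._entries = {}

    def put(self, key, payload):
        # The launching Activity deposits the interaction info.
        self._entries[key] = payload

    def take(self, key):
        # The receiving Activity pops the entry; the slot is reclaimed
        # immediately so the pool can be reused, as described above.
        return self._entries.pop(key, None)


handoff = HandoffCache()
handoff.put("movie_detail", {"title": "Example Movie", "poster": "..."})
info = handoff.take("movie_detail")      # entry reclaimed here
leftover = handoff.take("movie_detail")  # None: slot already freed
```

A real implementation would live in the application process and be keyed per navigation event; the sketch only shows the take-and-reclaim contract.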
The second memory cache pool is used for caching the analysis result of the protocol data downloaded through the network request.
The hard disk cache pool is used for caching the protocol data downloaded through the network request, and the triggering time information and the effective duration information of the network request.
Here, the second memory cache pool caches the analysis results (i.e., objects) of the protocol data downloaded within a preset recent time period, so that the page can be refreshed in time while the user scrolls quickly. The protocol data (text information) downloaded through network requests, together with the trigger time information and validity duration information of the network requests, are cached in the hard disk cache pool. In this way, when the intelligent device is restarted after shutdown, the protocol data cached in the hard disk cache pool is not lost, and the user can view the corresponding downloaded data on the page. At startup, whether the corresponding protocol data is still valid can be judged from the trigger time and validity duration of the network request; if the protocol data is invalid, it is deleted. The validity duration information is configured by the user according to the actual situation.
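The startup validity check described above can be sketched as follows. This is an illustrative Python sketch under the assumption that each disk entry records its request's trigger time and a validity duration in seconds; the function and field names are hypothetical.

```python
import time


def is_entry_valid(trigger_time, valid_duration, now=None):
    """An entry stays valid until the network request's trigger time
    plus the user-configured validity duration (both in seconds)."""
    now = time.time() if now is None else now
    return now < trigger_time + valid_duration


def purge_expired(disk_pool, now=None):
    """Drop expired protocol-data entries, as done when the device
    is started again after shutdown."""
    return {key: entry for key, entry in disk_pool.items()
            if is_entry_valid(entry["trigger_time"],
                              entry["valid_duration"], now)}


pool = {
    "catalog_page_1": {"data": "...", "trigger_time": 1000.0, "valid_duration": 3600},
    "catalog_page_2": {"data": "...", "trigger_time": 1000.0, "valid_duration": 60},
}
pool = purge_expired(pool, now=1200.0)  # page_2 expired: 1000 + 60 < 1200
```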
In step S102, the optimal cache capacity of the dynamic cache pool is obtained according to the operating condition of the system.
Here, the optimal cache capacity of the dynamic cache pool includes an optimal cache capacity of a memory and an optimal cache capacity of a hard disk, where the optimal cache capacity of the memory determines an actually available memory space of the memory cache pool in the current system operating state, and the optimal cache capacity of the hard disk determines an actually available hard disk space of the hard disk cache pool in the current system operating state.
Optionally, fig. 2 shows a specific implementation flow of step S102 provided in the embodiment of the present invention. Referring to fig. 2, the step S102 includes:
in step S201, the remaining available memory space in the system and the first preset ratio are obtained, and a product of the remaining available memory space and the first preset ratio is calculated to obtain an optimal cache capacity of the memory.
In step S202, the remaining available hard disk space in the system and the second preset ratio are obtained, and the product of the remaining available hard disk space and the second preset ratio is calculated to obtain the optimal cache capacity of the hard disk.
Here, the first preset ratio and the second preset ratio are preset by the developer. Illustratively, when the remaining available memory space in the system is 100M and the first preset ratio is 0.5, the optimal cache capacity of the memory is 100M × 0.5 = 50M. When the remaining available hard disk space in the system is 500M and the second preset ratio is 0.7, the optimal cache capacity of the hard disk is 500M × 0.7 = 350M.
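Steps S201 and S202 reduce to a pair of multiplications. The following sketch (function and parameter names are illustrative, not from the patent) reproduces the worked example above:

```python
def optimal_capacities(free_memory, free_disk, mem_ratio, disk_ratio):
    """Steps S201/S202: optimal capacity = remaining space x preset ratio.

    free_memory / free_disk are in MB; the ratios are the first and
    second preset ratios configured by the developer."""
    return free_memory * mem_ratio, free_disk * disk_ratio


# Worked example from the description: 100M x 0.5 and 500M x 0.7
mem_opt, disk_opt = optimal_capacities(100, 500, 0.5, 0.7)
```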
In step S103, when the optimal cache capacity is smaller than the default cache capacity, the cache capacity of each layer of cache pool in the dynamic cache pool is respectively adjusted to optimize the utilization efficiency of the system.
In the embodiment of the invention, the optimal cache capacity of the memory is compared with the default capacity of the memory cache pool, and the optimal cache capacity of the hard disk is compared with the default capacity of the hard disk cache pool.
When the memory space in the system is insufficient, namely when the optimal cache capacity of the memory is smaller than the default capacity of the memory cache pool, the size of the memory cache pool is adjusted. The step S103 specifically includes:
when the optimal cache capacity of the memory is smaller than the default capacity of the memory cache pool, deleting the interaction information in the first memory cache pool, and/or deleting the analysis results of the protocol data in the second memory cache pool, so as to reduce the storage capacity of the memory cache pool.
When the space of the hard disk in the system is insufficient, namely when the optimal cache capacity of the hard disk is smaller than the default capacity of the hard disk cache pool, the size of the hard disk cache pool is adjusted. The step S103 further includes:
and when the optimal cache capacity of the hard disk is smaller than the default capacity of the hard disk cache pool, deleting the protocol data, the triggering time information of the network request and the effective duration information in the hard disk cache pool so as to reduce the storage capacity of the hard disk cache pool.
Here, when the memory space in the system is insufficient, the interaction information stored in the first memory cache pool is preferentially deleted, in part or in full, to reduce the storage occupancy of the first memory cache pool, so that the system reclaims the freed memory space. Optionally, the interaction information to be deleted is determined by combining the LRU algorithm, the validity duration information, and the like. Similarly, the analysis results of the protocol data in the second memory cache pool may be partially deleted to reduce the storage occupancy of the second memory cache pool, so that the system reclaims the freed memory space there. When the hard disk space in the system is insufficient, the protocol data and the trigger time and validity duration information of the network requests in the hard disk cache pool are partially deleted, retaining only the entries with the most recent request times, so as to reduce the storage occupancy of the hard disk cache pool and allow the system to reclaim the freed hard disk space. This solves the problems of low system efficiency and slow response speed when the user frequently triggers network requests and the system caches excessive resource information, and improves the system's response speed to user operations.
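The shrink-and-evict behaviour of step S103 can be sketched as follows. This is a minimal Python illustration, assuming LRU-ordered eviction as the deletion policy the description mentions; the class `LruPool` and its methods are hypothetical names, and sizes are arbitrary units.

```python
from collections import OrderedDict


class LruPool:
    """Sketch of one layer of the dynamic cache pool: entries are kept
    in least-recently-used order and evicted when usage exceeds the
    pool's current capacity limit."""

    def __init__(self, capacity):
        self.capacity = capacity       # current capacity limit
        self._entries = OrderedDict()  # key -> size, oldest first
        self.used = 0

    def put(self, key, size):
        if key in self._entries:
            self.used -= self._entries.pop(key)
        self._entries[key] = size
        self.used += size
        self._evict()

    def get(self, key):
        """Return the cached entry's size, or None on a miss."""
        if key not in self._entries:
            return None
        self._entries.move_to_end(key)  # mark as recently used
        return self._entries[key]

    def shrink_to(self, optimal_capacity):
        # Step S103: adopt the smaller optimal capacity and free space.
        self.capacity = optimal_capacity
        self._evict()

    def _evict(self):
        while self.used > self.capacity and self._entries:
            _, size = self._entries.popitem(last=False)  # drop LRU entry
            self.used -= size


pool = LruPool(capacity=100)
for i in range(5):
    pool.put(f"item{i}", 30)  # exceeds 100, so the oldest entries go
pool.shrink_to(60)            # optimal < default: shrink further
```

After the loop the pool holds item2–item4 (90 units); shrinking the limit to 60 evicts item2 as well, mirroring the capacity reduction described above.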
The invention sets a dedicated dynamic cache pool for network requests, together with its default cache capacity, wherein the dynamic cache pool has a multi-layer cache structure; obtains the optimal cache capacity of the dynamic cache pool according to the operating condition of the system; and, when the optimal cache capacity is smaller than the default cache capacity, respectively adjusts the cache capacity of each layer of cache pool in the dynamic cache pool to optimize the utilization efficiency of the system. This solves the problems of low system efficiency and slow response speed when the resource information obtained through network requests is cached excessively, and effectively improves the system's response speed to user operations.
Fig. 3 shows a component structure of the apparatus for data caching according to the embodiment of the present invention, and for convenience of description, only the parts related to the embodiment of the present invention are shown.
In an embodiment of the present invention, the apparatus is used to implement the data caching method in the above-described embodiment of fig. 1 or fig. 2, and may be a software unit, a hardware unit, or a unit combining software and hardware that is built in an intelligent device. The smart device includes but is not limited to a smart phone, a tablet computer, a smart television, a smart watch, a learning machine, and the like.
Referring to fig. 3, the data caching apparatus includes:
a setting module 31, configured to set a dynamic cache pool of a network request and a default cache capacity thereof, where the dynamic cache pool has a multi-layer cache structure;
an obtaining module 32, configured to obtain an optimal cache capacity of the dynamic cache pool according to an operating condition of the system;
an adjusting module 33, configured to respectively adjust the cache capacity of each layer of cache pool in the dynamic cache pool when the optimal cache capacity is smaller than the default cache capacity, so as to optimize the utilization efficiency of the system.
Further, the dynamic cache pool comprises a memory cache pool and a hard disk cache pool; the default cache capacity comprises the default capacity of a memory cache pool and the default capacity of a hard disk cache pool;
the memory cache pool is a memory space in the system and comprises a first memory cache pool and a second memory cache pool; the hard disk cache pool is a hard disk space in the system;
the first memory cache pool is used for caching the interaction information required by switching between the user interface Activities;
the second memory cache pool is used for caching the analysis result of the protocol data downloaded by the network request;
the hard disk cache pool is used for caching the protocol data downloaded through the network request, and the triggering time information and the effective duration information of the network request.
Further, the optimal cache capacity includes an optimal cache capacity of a memory and an optimal cache capacity of a hard disk.
The acquisition module 32 includes:
a first obtaining unit 321, configured to obtain a remaining available memory space in the system and a first preset ratio, and calculate a product of the remaining available memory space and the first preset ratio to obtain an optimal cache capacity of the memory;
the second obtaining unit 322 is configured to obtain the remaining available hard disk space in the system and a second preset ratio, and calculate a product of the remaining available hard disk space and the second preset ratio to obtain an optimal cache capacity of the hard disk.
Further, the adjusting module 33 includes:
a first adjusting unit 331, configured to, when the optimal cache capacity of the memory is smaller than the default capacity of the memory cache pool, delete the interaction information in the first memory cache pool and/or delete the analysis results of the protocol data in the second memory cache pool, so as to reduce the storage capacity of the memory cache pool.
Further, the adjusting module 33 further includes:
a second adjusting unit 332, configured to delete the protocol data, the trigger time information of the network request, and the valid duration information in the hard disk cache pool when the optimal cache capacity of the hard disk is smaller than the default capacity of the hard disk cache pool, so as to reduce the storage capacity of the hard disk cache pool.
It should be noted that the apparatus in the embodiment of the present invention may be configured to implement all technical solutions in the foregoing method embodiments, and the functions of each functional module may be implemented specifically according to the method in the foregoing method embodiments, and the specific implementation process may refer to the relevant description in the foregoing example, which is not described herein again.
The invention sets a dedicated dynamic cache pool for network requests, together with its default cache capacity, wherein the dynamic cache pool has a multi-layer cache structure; obtains the optimal cache capacity of the dynamic cache pool according to the operating condition of the system; and, when the optimal cache capacity is smaller than the default cache capacity, respectively adjusts the cache capacity of each layer of cache pool in the dynamic cache pool to optimize the utilization efficiency of the system. This solves the problems of low system efficiency and slow response speed when the resource information obtained through network requests is cached excessively, and effectively improves the system's response speed to user operations.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments provided in the present application, it should be understood that the disclosed method and apparatus for data caching may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units and modules in the embodiments of the present invention may be integrated into one processing unit, or each unit and module may exist alone physically, or two or more units and modules may be integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (8)

1. A method for caching data, the caching method comprising:
setting a dynamic cache pool of a network request and default cache capacity of the dynamic cache pool, wherein the dynamic cache pool has a multi-layer cache structure and is used for caching data information downloaded through the network request; the dynamic cache pool comprises a memory cache pool and a hard disk cache pool; the default cache capacity comprises the default capacity of a memory cache pool and the default capacity of a hard disk cache pool; the memory cache pool is a memory space in the system; the hard disk cache pool is a hard disk space in the system; the default cache capacity is the maximum storage capacity of the dynamic cache pool under the condition that the memory space and the hard disk space of the system are enough;
acquiring the optimal cache capacity of the dynamic cache pool according to the running condition of the system, wherein the method comprises the following steps: obtaining the remaining available memory space and a first preset proportion in the system, and calculating the product of the remaining available memory space and the first preset proportion to obtain the optimal cache capacity of the memory; obtaining the remaining available hard disk space and a second preset proportion in the system, and calculating the product of the remaining available hard disk space and the second preset proportion to obtain the optimal cache capacity of the hard disk; the optimal cache capacity comprises the optimal cache capacity of a memory and the optimal cache capacity of a hard disk;
and when the optimal cache capacity is smaller than the default cache capacity, respectively adjusting the cache capacity of each layer of cache pool in the dynamic cache pool to optimize the use efficiency of the system.
2. The method of data caching according to claim 1, wherein the memory cache pools comprise a first memory cache pool and a second memory cache pool;
the first memory cache pool is used for caching the interaction information required by switching between the user interface Activities;
the second memory cache pool is used for caching the analysis result of the protocol data downloaded by the network request;
the hard disk cache pool is used for caching the protocol data downloaded through the network request, and the triggering time information and the effective duration information of the network request.
3. The method for data caching according to claim 2, wherein the adjusting the cache capacity of each layer of cache pool in the dynamic cache pool to optimize the utilization efficiency of the system when the optimal cache capacity is smaller than the default cache capacity comprises:
when the optimal cache capacity of the memory is smaller than the default capacity of the memory cache pool, deleting the interaction information in the first memory cache pool, and/or deleting the analysis results of the protocol data in the second memory cache pool, so as to reduce the storage capacity of the memory cache pool.
4. The method for data caching according to claim 1, wherein the adjusting the cache capacity of each layer of cache pool in the dynamic cache pool to optimize the utilization efficiency of the system when the optimal cache capacity is smaller than the default cache capacity comprises:
and when the optimal cache capacity of the hard disk is smaller than the default capacity of the hard disk cache pool, deleting the protocol data, the triggering time information of the network request and the effective duration information in the hard disk cache pool so as to reduce the storage capacity of the hard disk cache pool.
5. An apparatus for data caching, the apparatus comprising:
the device comprises a setting module, a data processing module and a processing module, wherein the setting module is used for setting a dynamic cache pool of a network request and default cache capacity of the dynamic cache pool, the dynamic cache pool has a multi-layer cache structure, and the dynamic cache pool is used for caching data information downloaded through the network request; the dynamic cache pool comprises a memory cache pool and a hard disk cache pool; the default cache capacity comprises the default capacity of a memory cache pool and the default capacity of a hard disk cache pool; the memory cache pool is a memory space in the system; the hard disk cache pool is a hard disk space in the system; the default cache capacity is the maximum storage capacity of the dynamic cache pool under the condition that the memory space and the hard disk space of the system are enough;
an acquisition module, configured to acquire the optimal cache capacity of the dynamic cache pool according to the running condition of the system, wherein the optimal cache capacity comprises an optimal cache capacity of the memory and an optimal cache capacity of the hard disk; the acquisition module comprises: a first obtaining unit, configured to obtain the remaining available memory space in the system and a first preset proportion, and calculate the product of the remaining available memory space and the first preset proportion to obtain the optimal cache capacity of the memory; and a second obtaining unit, configured to obtain the remaining available hard disk space in the system and a second preset proportion, and calculate the product of the remaining available hard disk space and the second preset proportion to obtain the optimal cache capacity of the hard disk; and
an adjusting module, configured to respectively adjust the cache capacity of each layer of cache pool in the dynamic cache pool when the optimal cache capacity is smaller than the default cache capacity, so as to optimize the utilization efficiency of the system.
6. The apparatus for data caching according to claim 5, wherein the memory cache pool comprises a first memory cache pool and a second memory cache pool;
the first memory cache pool is used for caching the interaction information required by switching between the user interface Activities;
the second memory cache pool is used for caching the analysis result of the protocol data downloaded by the network request;
the hard disk cache pool is used for caching the protocol data downloaded through the network request, and the triggering time information and the effective duration information of the network request.
7. The apparatus for data caching of claim 6, wherein the adjusting module comprises:
a first adjusting unit, configured to, when the optimal cache capacity of the memory is smaller than the default capacity of the memory cache pool, delete the interaction information in the first memory cache pool and/or delete the analysis result of the protocol data in the second memory cache pool, so as to reduce the storage capacity of the memory cache pool.
8. The apparatus for data caching of claim 5, wherein the adjusting module comprises:
a second adjusting unit, configured to, when the optimal cache capacity of the hard disk is smaller than the default capacity of the hard disk cache pool, delete the protocol data, the triggering time information of the network request, and the effective duration information in the hard disk cache pool, so as to reduce the storage capacity of the hard disk cache pool.
CN201511033840.9A 2015-12-31 2015-12-31 Data caching method and device Active CN105677483B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201511033840.9A CN105677483B (en) 2015-12-31 2015-12-31 Data caching method and device


Publications (2)

Publication Number Publication Date
CN105677483A CN105677483A (en) 2016-06-15
CN105677483B true CN105677483B (en) 2020-01-24

Family

ID=56190045

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201511033840.9A Active CN105677483B (en) 2015-12-31 2015-12-31 Data caching method and device

Country Status (1)

Country Link
CN (1) CN105677483B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108681469B (en) * 2018-05-03 2021-07-30 武汉斗鱼网络科技有限公司 Page caching method, device, equipment and storage medium based on Android system
CN110533176B (en) * 2018-05-25 2022-10-11 赛灵思电子科技(北京)有限公司 Caching device for neural network computation and related computing platform thereof
CN112667588B (en) * 2019-10-16 2022-12-02 青岛海信移动通信技术股份有限公司 Intelligent terminal device and method for writing file system data
CN111107438B (en) * 2019-12-30 2022-04-22 北京奇艺世纪科技有限公司 Video loading method and device and electronic equipment
CN116009763A (en) * 2021-10-22 2023-04-25 华为技术有限公司 Storage method, device, equipment and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5784698A (en) * 1995-12-05 1998-07-21 International Business Machines Corporation Dynamic memory allocation that enables efficient use of buffer pool memory segments
KR20010000208A (en) * 2000-08-17 2001-01-05 음용기 Method and system for large image display
CN102640472A (en) * 2009-12-14 2012-08-15 瑞典爱立信有限公司 Dynamic cache selection method and system
CN103246613A (en) * 2012-02-08 2013-08-14 联发科技(新加坡)私人有限公司 Cache device and cache data acquiring method therefor
CN103279429A (en) * 2013-05-24 2013-09-04 浪潮电子信息产业股份有限公司 Application-aware distributed global shared cache partition method
CN103907097A (en) * 2011-09-30 2014-07-02 美国网域存储技术有限公司 Intelligence for controlling virtual storage appliance storage allocation



Similar Documents

Publication Publication Date Title
CN105677483B (en) Data caching method and device
US10698559B2 (en) Method and apparatus for displaying content on same screen, and terminal device
US20200328984A1 (en) Method and apparatus for allocating resource
US9201810B2 (en) Memory page eviction priority in mobile computing devices
RU2627222C2 (en) Power efficient content transfer over wireless connection
US10862992B2 (en) Resource cache management method and system and apparatus
WO2017185616A1 (en) File storage method and electronic equipment
CN106453572B (en) Method and system based on Cloud Server synchronous images
CN103888934B (en) A kind of mobile terminal cache management device and management method
KR102402780B1 (en) Apparatus and method for managing memory
CN104808952A (en) Data caching method and device
CN109361947A (en) Internet resources batch loading method, smart television, storage medium and device
CN103618962A (en) Control method and device for getting access to specific video application of smart television
CN108512768B (en) Access amount control method and device
CN108471385B (en) Flow control method and device for distributed system
CN103677519A (en) Method for collecting multimedia resource, terminal and server
TWI602431B (en) Method and device for transmitting information
CN104199729A (en) Resource management method and system
CN106933702A (en) A kind of method of intelligent terminal storage space management, device and intelligent terminal
US9787755B2 (en) Method and device for browsing network data, and storage medium
CN104967770A (en) Video shooting method and apparatus thereof
CN105843752A (en) Method and device for distributing memory of mobile terminal for caching image data
US10002589B2 (en) Retaining user selected screen area on user equipment
CN106855829A (en) A kind of method and device of exented memory
CN110020290B (en) Webpage resource caching method and device, storage medium and electronic device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant