CN104808952A - Data caching method and device - Google Patents

Data caching method and device

Info

Publication number
CN104808952A
CN104808952A (Application CN201510223929.5A / CN201510223929A)
Authority
CN
China
Prior art keywords
data
cache
memory
access
application program
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510223929.5A
Other languages
Chinese (zh)
Other versions
CN104808952B (en)
Inventor
韩叙东
刘思音
雷志海
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Baidu Online Network Technology Beijing Co Ltd
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201510223929.5A priority Critical patent/CN104808952B/en
Publication of CN104808952A publication Critical patent/CN104808952A/en
Application granted granted Critical
Publication of CN104808952B publication Critical patent/CN104808952B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Landscapes

  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The embodiments of the invention disclose a data caching method and a data caching device. The method comprises the following steps: loading the to-be-accessed data of an application program into a cache and accessing it there; when access to the data ends, retaining the data in the cache; and monitoring an attribute of the cache and/or an attribute of the data and clearing data from the cache according to a set rule. The technical scheme provided by the embodiments of the invention speeds up subsequent access to the data and reduces memory usage.

Description

Data caching method and device
Technical field
The embodiments of the present invention relate to the field of computer technology, and in particular to a data caching method and device.
Background technology
When loading local pictures, existing application programs usually use one of the following three load modes:
In the first mode, [UIImage imageNamed:], the local picture is loaded into system memory, cached there, and displayed; the cache is cleared only when the application program exits.
In the second mode, [UIImage imageWithContentsOfFile:], the local picture is read directly as a file and displayed, without caching; each time the picture is loaded again, the picture file is again read directly and displayed, still without caching.
In the third mode, [UIImage imageWithData:], the format of the local picture is first converted to a data format, after which the picture is loaded and displayed; this mode performs no caching either, so each time the picture is loaded again, the picture format is converted to the data format again before loading and display.
The load mode of existing application programs for network pictures is as follows: the network picture data is downloaded into system memory, and the picture is loaded and displayed; when the memory holding the picture is no longer referenced, it can be reclaimed by the system; after it has been reclaimed, accessing the network picture again downloads the network picture data again and loads and displays the picture again.
All three load modes for local pictures, however, have certain drawbacks. Specifically, in the first mode, when the local picture needs to be accessed again, access is fast because the cache is read directly, but system memory remains occupied the whole time. The second and third modes occupy less system memory and release it promptly, but accessing the local picture again is slower; the third mode in particular consumes extra memory for the format conversion, as well as extra conversion time. In addition, the load mode for network pictures also suffers from long access times when a picture is accessed again.
Summary of the invention
The embodiments of the present invention provide a data caching method and device that speed up subsequent access to data and reduce memory usage.
In one aspect, an embodiment of the present invention provides a data caching method comprising:
loading the to-be-accessed data of an application program into a cache, and accessing it there;
when access to the data ends, retaining the data in the cache; and
monitoring an attribute of the cache and/or an attribute of the data, and clearing data from the cache according to a set rule.
In another aspect, an embodiment of the present invention further provides a data caching device comprising:
a data caching and access unit, configured to load the to-be-accessed data of an application program into a cache and access it there;
a data retention unit, configured to retain the data in the cache when access to the data ends; and
a data clearing unit, configured to monitor an attribute of the cache and/or an attribute of the data, and to clear data from the cache according to a set rule.
In the technical scheme provided by the embodiments of the present invention, the data is not cleared immediately after access to it ends; instead, it is retained in the cache, so that when the data is accessed again later it is read directly from the cache, which shortens the data access time. At the same time, cached data can be cleared based on a set policy, which prevents the cache from continuing to occupy too much memory space.
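Purely as an illustration (not part of the patent), the three claimed steps — load into a cache, retain after access ends, and clear by a set rule — can be sketched in Python; the class name, thresholds, and entry layout below are invented for this example:

```python
import time

class RetainingCache:
    """Minimal sketch of the claimed method: load, retain, clear by rule."""

    def __init__(self, capacity_bytes=8_000_000, min_hits=6, max_idle_seconds=3 * 3600):
        self._entries = {}  # key -> (data, last access time, lookup hit count)
        self.capacity_bytes = capacity_bytes
        self.min_hits = min_hits
        self.max_idle_seconds = max_idle_seconds

    def access(self, key, loader):
        """Load the to-be-accessed data into the cache (if absent) and access it."""
        if key in self._entries:
            data, _, hits = self._entries[key]
            self._entries[key] = (data, time.time(), hits + 1)  # update attributes on a hit
        else:
            data = loader()                                     # load from disk or network
            self._entries[key] = (data, time.time(), 1)
        # The data is retained in the cache after this access ends.
        return self._entries[key][0]

    def clear_by_rule(self, now=None):
        """Monitor the cache/data attributes and clear entries by the set rule."""
        now = time.time() if now is None else now
        if sum(len(d) for d, _, _ in self._entries.values()) <= self.capacity_bytes:
            return                                              # trigger check: size threshold
        for key in list(self._entries):
            _, last_access, hits = self._entries[key]
            if hits < self.min_hits or now - last_access > self.max_idle_seconds:
                del self._entries[key]
```

The loader callback stands in for either of the two claimed sources (local disk or network); the eviction rule mirrors the hit-count-below-threshold and access-interval conditions described later in the embodiments.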
Brief description of the drawings
Fig. 1 is a schematic flow chart of a data caching method provided by embodiment one of the present invention;
Fig. 2 is a schematic diagram of the memory usage distribution when 100 different pictures are loaded with each of the three local-picture load modes of the prior art;
Fig. 3 is a schematic diagram of the time distribution taken to access 100 identical pictures under each of the above three load modes;
Fig. 4 is a schematic flow chart of a picture caching method provided by embodiment three of the present invention;
Fig. 5 is a schematic structural diagram of a data caching device provided by embodiment four of the present invention.
Detailed description
The present invention is described in further detail below in conjunction with the drawings and embodiments. It should be understood that the specific embodiments described herein are only used to explain the present invention, not to limit it. It should also be noted that, for convenience of description, the drawings show only the parts related to the present invention rather than the entire structure.
Before the exemplary embodiments are discussed in more detail, it should be mentioned that some exemplary embodiments are described as processes or methods depicted as flow charts. Although a flow chart describes the operations (or steps) as a sequential process, many of the operations can be performed in parallel, concurrently, or simultaneously. In addition, the order of the operations can be rearranged. A process may be terminated when its operations are completed, but may also have additional steps not included in the drawing. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, and so on.
It should also be mentioned that in some alternative implementations, the functions/acts noted may occur out of the order indicated in the drawings. For example, depending on the functions/acts involved, two figures shown in succession may in fact be executed substantially simultaneously, or may sometimes be executed in the reverse order.
Embodiment one
Fig. 1 is a schematic flow chart of a data caching method provided by embodiment one of the present invention. This embodiment is applicable to cache management of data accessed by a terminal device such as a smartphone, a tablet computer, a notebook computer, a desktop computer, or a personal digital assistant, in order to shorten subsequent data access time and reduce memory usage. The method can be performed by a data caching device, which is implemented in software and built on the terminal device. Referring to Fig. 1, the data caching method provided by this embodiment specifically comprises the following steps:
Step S110: load the to-be-accessed data of an application program into a cache, and access it there.
Step S120: when access to the data ends, retain the data in the cache.
Here, the to-be-accessed data is the data that the terminal device determines to obtain according to the current access demand; for example, it may be the data that an application program on the terminal device will access while running in foreground mode. The data is preferably a picture, but it can of course be other data such as audio or video. Further, the to-be-accessed data is at least one of the following two kinds of data: local data stored on the local disk, and network data stored on a server on the Internet.
In the prior art, after a data access request is obtained, one class of loading scheme loads the to-be-accessed data into a cache to shorten the next data access; but this scheme does not manage or maintain the cache — it simply keeps adding data to it — so the system memory usage of the terminal device grows and grows, bringing memory-pressure warnings. Another class of loading scheme does not cache the to-be-accessed data at all; although this occupies no memory, the data must be reloaded for every subsequent access, so access takes longer, and when the to-be-accessed data is network data, traffic consumption easily explodes.
For this reason, the present embodiment proposes a new improved scheme: the to-be-accessed data is first loaded into the cache from the local disk or the network and accessed; afterwards, when this access ends, the data is not cleared but is retained; at the same time, the cache is managed in real time so that some of the data in it is disposed of, reducing memory pressure. This embodiment can simultaneously overcome the prior-art drawbacks of memory being occupied while different data is loaded and of repeated accesses to the same data taking a long time.
The cache can be a memory cache and/or a disk cache. The memory cache is a block of physical memory allocated in memory specifically for storing the to-be-accessed data; likewise, the disk cache is a block of physical storage allocated on the local disk of the terminal device specifically for storing the to-be-accessed data. Preferably, to-be-accessed data belonging to local data is loaded into the memory cache, while to-be-accessed data belonging to network data can be loaded into the memory cache and the disk cache simultaneously. Specifically, two cache pools can be created in the memory cache — a local-data memory cache pool (for maintaining local data) and a network-data memory cache pool (for maintaining network data) — and one network-data disk cache pool (for maintaining network data) can be created in the local disk cache.
In one implementation of this embodiment, the to-be-accessed data is local data. Accordingly, loading the to-be-accessed data into the cache and accessing it comprises: if a data access request of the application program is obtained, looking up the to-be-accessed data in the memory cache of the local system; and if the lookup fails, loading the to-be-accessed data from the local disk into the memory cache, storing it there, and accessing it through the application program.
In another implementation of this embodiment, the to-be-accessed data is network data. Accordingly, loading the to-be-accessed data into the cache and accessing it comprises: if a data access request of the application program is obtained, looking up the to-be-accessed data in the memory cache of the local system; if the lookup fails, looking up the to-be-accessed data in the disk cache of the application program; and if that lookup also fails, loading the to-be-accessed data from the network into the memory cache and the disk cache, storing it there, and accessing it through the application program.
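For illustration only, the memory-cache, then disk-cache, then network lookup order described for network data can be sketched as follows (the function name and `download` callback are invented for this example; the patent itself works in terms of iOS cache pools):

```python
def fetch_network_data(key, memory_cache, disk_cache, download):
    """Sketch of the claimed lookup order for network data."""
    if key in memory_cache:        # look up in the local system's memory cache
        return memory_cache[key]
    if key in disk_cache:          # on a miss, look up in the application's disk cache
        return disk_cache[key]
    data = download(key)           # on a second miss, load from the network ...
    memory_cache[key] = data       # ... and store to the memory cache
    disk_cache[key] = data         # ... and to the disk cache simultaneously
    return data
```

On a successful lookup at either level the data is returned directly, matching the "if a lookup succeeds, the data is accessed directly" preference in the text.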
Preferably, if a lookup succeeds, the data is accessed directly by the application program.
Step S130: monitor an attribute of the cache and/or an attribute of the data, and clear data from the cache according to a set rule.
Here, the attribute of the cache can be the storage size of the data in the cache, or the speed at which the terminal device reads from or writes to the cache. The attribute of the data can be the kind of data, its size, its access frequency (number of accesses per unit time), its access time, the number of times it is successfully looked up in the cache (lookup hit count for short), and so on. In this embodiment, monitoring a cache attribute can be regarded as a trigger check for clearing data. For example, when the storage size of the data in the cache exceeds a set capacity threshold, or when the speed at which the terminal device reads from or writes to the cache falls below a set speed threshold, the data in the cache is cleared according to the set rule.
Of course, the cache attribute can also be disregarded, and the data in the cache cleared according to the set rule based only on the attributes of the data. For example, the lookup hit count and access time of the data can be monitored in real time, and whenever the hit count falls below a set count threshold, or the interval between the last access time and the current time exceeds a set interval, the data is cleared. As another example, the data is cleared once its monitored access frequency falls below a set frequency threshold.
The technical scheme provided by this embodiment does not clear the data immediately after access to it ends; instead, the data is retained in the cache, so that when it is accessed again later it is read directly from the cache, which shortens the data access time. At the same time, cached data can be cleared based on a set policy, which prevents the cache from continuing to occupy too much memory space.
Embodiment two
On the basis of embodiment one above, this embodiment further optimizes step S130 so that the cache is cleared reasonably and in a timely manner, reducing memory usage.
In this embodiment, step S130 is preferably: monitor the storage size of the data in the cache, and if the storage size reaches a set threshold, clear the data in the cache based on the set rule. The set threshold can be related to factors such as the memory capacity of the terminal device, processor performance, and the demands of the concrete scenario (for example, the kinds of application programs that need data access); it can be preset by the developer or set dynamically by the user during actual use.
For example, the data caching method provided by the embodiment of the present invention further comprises: if the to-be-accessed data is found in the cache, updating the access time and lookup hit count of the to-be-accessed data. Specifically, each time a lookup of the to-be-accessed data succeeds, the accumulated lookup hit count of the data is incremented by 1, and the access time is updated to the time of this lookup. Clearing the data in the cache based on the set rule then comprises: clearing the data in the cache according to the lookup hit count and access time. For example, data in the cache that meets the following conditions can be cleared: its lookup hit count is below a set count threshold; and/or the interval between its last access time and the current time exceeds a set interval.
To state the cache clearing scheme provided by this embodiment more clearly, an example is given. Suppose that before the clear operation, the data information stored in the cache is as shown in table 1 below:
Table 1

Data            Last access time   Lookup hit count   Storage size in cache
Local data A    2016.1.1 8:00      7                  2.7M
Network data B  2016.1.1 9:03      10                 0.5M
Network data C  2016.1.1 9:06      5                  1.5M
Local data D    2016.1.1 10:21     15                 1M
Local data E    2016.1.1 10:23     2                  2M
Network data F  2016.1.1 11:30     8                  0.3M
If the current time is 12:00 on 2016.1.1, the set count threshold is 6, and the set interval is 3 hours, then network data C and local data E, whose lookup hit counts are below 6, can be cleared, and local data A, whose last access time lies more than 3 hours before the current time, can also be cleared. After this clear operation, the data information stored in the cache is as shown in table 2 below:
Table 2

Data            Last access time   Lookup hit count   Storage size in cache
Network data B  2016.1.1 9:03      10                 0.5M
Local data D    2016.1.1 10:21     15                 1M
Network data F  2016.1.1 11:30     8                  0.3M
As tables 1 and 2 show, the storage sizes of the data in the cache before and after clearing are 2.7+0.5+1.5+1+2+0.3 = 8M (before clearing) and 0.5+1+0.3 = 1.8M (after clearing) respectively. After the clear operation is performed on the cache, the storage size of the data in the cache is reduced by 6.2M, and the freed storage can be reserved by the terminal device for other uses.
With the cache clearing scheme provided by the above example, data in the cache that has long gone unaccessed and has a low lookup hit count can be eliminated in good time. The benefit is that the data retained in the cache is always frequently used, frequently hit valid data, which speeds up subsequent access to that valid data.
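As a check on the worked example only (not part of the patent), the clearing of table 1 down to table 2 can be reproduced with the stated thresholds:

```python
from datetime import datetime, timedelta

# Cache contents from table 1: name -> (last access, lookup hit count, size in MB)
cache = {
    "Local data A":   (datetime(2016, 1, 1, 8, 0),   7,  2.7),
    "Network data B": (datetime(2016, 1, 1, 9, 3),   10, 0.5),
    "Network data C": (datetime(2016, 1, 1, 9, 6),   5,  1.5),
    "Local data D":   (datetime(2016, 1, 1, 10, 21), 15, 1.0),
    "Local data E":   (datetime(2016, 1, 1, 10, 23), 2,  2.0),
    "Network data F": (datetime(2016, 1, 1, 11, 30), 8,  0.3),
}

now = datetime(2016, 1, 1, 12, 0)
hit_threshold = 6               # set count threshold
max_idle = timedelta(hours=3)   # set interval

# Set rule: clear entries whose hit count is below the threshold,
# or whose last access lies more than the set interval in the past.
survivors = {name: e for name, e in cache.items()
             if e[1] >= hit_threshold and now - e[0] <= max_idle}

before = sum(e[2] for e in cache.values())      # 8.0 MB, as in the description
after = sum(e[2] for e in survivors.values())   # 1.8 MB, freeing 6.2 MB
```

Running this leaves exactly network data B, local data D, and network data F, matching table 2.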
Of course, a person of ordinary skill in the art should understand that the data in the cache can also be cleared based on the set rule in other ways — for example, by clearing the data stored in the cache on a set cycle (such as every 10 minutes). Since this approach involves no monitoring of data attributes, it can speed up clearing and release memory promptly.
In the embodiments of the present invention, for the application scenario in which the to-be-accessed data is data that the application program will access while running in foreground mode, the caching method can also comprise:
after a memory warning is obtained or the application program leaves foreground mode, emptying the data stored in the cache.
Embodiment three
With the rapid development of the mobile Internet, the memory-pressure and traffic-explosion problems brought by mobile terminal applications are increasingly becoming a pain point for users, and the memory and traffic problems caused by loading and accessing pictures are chief among them.
Two experiments below illustrate how the three local-picture load modes mentioned in the background section above differ in memory usage and access time.
First, 100 different pictures are loaded into a mutable dictionary (NSMutableDictionary). In the initial state (no pictures loaded), system memory usage is 3M. After 100 pictures (the same image under different names, in jpg format) are loaded, memory usage in the first mode (mode 1) and the second mode (mode 2) described in the background both rise to 8M, an average increase of 50K per picture; memory usage in the third mode (mode 3) rises to 10M, an average increase of 70K per picture. When all pictures are removed from the NSMutableDictionary (that is, when access to the pictures ends), memory usage in the first mode does not decrease; since the second and third modes do not cache the pictures, their memory usage does decrease — specifically, memory usage in the second mode falls to 4.5M, and in the third mode to 4M. This is shown in Fig. 2.
It follows that when pictures are loaded in the first mode, the system caches them in memory, and memory does not shrink as the picture objects are removed. When an application program needs to use a large number of pictures, the first mode causes memory usage to spike. Memory usage in the second and third modes decreases as the picture objects are removed, but the third mode incurs extra memory growth while loading.
Next, 100 identical pictures are accessed and the total access time is recorded, as shown in Fig. 3. Because the system caches the picture, the access time of the first mode is far shorter than that of the second and third modes (see category 1 in Fig. 3). If a cache for the picture is added inside the application program (that is, the cached picture is accessed directly on a cache hit, and the original mode is used otherwise), the access times of all three modes drop sharply, all approaching 0.01s (see category 2 in Fig. 3).
In view of this, the present embodiment provides a preferred embodiment, based on all the embodiments above, for the concrete scenario in which the data is a picture. This embodiment is applicable to cache management of pictures accessed by application programs installed on terminal devices running the iOS system, such as smartphones or tablet computers, in order to shorten picture access time and reduce memory usage. The method can be performed by a picture caching device, which is implemented in software, can be part of the application program or independent of it, and is built on the terminal device.
The scheme provided by this embodiment uses a cache management mechanism to shorten the access time of repeated picture loads and reduce wasted traffic; flexible cache policies can be formulated to match the different caching requirements of local pictures and network pictures; and when the extra caching causes memory pressure, the picture caches that are longest unused and least accessed are evicted according to a policy. The scheme serves the characteristics of mobile terminal applications, is flexibly customizable, manages picture caches in a unified way, solves the memory-usage and traffic problems of mobile terminal applications that have long troubled users, and improves the user experience.
In this embodiment, pictures are divided into two kinds: local pictures and network pictures. To manage the caches of local pictures and network pictures in a unified way and adjust the cache policy flexibly, this embodiment adopts a different cache policy for each of the two kinds of pictures.
Referring to Fig. 4, the picture caching method provided by this embodiment specifically comprises the following steps:
Step S410: obtain the picture access request of the application program.
Step S420: judge whether the to-be-accessed picture is a local picture. If so, perform steps S430a-S470a; otherwise, perform steps S430b-S470b.
Steps S430a-S470a form the specific strategy of the local picture caching scheme. In this scheme, a local-picture memory cache pool is created in the memory cache, and three dictionary objects are maintained in the cache pool: a local picture dictionary, an access time dictionary, and a lookup hit count dictionary. After step S410 has been performed and the to-be-accessed picture has been judged to be a local picture, steps S430a-S470a are performed.
Step S430a: look up the to-be-accessed picture in the local-picture memory cache pool. If it is found, perform step S450a; otherwise, perform step S440a.
Step S440a: load the to-be-accessed picture from the local disk and store it in the local-picture memory cache pool. The load mode can be the second mode described in the background section. Then perform step S450a.
Step S450a: update the access time and lookup hit count of the to-be-accessed picture in the local-picture memory cache pool.
Step S460a: judge whether the storage size of the data in the local-picture memory cache pool reaches a first set threshold (for example 5M). If so, perform step S470a; otherwise, perform step S480.
Step S470a: evict the corresponding picture objects from the local-picture memory cache pool according to a first strategy, until the storage size of the data in the cache pool is below a first safety value (for example 1M). Then perform step S480.
Here, the first strategy is: preferentially evict, from the local picture dictionary in the local-picture memory cache pool, the picture objects with fewer lookup hits and earlier access times. For the specific eviction method, see the description of clearing data from the cache given in embodiments one and two of the present invention, which is not repeated here.
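As a sketch only (all names invented; the patent describes dictionary objects maintained in an iOS memory cache pool), the first strategy — preferentially evicting the entries with the fewest lookup hits and earliest access times until the pool shrinks below the safety value — could look like:

```python
def evict_to_safety(pool, sizes, hits, last_access, safety_bytes):
    """Evict least-hit, least-recently-accessed entries until the pool
    occupies at most safety_bytes. The four dicts are keyed by picture
    name, mirroring the pool's parallel dictionary objects."""
    # Fewest hits first; among equal hit counts, earliest access time first.
    order = sorted(pool, key=lambda k: (hits[k], last_access[k]))
    for key in order:
        if sum(sizes[k] for k in pool) <= safety_bytes:
            break  # pool is below the safety value; stop evicting
        for d in (pool, sizes, hits, last_access):
            del d[key]
```

The sort key encodes the stated priority order; candidates are removed one at a time so eviction stops as soon as the safety value is reached.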
After a memory warning is obtained or the application program leaves foreground mode, all picture objects in the local-picture memory cache pool are emptied.
Steps S430b-S470b form the specific strategy of the network picture caching scheme. In this scheme, a network-picture memory cache pool is created in the memory cache, and a network picture dictionary object is maintained in the cache pool. At the same time, a network-picture disk cache pool is created in the disk cache of the application program. After step S410 has been performed and the to-be-accessed picture has been judged not to be a local picture, it is determined that the picture is a network picture, and steps S430b-S470b are performed.
Step S430b: look up the to-be-accessed picture in the network-picture memory cache pool. If it is found, perform step S480; otherwise, perform step S440b.
Step S440b: look up the to-be-accessed picture in the network-picture disk cache pool. If it is found, perform step S480; otherwise, perform step S450b.
Step S450b: load the to-be-accessed picture from the network, and store it in the network-picture memory cache pool and disk cache pool simultaneously.
Step S460b: judge whether the storage size of the data in the network-picture memory cache pool reaches a second set threshold (for example 10M). If so, perform step S470b; otherwise, perform step S480.
Step S470b: evict the corresponding picture objects from the network-picture memory cache pool according to a second strategy, until the storage size of the data in the cache pool is below a second safety value (for example 2M). Then perform step S480.
Here, the second strategy is: evict all the picture objects in the network-picture memory cache pool. Preferably, the network-picture disk cache pool can be emptied periodically (for example every three days).
After a memory warning is obtained or the application program leaves foreground mode, all picture objects in the network-picture memory cache pool are emptied.
Step S480: return the to-be-accessed picture for the application program to access, and end.
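Setting the iOS-specific APIs aside, the branching of steps S410-S480 can be summarized in a short sketch (all names are invented for illustration, and the threshold/eviction steps S460-S470 are omitted):

```python
def get_picture(name, is_local, local_pool, net_mem_pool, net_disk_pool,
                read_local_disk, download):
    """Sketch of Fig. 4: branch on picture type, look up the matching
    cache pools, and load on a miss."""
    if is_local:                          # steps S430a-S450a
        if name not in local_pool:
            local_pool[name] = read_local_disk(name)
        # Access time and lookup hit count would be updated here (S450a).
        return local_pool[name]           # step S480
    if name in net_mem_pool:              # step S430b
        return net_mem_pool[name]
    if name in net_disk_pool:             # step S440b
        return net_disk_pool[name]
    data = download(name)                 # step S450b: load from the network
    net_mem_pool[name] = data             # ... storing to both pools
    net_disk_pool[name] = data
    return data                           # step S480
```

A repeated request for the same network picture is then served from the memory cache pool without touching the network, which is the traffic saving the embodiment claims.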
The technical scheme provided by this embodiment has the following advantages:
1. The caches of local pictures and network pictures are managed simultaneously, and the cache policy can be adjusted flexibly.
2. For local pictures, a memory cache of local pictures is maintained, which greatly shortens the access time of local pictures; the corresponding cache objects can be evicted at appropriate times, thereby reducing memory usage.
3. For network pictures, a memory cache and a disk cache of network pictures are maintained, which greatly shortens the access time of network pictures and saves traffic; the corresponding cache objects can be evicted at appropriate times, thereby reducing memory usage.
4. Different eviction policies are designed for different picture types. The eviction policy of the local picture cache takes access time and hit count into account, preferentially evicting the cached pictures that are longest unused and least hit.
Embodiment four
Fig. 5 is the structural representation of the data buffer storage device that the embodiment of the present invention four provides.See Fig. 5, the concrete structure of this device is as follows:
Data buffer storage and addressed location 510, for by the Data import to be visited of application program to buffer memory, and to conduct interviews;
Data stick unit 520, for when terminating the access to described data, retains described data in described buffer memory;
Data dump unit 530, for the attribute of the attribute and/or described data of monitoring described buffer memory, and removes the data in described buffer memory according to setting rule.
In a kind of embodiment of the present embodiment, described data buffer storage and addressed location 510, specifically for:
If get the data access request of application program, then from the memory cache of local system, search data to be visited;
If search unsuccessfully, load described data to be visited from local disk and store to described memory cache, and conducted interviews by described application program.
In the another kind of embodiment of the present embodiment, described data buffer storage and addressed location 510, specifically for:
If get the data access request of application program, then from the memory cache of local system, search data to be visited;
If search unsuccessfully, then from the disk buffering of described application program, search data to be visited;
If search unsuccessfully, then load described data to be visited from network and store to described memory cache and disk buffering, and conducted interviews by described application program.
Exemplary, described data dump unit 530, specifically for: the memory space of monitoring data in described buffer memory, if described memory space reaches setting threshold value, then based on setting rule, the data in described buffer memory are removed.
Exemplary, on the basis of technique scheme, the device that the present embodiment provides also comprises:
Updating block 540, if search data to be visited for described data buffer storage and addressed location 510 from described buffer memory, then upgrades access time of described data to be visited and searches number of success;
Described data dump unit 530, specifically for:
According to searching number of success and access time, the data in described buffer memory are removed.
Exemplarily, the data clearing unit 530 is specifically configured to:
clear the data stored in the cache according to a preset period.
Exemplarily, the data clearing unit 530 is further configured to: empty the data stored in the cache after a memory warning is received or the application exits foreground mode.
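The last two triggers (periodic clearing, memory warning, leaving the foreground) amount to emptying the cache on certain events. The event names below are illustrative; on a real platform they would be wired to the system's memory-warning and application-lifecycle callbacks.

```python
class AppCache:
    """Illustrative sketch: a periodic timer tick, a memory warning,
    or the application entering the background all empty the cache."""

    CLEAR_EVENTS = ("timer_tick", "memory_warning", "did_enter_background")

    def __init__(self):
        self.memory_cache = {}

    def put(self, key, value):
        self.memory_cache[key] = value

    def on_event(self, event):
        # Any configured trigger empties the stored data.
        if event in self.CLEAR_EVENTS:
            self.memory_cache.clear()
```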
On the basis of the above technical solutions, the data to be accessed is image data, audio data, or video data.
The above device can perform the method provided by any embodiment of the present invention, and has the functional modules and beneficial effects corresponding to the performed method.
Note that the above are only preferred embodiments of the present invention and the technical principles applied. Those skilled in the art will appreciate that the present invention is not limited to the specific embodiments described here; various obvious changes, readjustments, and substitutions can be made by those skilled in the art without departing from the protection scope of the present invention. Therefore, although the present invention has been described in further detail through the above embodiments, it is not limited to the above embodiments and may include other equivalent embodiments without departing from the inventive concept; the scope of the present invention is determined by the appended claims.

Claims (13)

1. A data caching method, characterized by comprising:
loading data to be accessed by an application into a cache, and accessing the data;
when access to the data ends, retaining the data in the cache;
monitoring an attribute of the cache and/or an attribute of the data, and clearing the data in the cache according to a preset rule.
2. The method according to claim 1, characterized in that loading the data to be accessed into the cache and accessing the data comprises:
if a data access request of the application is obtained, looking up the data to be accessed in a memory cache of the local system;
if the lookup fails, loading the data to be accessed from a local disk, storing it in the memory cache, and accessing it by the application.
3. The method according to claim 1, characterized in that loading the data to be accessed into the cache and accessing the data comprises:
if a data access request of the application is obtained, looking up the data to be accessed in a memory cache of the local system;
if the lookup fails, looking up the data to be accessed in a disk cache of the application;
if that lookup also fails, loading the data to be accessed from the network, storing it in both the memory cache and the disk cache, and accessing it by the application.
4. The method according to claim 1, characterized in that monitoring the attribute of the cache and clearing the data in the cache according to the preset rule comprises:
monitoring the storage size of the data in the cache, and if the storage size reaches a preset threshold, clearing data in the cache based on the preset rule.
5. The method according to any one of claims 1-4, characterized by further comprising:
if the data to be accessed is found in the cache, updating the access time and the lookup success count of the data to be accessed;
wherein clearing data in the cache based on the preset rule comprises:
clearing data in the cache according to the lookup success count and the access time.
6. The method according to any one of claims 1-4, characterized in that clearing data in the cache based on the preset rule comprises:
clearing the data stored in the cache according to a preset period.
7. The method according to any one of claims 1-4, characterized by further comprising:
emptying the data stored in the cache after a memory warning is received or the application exits foreground mode.
8. The method according to any one of claims 1-4, characterized in that the data to be accessed is image data, audio data, or video data.
9. A data caching device, characterized by comprising:
a data caching and access unit, configured to load data to be accessed by an application into a cache, and to access the data;
a data retention unit, configured to retain the data in the cache when access to the data ends;
a data clearing unit, configured to monitor an attribute of the cache and/or an attribute of the data, and to clear the data in the cache according to a preset rule.
10. The device according to claim 9, characterized in that the data caching and access unit is specifically configured to:
if a data access request of the application is obtained, look up the data to be accessed in a memory cache of the local system;
if the lookup fails, load the data to be accessed from a local disk, store it in the memory cache, and have it accessed by the application.
11. The device according to claim 9, characterized in that the data caching and access unit is specifically configured to:
if a data access request of the application is obtained, look up the data to be accessed in a memory cache of the local system;
if the lookup fails, look up the data to be accessed in a disk cache of the application;
if that lookup also fails, load the data to be accessed from the network, store it in both the memory cache and the disk cache, and have it accessed by the application.
12. The device according to claim 9, characterized in that the data clearing unit is specifically configured to: monitor the storage size of the data in the cache, and if the storage size reaches a preset threshold, clear data in the cache based on the preset rule.
13. The device according to any one of claims 9-12, characterized by further comprising:
an updating unit, configured to, if the data caching and access unit finds the data to be accessed in the cache, update the access time and the lookup success count of the data to be accessed;
the data clearing unit being specifically configured to:
clear data in the cache according to the lookup success count and the access time.
CN201510223929.5A 2015-05-05 2015-05-05 data cache method and device Active CN104808952B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510223929.5A CN104808952B (en) 2015-05-05 2015-05-05 data cache method and device

Publications (2)

Publication Number Publication Date
CN104808952A true CN104808952A (en) 2015-07-29
CN104808952B CN104808952B (en) 2018-09-18

Family

ID=53693814

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510223929.5A Active CN104808952B (en) 2015-05-05 2015-05-05 data cache method and device

Country Status (1)

Country Link
CN (1) CN104808952B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101833512A (en) * 2010-04-22 2010-09-15 中兴通讯股份有限公司 Method and device thereof for reclaiming memory
CN102368258A (en) * 2011-09-30 2012-03-07 广州市动景计算机科技有限公司 Webpage page caching management method and system
CN103281397A (en) * 2013-06-13 2013-09-04 苏州联讯达软件有限公司 Data-caching method and system based on timestamps and access density
CN103631616A (en) * 2013-08-28 2014-03-12 广州品唯软件有限公司 Method and system for fast loading and caching of picture

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105354088A (en) * 2015-09-29 2016-02-24 广州酷狗计算机科技有限公司 Message deleting method and apparatus
CN105354088B (en) * 2015-09-29 2018-11-27 广州酷狗计算机科技有限公司 Message delet method and device
CN105279267A (en) * 2015-10-23 2016-01-27 广州视睿电子科技有限公司 Data caching method and device
CN105335520B (en) * 2015-11-24 2018-11-16 交控科技股份有限公司 A kind of data processing method and processor based on Subway Integrated Automatic System
CN105335520A (en) * 2015-11-24 2016-02-17 北京交控科技有限公司 Data processing method based on subway integrated automation system and processor
CN105589926A (en) * 2015-11-27 2016-05-18 深圳市美贝壳科技有限公司 Method for clearing cache files of mobile terminal in real time
CN105513005B (en) * 2015-12-02 2019-01-29 魅族科技(中国)有限公司 A kind of method and terminal of memory management
CN105513005A (en) * 2015-12-02 2016-04-20 魅族科技(中国)有限公司 Memory management method and terminal
CN105786723A (en) * 2016-03-14 2016-07-20 深圳创维-Rgb电子有限公司 Application cache management method and device based on linked list
CN106569894A (en) * 2016-10-11 2017-04-19 北京元心科技有限公司 Picture loading method and system
CN106951550A (en) * 2017-03-27 2017-07-14 广东欧珀移动通信有限公司 Data processing method, device and mobile terminal
CN107122247A (en) * 2017-04-27 2017-09-01 腾讯科技(深圳)有限公司 A kind of static detection method and device for taking picture
CN107122247B (en) * 2017-04-27 2021-11-02 腾讯科技(深圳)有限公司 Method and device for detecting static occupied picture
CN110018912A (en) * 2018-01-10 2019-07-16 武汉斗鱼网络科技有限公司 Data cache method, storage medium, equipment and the system for having informing function
CN108733489A (en) * 2018-05-11 2018-11-02 五八同城信息技术有限公司 Data processing method, device, electronic equipment and storage medium
CN111159240A (en) * 2020-01-03 2020-05-15 中国船舶重工集团公司第七0七研究所 Efficient data caching processing method based on electronic chart

Also Published As

Publication number Publication date
CN104808952B (en) 2018-09-18

Similar Documents

Publication Publication Date Title
CN104808952A (en) Data caching method and device
CN111159436B (en) Method, device and computing equipment for recommending multimedia content
US20160140035A1 (en) Memory management techniques
US10073649B2 (en) Storing metadata
CN108701079A (en) The system and method that flash memory with adaptive prefetching reads cache
US9400754B2 (en) Asynchronous swap mechanism and page eviction from memory
CN105512251A (en) Page cache method and device
CN110895524B (en) Composite overdue method, device, server and storage medium of full-load redis time key
CN106406925A (en) An apparatus and a method used for supporting online upgrade
CN104281468A (en) Method and system for distributed virtual machine image management
CN107197359B (en) Video file caching method and device
CN104572845A (en) File distribution method and device, equipment and system
CN103607312A (en) Data request processing method and system for server system
CN103473326A (en) Method and device providing searching advices
CN105138473A (en) System and method for managing cache
CN108984130A (en) A kind of the caching read method and its device of distributed storage
CN108762916B (en) Memory management method, device and equipment and computer readable storage medium
KR102502569B1 (en) Method and apparuts for system resource managemnet
US20210311770A1 (en) Method for implementing smart contract based on blockchain
CN110209341B (en) Data writing method and device and storage equipment
CN102073463A (en) Flow prediction method and device, and prereading control method and device
CN113961346A (en) Data cache management and scheduling method and device, electronic equipment and storage medium
CN105573782B (en) A kind of software pre-add support method for transparent wearable smart machine
CN110990133A (en) Edge computing service migration method and device, electronic equipment and medium
CN115328406A (en) Data writing and acquiring method and device, electronic equipment and computer medium

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
EXSB Decision made by sipo to initiate substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant