CN104808952B - data cache method and device - Google Patents
- Publication number
- CN104808952B (application CN201510223929.5A)
- Authority
- CN
- China
- Prior art keywords
- data
- caching
- visited
- picture
- application program
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Landscapes
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
- Information Transfer Between Computers (AREA)
Abstract
Embodiments of the present invention disclose a data cache method and device. The method includes: loading the to-be-accessed data of an application program into a cache, and accessing the data; when access to the data ends, retaining the data in the cache; and monitoring an attribute of the cache and/or an attribute of the data, and clearing data from the cache according to a set rule. The technical solution provided by the embodiments of the present invention can accelerate subsequent access to the data and reduce memory occupation.
Description
Technical field
Embodiments of the present invention relate to the field of computer technology, and in particular to a data cache method and device.
Background technology
Existing application programs typically use one of the following three modes when loading a local picture:

In the first mode, [UIImage imageNamed:], the local picture is loaded into system memory, cached, and displayed; the cache is cleared only when the application program exits.

In the second mode, [UIImage imageWithContentsOfFile:], the local picture is read directly as a file and displayed without caching; each subsequent load of the same picture reads the picture file directly again, still without caching.

In the third mode, [UIImage imageWithData:], the local picture is first converted into a data format and then loaded and displayed, again without caching; each subsequent load of the same picture repeats the format conversion before loading and displaying it.
An existing application program loads a network picture as follows: the network picture data is downloaded into system memory, and the picture is loaded and displayed. When the picture's memory is no longer held, the system may reclaim it; after reclamation, accessing the network picture again requires the picture data to be downloaded, loaded, and displayed once more.
Each of the three local-picture load modes above has drawbacks. In the first mode, when a local picture needs to be accessed again, reading it directly from the cache accelerates access, but system memory remains occupied the whole time. The second and third modes occupy less system memory and can release it promptly, but accessing a local picture again is slower; in particular, the third mode consumes additional memory for format conversion and additional format-conversion time. The network-picture load mode likewise suffers from long repeated access times.
Summary of the invention
Embodiments of the present invention provide a data cache method and device, to accelerate subsequent access to data and reduce memory occupation.
In one aspect, an embodiment of the present invention provides a data cache method, the method including:

loading the to-be-accessed data of an application program into a cache, and accessing the data;

when access to the data ends, retaining the data in the cache; and

monitoring an attribute of the cache and/or an attribute of the data, and clearing data from the cache according to a set rule.
In another aspect, an embodiment of the present invention further provides a data cache device, the device including:

a data caching and access unit, configured to load the to-be-accessed data of an application program into a cache and access the data;

a data retention unit, configured to retain the data in the cache when access to the data ends; and

a data clearing unit, configured to monitor an attribute of the cache and/or an attribute of the data, and to clear data from the cache according to a set rule.
In the technical solution provided by the embodiments of the present invention, the data is not cleared immediately after an access ends; instead, it is retained in the cache so that a subsequent access can read it directly from the cache, accelerating data access. At the same time, the cache can be cleared based on a set strategy, which avoids persistently occupying excessive memory space.
Description of the drawings
Fig. 1 is a flow diagram of a data cache method provided by embodiment one of the present invention;

Fig. 2 is a schematic diagram of the memory occupation when 100 different pictures are loaded with each of the three prior-art local-picture load modes;

Fig. 3 is a schematic diagram of the time taken to access the same 100 pictures under each of the three load modes;

Fig. 4 is a flow diagram of a picture cache method provided by embodiment three of the present invention;

Fig. 5 is a structural schematic diagram of a data cache device provided by embodiment four of the present invention.
Detailed description of the embodiments
The present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here serve only to explain the present invention and do not limit it. It should also be noted that, for convenience of description, the drawings show only the parts related to the present invention rather than the entire structure.

It should be mentioned, before the exemplary embodiments are discussed in greater detail, that some of them are described as processing depicted as flow charts. Although a flow chart describes operations (or steps) as sequential processing, many of the operations can be performed in parallel, concurrently, or simultaneously, and the order of the operations can be rearranged. The processing may be terminated when its operations are completed, and may also include additional steps not shown in the drawings; the processing may correspond to a method, a function, a procedure, a subroutine, a subprogram, and the like.

It should further be mentioned that, in some alternative implementations, the functions or actions mentioned may occur in an order different from that indicated in the drawings. For example, depending on the functions or actions involved, figures shown in succession may in fact be executed substantially simultaneously, or sometimes in the reverse order.
Embodiment one
Fig. 1 is a flow diagram of a data cache method provided by embodiment one of the present invention. This embodiment is applicable to cache management of the data accessed on a terminal device such as a smart phone, tablet computer, laptop, desktop computer, or personal digital assistant, so as to accelerate subsequent data access and reduce memory occupation. The method can be executed by a data cache device implemented in software and built into the terminal device. Referring to Fig. 1, the data cache method provided by this embodiment includes the following steps.
Step S110: load the to-be-accessed data of an application program into a cache, and access the data.

Step S120: when access to the data ends, retain the data in the cache.

Here, the to-be-accessed data is the data the terminal device determines it needs to obtain according to the current access demand, for example the data an application program on the terminal device needs to access while running in foreground mode. The data is preferably a picture, but may of course be other data such as audio or video. Moreover, the to-be-accessed data is at least one of the following two kinds: local data stored on a local disk, and network data stored on a server on the Internet.
In the prior art, after a data access request is received, one loading scheme loads the to-be-accessed data into a cache to shorten the next access; but this scheme neither manages nor maintains the cache and only keeps adding data to it, so the terminal device's system memory becomes increasingly occupied and memory pressure becomes critical. Another loading scheme does not cache the to-be-accessed data at all; although this occupies no memory, the data must be reloaded on every subsequent access, access times are longer, and when the to-be-accessed data is network data, network traffic can easily explode.

For this reason, the present embodiment proposes a new improvement: first load the to-be-accessed data from the local disk or the network into the cache and access it; when the access ends, do not clear the data but continue to retain it; meanwhile, manage the cached data in real time, disposing of some of it to relieve memory pressure. This embodiment resolves the prior-art inability to simultaneously reduce the memory occupied when loading different data and shorten the time taken by repeated accesses to the same data.
The cache may be a memory cache and/or a disk cache. The memory cache is a block of physical storage space allocated in memory specifically for storing the to-be-accessed data; likewise, the disk cache is a block of physical storage space allocated on the terminal device's local disk specifically for storing the to-be-accessed data. Preferably, to-be-accessed data that is local data is loaded into the memory cache, while to-be-accessed data that is network data is loaded into both the memory cache and the disk cache. Specifically, two cache pools can be created in the memory cache: a local-data memory cache pool (for maintaining local data) and a network-data memory cache pool (for maintaining network data); and one network-data disk cache pool (for maintaining network data) can be created in the local disk cache.
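The per-entry bookkeeping these cache pools need (last access time and lookup-success count per item) can be sketched as follows. This is an illustrative Python model under assumed names (`CachePool`, `put`, `get` are not from the patent), not the patented implementation, which targets iOS application programs.

```python
import time

class CachePool:
    """A cache pool tracking, per entry: the data, the most recent
    access time, and the lookup-success (hit) count."""

    def __init__(self):
        self.entries = {}  # key -> {"data", "last_access", "hits"}

    def put(self, key, data):
        self.entries[key] = {"data": data, "last_access": time.time(), "hits": 0}

    def get(self, key):
        entry = self.entries.get(key)
        if entry is None:
            return None  # lookup failure
        # A successful lookup updates both attributes, as in embodiment two.
        entry["hits"] += 1
        entry["last_access"] = time.time()
        return entry["data"]

    def size(self):
        # Total stored amount, approximated here by the byte length of values.
        return sum(len(e["data"]) for e in self.entries.values())

# The scheme maintains three pools: two in memory, one disk-backed in practice.
local_memory_pool = CachePool()    # local data
network_memory_pool = CachePool()  # network data (memory copy)
network_disk_pool = CachePool()    # network data (disk copy)
```

In a real deployment the disk pool would persist entries to files rather than hold them in a dictionary; the shared interface is what matters for the flow that follows.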
In one specific implementation of this embodiment, the to-be-accessed data is local data. Correspondingly, loading the to-be-accessed data into the cache and accessing it includes: if a data access request of the application program is received, searching for the to-be-accessed data in the memory cache of the local system; and, if the search fails, loading the to-be-accessed data from the local disk into the memory cache for storage, where it is accessed by the application program.

In another specific implementation of this embodiment, the to-be-accessed data is network data. Correspondingly, loading the to-be-accessed data into the cache and accessing it includes: if a data access request of the application program is received, searching for the to-be-accessed data in the memory cache of the local system; if that search fails, searching for the to-be-accessed data in the disk cache of the application program; and, if that search also fails, loading the to-be-accessed data from the network into the memory cache and the disk cache for storage, where it is accessed by the application program.

Preferably, whenever a search succeeds, the data is accessed directly by the application program.
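The two lookup flows above can be sketched as follows. This is a minimal Python model with hypothetical loader callbacks (`load_from_disk`, `download` stand in for unspecified disk and network I/O); it is a sketch of the flow, not the patented implementation.

```python
def access_local(key, memory_cache, load_from_disk):
    """Local-data flow: try the memory cache; on a miss, load from the
    local disk into the memory cache, then return the data."""
    if key in memory_cache:
        return memory_cache[key]
    data = load_from_disk(key)
    memory_cache[key] = data
    return data

def access_network(key, memory_cache, disk_cache, download):
    """Network-data flow: memory cache, then disk cache, then download;
    a downloaded item is stored in both caches."""
    if key in memory_cache:
        return memory_cache[key]
    if key in disk_cache:
        data = disk_cache[key]
    else:
        data = download(key)
        disk_cache[key] = data
    memory_cache[key] = data
    return data

# Exercise both flows with stand-in loaders that record their calls.
downloads, disk_reads = [], []

def fake_download(key):
    downloads.append(key)
    return b"net-" + key.encode()

def fake_disk_load(key):
    disk_reads.append(key)
    return b"disk-" + key.encode()

local_mem, net_mem, net_disk = {}, {}, {}
access_local("a.jpg", local_mem, fake_disk_load)
access_local("a.jpg", local_mem, fake_disk_load)           # second call hits memory
access_network("b.png", net_mem, net_disk, fake_download)
access_network("b.png", net_mem, net_disk, fake_download)  # second call hits memory
```

Note how the network flow downloads at most once: even after the memory copy is evicted, the disk cache still answers the request without re-downloading.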
Step S130: monitor an attribute of the cache and/or an attribute of the data, and clear data from the cache according to a set rule.

Here, the attribute of the cache may be the amount of data stored in the cache, or the speed at which the terminal device reads from and writes to the cache. The attribute of the data may be its type, size, access frequency (number of accesses per unit time), access time, number of successful lookups in the cache (also called the lookup-success count or hit count), and so on. In this embodiment, monitoring a cache attribute can be regarded as a trigger detection for data clearing. Illustratively, when the amount of data stored in the cache exceeds a set capacity threshold, or when the terminal device's read or write speed for the cache falls below a set speed threshold, the data in the cache is cleared according to the set rule.

Of course, the cache attribute may also be disregarded, and data in the cache may be cleared according to the set rule based only on the data attributes. For example, the lookup-success count and access time of the data can be monitored in real time, and a clear operation can be performed on any data whose lookup-success count is below a set count threshold, or whose most recent access is more than a set interval before the current time. As another example, data can be cleared once its monitored access frequency falls below a set frequency threshold.
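The trigger check and the attribute-based selection just described can be sketched as two small functions. This is an illustrative Python model (function names and thresholds are assumptions, not from the patent):

```python
import time

def should_purge(cache_size_bytes, capacity_threshold):
    """Trigger detection (a sketch): clearing starts once the amount of
    data stored in the cache reaches the set capacity threshold."""
    return cache_size_bytes >= capacity_threshold

def stale_keys(entries, min_hits, max_idle_seconds, now=None):
    """Select entries whose lookup-success count is below the set count
    threshold, or whose most recent access is more than the set interval
    before the current time.

    `entries` maps key -> (hits, last_access_timestamp)."""
    now = time.time() if now is None else now
    return [
        key for key, (hits, last_access) in entries.items()
        if hits < min_hits or now - last_access > max_idle_seconds
    ]
```

The two conditions are independent: an entry can survive on hit count alone yet still be cleared for being idle too long, and vice versa.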
In the technical solution provided by this embodiment, the data is not cleared immediately after an access ends; instead, it is retained in the cache so that a subsequent access can read it directly from the cache, accelerating data access. At the same time, the cache can be cleared based on a set strategy, which avoids persistently occupying excessive memory space.
Embodiment two
This embodiment further optimizes step S130 on the basis of embodiment one above, so that the cache can be cleared reasonably and promptly and memory occupation reduced.

In this embodiment, step S130 is preferably: monitor the amount of data stored in the cache, and if it reaches a set threshold, clear data from the cache based on the set rule. The set threshold is related to the terminal device's storage capacity, processor performance, specific scenario demands (such as the type of application program with data access demands), and so on; it can be preset by a developer, or set dynamically by a user during actual use.

Illustratively, the data cache method provided by the embodiment of the present invention further includes: if the to-be-accessed data is found in the cache, updating its access time and lookup-success count. Specifically, each time the to-be-accessed data is found successfully, its lookup-success count is incremented by 1 and its access time is updated to the time of that lookup. Clearing data from the cache based on the set rule then includes: clearing data from the cache according to the lookup-success count and the access time. For example, data in the cache satisfying the following conditions can be cleared: its lookup-success count is below a set count threshold; and/or its most recent access is more than a set interval before the current time.
To state the cached-data clearing scheme provided by this embodiment more clearly, an example is given. Suppose that, before the clear operation, the data information stored in the cache is as shown in table 1 below:

Table 1

| Data | Most recent access time | Lookup-success count | Amount stored in the cache |
| --- | --- | --- | --- |
| Local data A | 2016-01-01 08:00 | 7 | 2.7M |
| Network data B | 2016-01-01 09:03 | 10 | 0.5M |
| Network data C | 2016-01-01 09:06 | 5 | 1.5M |
| Local data D | 2016-01-01 10:21 | 15 | 1M |
| Local data E | 2016-01-01 10:23 | 2 | 2M |
| Network data F | 2016-01-01 11:30 | 8 | 0.3M |

If the current time is 12:00 on 2016-01-01, the set count threshold is 6, and the set interval is 3 hours, then network data C and local data E, whose lookup-success counts are below 6, can be cleared, and local data A, whose most recent access is more than 3 hours before the current time, can also be cleared. After this clear operation, the data information stored in the cache is as shown in table 2 below:
Table 2

| Data | Most recent access time | Lookup-success count | Amount stored in the cache |
| --- | --- | --- | --- |
| Network data B | 2016-01-01 09:03 | 10 | 0.5M |
| Local data D | 2016-01-01 10:21 | 15 | 1M |
| Network data F | 2016-01-01 11:30 | 8 | 0.3M |
As tables 1 and 2 show, the amounts stored in the cache before and after the clearing are, respectively: 2.7 + 0.5 + 1.5 + 1 + 2 + 0.3 = 8M (before clearing) and 0.5 + 1 + 0.3 = 1.8M (after clearing). The data-clearing operation reduces the amount of data stored in the cache by 6.2M, and the terminal device can reserve the freed storage for other uses.
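The table-1-to-table-2 purge can be checked numerically. The following Python sketch encodes table 1 and applies the two stated conditions (count threshold 6, interval 3 hours, current time 12:00); it is an illustration of the worked example, not production code.

```python
from datetime import datetime, timedelta

# Table 1: key -> (most recent access time, lookup-success count, size in MB)
cache = {
    "local A":   (datetime(2016, 1, 1, 8, 0),   7,  2.7),
    "network B": (datetime(2016, 1, 1, 9, 3),  10,  0.5),
    "network C": (datetime(2016, 1, 1, 9, 6),   5,  1.5),
    "local D":   (datetime(2016, 1, 1, 10, 21), 15, 1.0),
    "local E":   (datetime(2016, 1, 1, 10, 23),  2, 2.0),
    "network F": (datetime(2016, 1, 1, 11, 30),  8, 0.3),
}
now = datetime(2016, 1, 1, 12, 0)
min_hits, max_idle = 6, timedelta(hours=3)

before = sum(size for _, _, size in cache.values())  # 8 MB

# Keep an entry only if it passes both conditions.
kept = {
    name: entry for name, entry in cache.items()
    if entry[1] >= min_hits and now - entry[0] <= max_idle
}
after = sum(size for _, _, size in kept.values())    # 1.8 MB
```

Running this leaves exactly B, D, and F in the cache: A fails the idle-time condition (4 hours since 08:00), while C and E fail the hit-count condition, matching table 2 and the 6.2M reduction.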
With the cached-data clearing scheme of the example above, data that has long gone unaccessed or that has few lookup hits can be eliminated from the cache in time. The advantage is that the data retained in the cache is always frequently used, frequently hit, valid data, which accelerates subsequent access to that valid data.

Of course, a person of ordinary skill in the art will recognize that data in the cache can also be cleared based on other set rules, for example clearing the data stored in the cache at a set period (such as every 10 minutes). Since this approach involves no monitoring of data attributes, it can speed up clearing and release memory promptly.
In the embodiment of the present invention, in the application scenario where the to-be-accessed data is the data an application program needs while running in foreground mode, the caching method may further include:

after a memory warning prompt is received or the application program exits foreground mode, emptying the data stored in the cache.
Embodiment three
With the rapid development of the mobile Internet, critical memory use and exploding network traffic in mobile terminal applications have increasingly become user pain points, and the memory and traffic problems caused by loading and accessing pictures are foremost among them. Two experiments below illustrate how the three local-picture load modes mentioned in the background section above differ in memory occupation and access time.
The first experiment loads 100 different pictures into a mutable dictionary (NSMutableDictionary). In the initial state (before any picture is loaded), system memory use is 3M. After the 100 pictures are loaded (the same image under different names, in jpg format), memory use under the first mode (mode 1) and the second mode (mode 2) described in the background rises to 8M, an average increase of 50K per picture; memory use under the third mode (mode 3) rises to 10M, an average increase of 70K per picture. When all pictures are removed from the NSMutableDictionary (that is, when access to them ends): memory use under the first mode does not decrease; because the second and third modes do not cache the pictures, their memory use does decrease — specifically, memory occupation under the second mode falls to 4.5M and under the third mode to 4M. The details are shown in Fig. 2.
It follows that when pictures are loaded in the first mode, the system caches them in memory, and the memory does not decrease when the picture objects are removed. When an application program needs a large number of pictures, the first mode causes memory occupation to surge. Memory occupation under the second and third modes decreases as picture objects are removed, but the third mode incurs additional memory growth while loading.
The second experiment accesses the same 100 pictures again and records the total access time, as shown in Fig. 3. Because the system caches the pictures, the access time of the first mode is far shorter than that of the second and third modes (see category 1 in Fig. 3). If a cache is added for the pictures inside the application program (on a cache hit the cached picture is accessed directly; otherwise the picture is still accessed in the original mode), the access times of all three modes drop sharply, all to nearly 0.01s (see category 2 in Fig. 3).
In view of this, the present embodiment, building on all the embodiments above, provides a preferred implementation for the concrete scenario where the data is a picture. This embodiment is applicable to cache management of the pictures accessed by application programs on terminal devices equipped with the iOS system, such as smart phones or tablet computers, so as to accelerate picture access and reduce memory occupation. The method can be executed by a picture cache device implemented in software, either as part of an application program or independently of it, and built into the terminal device.
The scheme provided by this embodiment uses a cache management mechanism to shorten the time taken by repeated picture loads and reduce wasted traffic. Because local pictures and network pictures have different caching requirements, flexible cache strategies can be formulated; and when the additional cache causes memory pressure, the least-used and least-accessed picture caches are eliminated according to strategy. The scheme serves the characteristics of mobile terminal applications, is flexibly customizable, manages picture caches in a unified way, resolves the memory occupation and traffic problems that have long plagued mobile terminal applications, and improves the user experience.
In this embodiment, pictures are divided into two kinds: local pictures and network pictures. To manage the caches of local pictures and network pictures in a unified way and adjust cache strategies flexibly, this embodiment applies a different cache strategy to each of the two kinds.
Referring to Fig. 4, the picture cache method provided by this embodiment includes the following steps.

Step S410: receive a picture access request of an application program.

Step S420: judge whether the to-be-accessed picture is a local picture. If so, execute steps S430a-S470a; otherwise execute steps S430b-S470b.
Steps S430a-S470a are the specific strategy of the local-picture caching scheme. In this scheme, a local-picture memory cache pool is created in the memory cache, and three dictionary objects are maintained in the pool: a local-picture dictionary, an access-time dictionary, and a lookup-success-count dictionary. After step S410 is executed and the to-be-accessed picture is judged to be a local picture, steps S430a-S470a are executed.

Step S430a: search the local-picture memory cache pool for the to-be-accessed picture. If it is present, execute step S450a; otherwise execute step S440a.
Step S440a: load the to-be-accessed picture from the local disk into the local-picture memory cache pool for storage. The load mode can be the second mode described in the background section. Then execute step S450a.

Step S450a: update the access time and lookup-success count of the to-be-accessed picture in the local-picture memory cache pool.
Step S460a: judge whether the amount of data stored in the local-picture memory cache pool has reached a first set threshold (such as 5M). If so, execute step S470a; otherwise execute step S480.

Step S470a: eliminate the corresponding picture objects from the local-picture memory cache pool according to a first strategy, until the amount of data stored in the pool is below a first safety value (such as 1M). Then execute step S480.
The first strategy is: preferentially eliminate from the local-picture memory cache pool those picture objects whose lookup-success counts in the local-picture dictionary are low and whose access times are earliest. For the specific elimination method, see the description of clearing data from the cache in embodiments one and two of the present invention, which is not repeated here.

After a memory warning prompt is received or the application program exits foreground mode, all picture objects in the local-picture memory cache pool are emptied.
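The first strategy — evict fewest-hit, earliest-accessed pictures until the pool drops below the safety value — can be sketched as follows. This is an illustrative Python model; the function name, the dictionary layout, and the example sizes are assumptions, not the patented iOS implementation.

```python
def evict_local_pool(pool, threshold_bytes, safety_bytes):
    """First elimination strategy (a sketch): once the pool's total size
    reaches the first set threshold, remove entries with the fewest
    lookup successes (ties broken by earliest access time) until the
    total is below the first safety value.

    `pool` maps key -> {"size": ..., "hits": ..., "last_access": ...}."""
    total = sum(e["size"] for e in pool.values())
    if total < threshold_bytes:
        return []  # threshold not reached; nothing to do
    evicted = []
    # Candidates ordered fewest-hits first, then earliest last access.
    for key in sorted(pool, key=lambda k: (pool[k]["hits"], pool[k]["last_access"])):
        if total < safety_bytes:
            break
        total -= pool[key]["size"]
        evicted.append(key)
        del pool[key]
    return evicted
```

With a 5M threshold and 1M safety value, a 5.5M pool sheds its coldest entries first and keeps the most-hit picture, mirroring steps S460a-S470a.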
Steps S430b-S470b are the specific strategy of the network-picture caching scheme. In this scheme, a network-picture memory cache pool is created in the memory cache, and one network-picture dictionary object is maintained in the pool; meanwhile, a network-picture disk cache pool is created in the application program's disk cache. After step S410 is executed and the to-be-accessed picture is judged not to be a local picture, the picture is determined to be a network picture and steps S430b-S470b are executed.
Step S430b: search the network-picture memory cache pool for the to-be-accessed picture. If it is present, execute step S480; otherwise execute step S440b.

Step S440b: search the network-picture disk cache pool for the to-be-accessed picture. If it is present, execute step S480; otherwise execute step S450b.

Step S450b: load the to-be-accessed picture from the network, and store it in both the network-picture memory cache pool and the disk cache pool.
Step S460b: judge whether the amount of data stored in the network-picture memory cache pool has reached a second set threshold (such as 10M). If so, execute step S470b; otherwise execute step S480.

Step S470b: eliminate the corresponding picture objects from the network-picture memory cache pool according to a second strategy, until the amount of data stored in the pool is below a second safety value (such as 2M). Then execute step S480.
The second strategy is: eliminate all picture objects in the network-picture memory cache pool. Preferably, the network-picture disk cache pool can be emptied periodically (such as every three days).

After a memory warning prompt is received or the application program exits foreground mode, all picture objects in the network-picture memory cache pool are emptied.
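The second strategy and the memory-warning handling can be sketched as follows. Again this is an illustrative Python model under assumed names; discarding the whole memory pool is safe here because the disk pool still holds the network pictures, so no re-download is forced.

```python
def evict_network_pool(memory_pool, threshold_bytes):
    """Second elimination strategy (a sketch): once the network-picture
    memory cache pool reaches its threshold, every cached object in it
    is discarded in one step. The disk pool is untouched and is only
    emptied periodically. `memory_pool` maps key -> size in bytes."""
    total = sum(memory_pool.values())
    if total >= threshold_bytes:
        memory_pool.clear()
        return True
    return False

def on_memory_warning(local_pool, network_pool):
    """On a memory warning prompt, or when the application program exits
    foreground mode, both memory cache pools are emptied."""
    local_pool.clear()
    network_pool.clear()
```

Clearing everything is a deliberately simpler policy than the local-picture strategy: network pictures are recoverable from the disk pool, so selective retention buys little.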
Step S480: return the to-be-accessed picture for the application program to access, and end.
The technical solution provided by this embodiment has the following advantages:

1. The caches of local pictures and network pictures are managed simultaneously, and cache strategies can be adjusted flexibly.

2. For local pictures, a memory cache of local pictures is maintained, which greatly shortens local-picture access time, and the corresponding cache objects can be eliminated at different moments to reduce memory occupation.

3. For network pictures, a memory cache and a disk cache of network pictures are maintained, which greatly shortens network-picture access time and saves traffic, and the corresponding cache objects can be eliminated at different moments to reduce memory occupation.

4. Different elimination strategies are designed for different picture types. The elimination strategy of the local-picture cache considers access time and hit count, preferentially eliminating the caches that are least used and least hit.
Example IV
Fig. 5 is a structural schematic diagram of a data cache device provided by embodiment four of the present invention. Referring to Fig. 5, the device is structured as follows:

a data caching and access unit 510, configured to load the to-be-accessed data of an application program into a cache and access the data;

a data retention unit 520, configured to retain the data in the cache when access to the data ends; and

a data clearing unit 530, configured to monitor an attribute of the cache and/or an attribute of the data, and to clear data from the cache according to a set rule.
In one specific implementation of this embodiment, the data caching and access unit 510 is specifically configured to:

if a data access request of the application program is received, search for the to-be-accessed data in the memory cache of the local system; and

if the search fails, load the to-be-accessed data from the local disk into the memory cache for storage, where it is accessed by the application program.

In another specific implementation of this embodiment, the data caching and access unit 510 is specifically configured to:

if a data access request of the application program is received, search for the to-be-accessed data in the memory cache of the local system;

if that search fails, search for the to-be-accessed data in the disk cache of the application program; and

if that search also fails, load the to-be-accessed data from the network into the memory cache and the disk cache for storage, where it is accessed by the application program.
Illustratively, the data clearing unit 530 is specifically configured to: monitor the amount of data stored in the cache, and if it reaches a set threshold, clear data from the cache based on the set rule.

Illustratively, on the basis of the above technical solution, the device provided by this embodiment further includes:

an updating unit 540, configured to update the access time and lookup-success count of the to-be-accessed data when the data caching and access unit 510 finds the to-be-accessed data in the cache.

The data clearing unit 530 is then specifically configured to: clear data from the cache according to the lookup-success count and the access time.

Illustratively, the data clearing unit 530 is specifically configured to: clear the data stored in the cache at a set period.

Illustratively, the data clearing unit 530 is further configured to: empty the data stored in the cache after a memory warning prompt is received or the application program exits foreground mode.
On the basis of the above technical solution, the to-be-accessed data is picture data, audio data, or video data.

The above product can execute the method provided by any embodiment of the present invention and has the function modules and advantageous effects corresponding to executing that method.
Note that the above are only preferred embodiments of the present invention and the technical principles applied. Those skilled in the art will appreciate that the present invention is not limited to the specific embodiments described here; various obvious changes, readjustments, and substitutions can be made by those skilled in the art without departing from the protection scope of the present invention. Therefore, although the present invention has been described in further detail through the above embodiments, it is not limited to them and may include other, equivalent embodiments without departing from the inventive concept; its scope is determined by the scope of the appended claims.
Claims (13)
1. A data caching method, comprising:
loading data to be accessed by an application program into a cache, and accessing the data;
when access to the data ends, retaining the data in the cache;
monitoring an attribute of the cache and/or an attribute of the data, and clearing data in the cache according to
a set rule;
wherein the set rule comprises: for local pictures, evicting from the local-picture memory cache pool those
pictures in the local picture dictionary that have a low lookup success count and an early access time, until the
amount of data stored in the local-picture memory cache pool is below a first safety value; and for network
pictures, evicting all picture objects in the network-picture memory cache pool until the amount of data stored in
the network-picture memory cache pool is below a second safety value, the second safety value being greater than
the first safety value.
2. The method according to claim 1, wherein loading the data to be accessed into the cache and accessing the data
comprises:
if a data access request of the application program is obtained, searching for the data to be accessed in a memory
cache of the local system;
if the lookup fails, loading the data to be accessed from a local disk into the memory cache for storage, and
accessing the data through the application program.
3. The method according to claim 1, wherein loading the data to be accessed into the cache and accessing the data
comprises:
if a data access request of the application program is obtained, searching for the data to be accessed in a memory
cache of the local system;
if the lookup fails, searching for the data to be accessed in a disk cache of the application program;
if the lookup fails again, loading the data to be accessed from the network into the memory cache and the disk
cache for storage, and accessing the data through the application program.
4. The method according to claim 1, wherein monitoring the attribute of the cache and clearing data in the cache
according to the set rule comprises:
monitoring the amount of data stored in the cache, and if the amount of storage reaches a set threshold, clearing
data in the cache according to the set rule.
5. The method according to any one of claims 1-4, further comprising:
if the data to be accessed is found in the cache, updating the access time and the lookup success count of the
data to be accessed;
wherein clearing data in the cache according to the set rule comprises:
clearing data in the cache according to lookup success counts and access times.
6. The method according to any one of claims 1-4, wherein clearing data in the cache according to the set rule
comprises:
clearing the data stored in the cache at a set interval.
7. The method according to any one of claims 1-4, further comprising:
emptying the data stored in the cache after a memory warning is received or the application program exits
foreground mode.
8. The method according to any one of claims 1-4, wherein the data to be accessed is picture data, audio data, or
video data.
9. A data caching device, comprising:
a data caching and access unit, configured to load data to be accessed by an application program into a cache and
access the data;
a data retention unit, configured to retain the data in the cache when access to the data ends;
a data clearing unit, configured to monitor an attribute of the cache and/or an attribute of the data, and to
clear data in the cache according to a set rule;
wherein the set rule comprises: for local pictures, evicting from the local-picture memory cache pool those
pictures in the local picture dictionary that have a low lookup success count and an early access time, until the
amount of data stored in the local-picture memory cache pool is below a first safety value; and for network
pictures, evicting all picture objects in the network-picture memory cache pool until the amount of data stored in
the network-picture memory cache pool is below a second safety value, the second safety value being greater than
the first safety value.
10. The device according to claim 9, wherein the data caching and access unit is specifically configured to:
if a data access request of the application program is obtained, search for the data to be accessed in a memory
cache of the local system;
if the lookup fails, load the data to be accessed from a local disk into the memory cache for storage, and access
the data through the application program.
11. The device according to claim 9, wherein the data caching and access unit is specifically configured to:
if a data access request of the application program is obtained, search for the data to be accessed in a memory
cache of the local system;
if the lookup fails, search for the data to be accessed in a disk cache of the application program;
if the lookup fails again, load the data to be accessed from the network into the memory cache and the disk cache
for storage, and access the data through the application program.
12. The device according to claim 9, wherein the data clearing unit is specifically configured to: monitor the
amount of data stored in the cache, and if the amount of storage reaches a set threshold, clear data in the cache
according to the set rule.
13. The device according to any one of claims 9-12, further comprising:
an updating unit, configured to, if the data caching and access unit finds the data to be accessed in the cache,
update the access time and the lookup success count of the data to be accessed;
wherein the data clearing unit is specifically configured to:
clear data in the cache according to lookup success counts and access times.
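The set rule recited in claims 1 and 9, a tightly bounded local-picture pool (first safety value) and a larger network-picture pool (second safety value, greater than the first), can be sketched as follows; the concrete budget values and all names are illustrative assumptions, since the claims fix no numbers:

```python
FIRST_SAFETY = 8    # illustrative size budget for the local-picture pool
SECOND_SAFETY = 16  # the second safety value is greater than the first

def shrink_local_pool(pool):
    """Evict local pictures - fewest lookup successes, earliest access
    first - until the pool is at or below the first safety value."""
    order = sorted(pool, key=lambda p: (p["hits"], p["last_access"]))
    while sum(p["size"] for p in order) > FIRST_SAFETY:
        order.pop(0)
    return order

def shrink_network_pool(pool):
    """Evict network picture objects until the pool is at or below
    the second safety value."""
    remaining = list(pool)
    while remaining and sum(p["size"] for p in remaining) > SECOND_SAFETY:
        remaining.pop(0)
    return remaining
```

Giving network pictures the larger budget reflects that refetching them is costlier (a network round trip) than reloading a local picture from disk.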
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510223929.5A CN104808952B (en) | 2015-05-05 | 2015-05-05 | data cache method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN104808952A CN104808952A (en) | 2015-07-29 |
CN104808952B true CN104808952B (en) | 2018-09-18 |
Family
ID=53693814
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510223929.5A Active CN104808952B (en) | 2015-05-05 | 2015-05-05 | data cache method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN104808952B (en) |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105354088B (en) * | 2015-09-29 | 2018-11-27 | 广州酷狗计算机科技有限公司 | Message delet method and device |
CN105279267A (en) * | 2015-10-23 | 2016-01-27 | 广州视睿电子科技有限公司 | Data caching method and device |
CN105335520B (en) * | 2015-11-24 | 2018-11-16 | 交控科技股份有限公司 | A kind of data processing method and processor based on Subway Integrated Automatic System |
CN105589926A (en) * | 2015-11-27 | 2016-05-18 | 深圳市美贝壳科技有限公司 | Method for clearing cache files of mobile terminal in real time |
CN105513005B (en) * | 2015-12-02 | 2019-01-29 | 魅族科技(中国)有限公司 | A kind of method and terminal of memory management |
CN105786723A (en) * | 2016-03-14 | 2016-07-20 | 深圳创维-Rgb电子有限公司 | Application cache management method and device based on linked list |
CN106569894A (en) * | 2016-10-11 | 2017-04-19 | 北京元心科技有限公司 | Picture loading method and system |
CN106951550B (en) * | 2017-03-27 | 2020-06-05 | Oppo广东移动通信有限公司 | Data processing method and device and mobile terminal |
CN107122247B (en) * | 2017-04-27 | 2021-11-02 | 腾讯科技(深圳)有限公司 | Method and device for detecting static occupied picture |
CN110018912A (en) * | 2018-01-10 | 2019-07-16 | 武汉斗鱼网络科技有限公司 | Data cache method, storage medium, equipment and the system for having informing function |
CN108733489A (en) * | 2018-05-11 | 2018-11-02 | 五八同城信息技术有限公司 | Data processing method, device, electronic equipment and storage medium |
CN111159240A (en) * | 2020-01-03 | 2020-05-15 | 中国船舶重工集团公司第七0七研究所 | Efficient data caching processing method based on electronic chart |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101833512A (en) * | 2010-04-22 | 2010-09-15 | 中兴通讯股份有限公司 | Method and device thereof for reclaiming memory |
CN102368258A (en) * | 2011-09-30 | 2012-03-07 | 广州市动景计算机科技有限公司 | Webpage page caching management method and system |
CN103281397A (en) * | 2013-06-13 | 2013-09-04 | 苏州联讯达软件有限公司 | Data-caching method and system based on timestamps and access density |
CN103631616A (en) * | 2013-08-28 | 2014-03-12 | 广州品唯软件有限公司 | Method and system for fast loading and caching of picture |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104808952B (en) | data cache method and device | |
US9779127B2 (en) | Integrating database management system and external cache | |
US9201810B2 (en) | Memory page eviction priority in mobile computing devices | |
US20150317246A1 (en) | Memory Reclamation Method and Apparatus | |
CN105760199B (en) | A kind of application resource loading method and its equipment | |
CN107943718B (en) | Method and device for cleaning cache file | |
CN108363813A (en) | Date storage method, device and system | |
EP3860043A2 (en) | Method and apparatus for implementing smart contract based on blockchain | |
CN110209341B (en) | Data writing method and device and storage equipment | |
CN106649654A (en) | Data updating method and device | |
CN109408149A (en) | Starting method, apparatus, equipment and the storage medium of application program | |
CN108196972A (en) | A kind of restorative procedure of application software, device, terminal and storage medium | |
CN106844236A (en) | The date storage method and device of terminal device | |
CN108205559B (en) | Data management method and equipment thereof | |
CN115344610A (en) | Two-level cache data acquisition method and device | |
CN112286559A (en) | Upgrading method and device for vehicle-mounted intelligent terminal | |
US9857864B1 (en) | Systems and methods for reducing power consumption in a memory architecture | |
KR20150139017A (en) | Apparatus and method for controlling memory | |
CN113127430A (en) | Mirror image information processing method and device, computer readable medium and electronic equipment | |
CN103279562B (en) | A kind of method, device and database storage system for database L2 cache | |
CN110704091A (en) | Firmware upgrading method and device | |
CN102202129B (en) | Method for loading mobile phone operating system | |
CN115237826A (en) | Automatic application cache cleaning method and related equipment | |
CN114490432A (en) | Memory processing method and device, electronic equipment and computer readable storage medium | |
US11494697B2 (en) | Method of selecting a machine learning model for performance prediction based on versioning information |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
EXSB | Decision made by sipo to initiate substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | |