CN106021445B - Method and device for loading cached data - Google Patents
Method and device for loading cached data
- Publication number
- CN106021445B (application CN201610324104.7A)
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/95—Retrieval from the web
- G06F16/957—Browsing optimisation, e.g. caching or content distillation
- G06F16/9574—Browsing optimisation, e.g. caching or content distillation of access to content, e.g. by caching
Abstract
A method and device for loading cached data, comprising: scanning a service access layer to identify interfaces that carry preset automatic-load information; loading from a database, according to preset cache parameter information, the data corresponding to the identified interfaces carrying the automatic-load information, and caching that data; and, when the data of an accessed service corresponds to an interface carrying the automatic-load information, reading the cached data to complete the service access. The cache parameter information comprises at least one of: preset data distinguishing identification information, caching condition information, and an expiration time. By caching data according to automatic-load information and cache parameter information, the embodiments of the present invention avoid the system avalanche caused by a large number of concurrent operations when a cache entry expires; further, by sorting the cached data and applying caching judgment processing, they improve the utilization of cached data and the efficiency of reading it.
Description
Technical field
The present document relates to, but is not limited to, data processing technology, and in particular to a method and device for loading cached data.
Background technique
With the continuous expansion of the Internet and the steady growth of its user base, new requirements are being placed on e-commerce websites. When a website faces millions of user visits, the response speed of the system directly affects the experience of the users accessing it. One of the key techniques for accelerating website access is caching. Many large websites make extensive use of in-memory caching technologies such as Redis (an open-source, network-capable, memory-based key-value database with optional log-based persistence, written in ANSI C and providing application programming interfaces (APIs) for many languages) and Memcached (a high-performance distributed in-memory object caching system used by dynamic Web applications to reduce database load). At the same time, however, in-memory caching is limited by the physical memory of the system. When the virtual memory feature is enabled and memory is exhausted, the caching technology moves infrequently used data to disk; because this swapping policy is decided by the caching technology itself, handing what should be a business-layer decision to the cache, some flexibility in data caching is lost. When the virtual memory feature is disabled, the caching technology falls back on the operating system's virtual memory, and the performance of cache-dependent website services declines sharply. In-memory caching can also cap the usable physical memory through configuration options; once the memory upper-limit threshold is reached, write commands are answered with errors (while read commands continue to be served). If a large number of concurrent operations arrive at that moment, they bypass the cache and access the data in the data layer directly, causing a system avalanche. In addition, in the related art, the cache hit rate of cached data is not high.
In summary, existing in-memory caching technologies suffer from system avalanche caused by a large number of concurrent operations when the cache fails.
Summary of the invention
The following is an overview of the subject matter described in detail herein. This overview is not intended to limit the scope of protection of the claims.
Embodiments of the present invention provide a method and device for loading cached data, which can avoid the system avalanche caused by a large number of concurrent operations when a cache entry expires.
An embodiment of the present invention provides a device for loading cached data, comprising a recognition unit, a cache unit, and a reading unit, wherein:
the recognition unit is configured to scan a service access layer and identify interfaces that carry preset automatic-load information;
the cache unit is configured to load from a database, according to preset cache parameter information, the data corresponding to the identified interfaces carrying the automatic-load information, and to cache the data;
the reading unit is configured to, when the data of an accessed service corresponds to an interface carrying the automatic-load information, read the cached data to complete the service access;
the cache parameter information comprises at least one of: preset data distinguishing identification information, caching condition information, and an expiration time;
the data distinguishing identification information is generated by combining interface parameters and/or by converting interface parameters with a preset method function.
Optionally, the cache unit is further configured to: when the data of the accessed service corresponds to an interface carrying the automatic-load information, if that data cannot be read from the cache, load the data corresponding to the accessed service from the database through the interface carrying the automatic-load information and cache it.
Optionally, the device further comprises an updating unit, configured to: when data is updated, if the updated data belongs to an interface carrying the automatic-load information, update the corresponding data held in the cache.
Optionally, the device further comprises a caching processing unit, configured to establish a message queue containing the data state information of all cached data;
read, according to a preset takeout strategy, the data state information of one or more pieces of data from the established message queue; and perform caching judgment processing on the data according to the data state information read;
the data state information is collected in advance and includes at least one of:
the time of the previous request for the cached data, and/or,
the number of times the cached data has been loaded and the time taken by each load of the cached data, and/or,
the number of times the cached data has been accessed within a preset duration.
Optionally, the caching processing unit is specifically configured to:
establish a message queue containing the data state information of all cached data;
read, according to a preset takeout strategy, the data state information of one or more pieces of data from the established message queue;
when the data state information read includes the time of the previous request for the cached data, and the difference between the current system time and that previous request time is greater than a preset request-interval threshold, determine that the cached data is non-hotspot cached data, and delete the data determined to be non-hotspot cached data from the cache; and/or,
when the data state information read includes the number of loads of the cached data and the time taken by each load of the cached data, if the number of loads of the cached data is greater than a preset load-count threshold and/or the load time of the cached data is greater than a preset load-time threshold, determine that the cached data is hotspot but time-controllable data, and delete the data determined to be hotspot but time-controllable data from the cache; and/or,
when the data state information read includes the number of times the cached data has been accessed within a preset duration, if the number of accesses to the cached data within the preset duration is less than a preset access-count threshold, determine that the data is non-hotspot cached data, and delete the data determined to be non-hotspot cached data from the cache.
Optionally, the cache unit is further configured to:
when the cache parameter information includes an expiration time, obtain the expiration time of the cached data and the processing duration of loading the data from the database and caching it, and subtract the processing duration from the expiration time to obtain an advance processing time;
at the obtained advance processing time, if no service is carrying out a load of the data, load from the database the updated data corresponding to the cached data that will expire when the expiration time arrives, and cache it.
Optionally, the device further comprises a sorting unit, configured to sort the cached data according to the expiration time, and/or the load duration, and/or the request frequency of the data.
In another aspect, an embodiment of the present invention further provides a method for loading cached data, comprising:
scanning a service access layer to identify interfaces that carry preset automatic-load information;
loading from a database, according to preset cache parameter information, the data corresponding to the identified interfaces carrying the automatic-load information, and caching the data;
when the data of an accessed service corresponds to an interface carrying the automatic-load information, reading the cached data to complete the service access;
the cache parameter information comprises at least one of: preset data distinguishing identification information, caching condition information, and an expiration time;
the data distinguishing identification information is generated by combining interface parameters and/or by converting interface parameters with a preset method function.
Optionally, when the data of the accessed service corresponds to an interface carrying the automatic-load information, if that data cannot be read from the cache, the method further comprises: loading the data corresponding to the accessed service from the database through the interface carrying the automatic-load information and caching it.
Optionally, the method further comprises: when data is updated, if the updated data belongs to an interface carrying the automatic-load information, updating the corresponding data held in the cache.
Optionally, the method further comprises:
establishing a message queue containing the data state information of all cached data;
reading, according to a preset takeout strategy, the data state information of one or more pieces of data from the established message queue, and performing caching judgment processing on the data according to the data state information read;
the data state information is collected in advance and includes at least one of:
the time of the previous request for the cached data, and/or,
the number of times the cached data has been loaded and the time taken by each load of the cached data, and/or,
the number of times the cached data has been accessed within a preset duration.
Optionally, the caching judgment processing of the data comprises:
when the data state information includes the time of the previous request for the cached data, and the difference between the current system time and that previous request time is greater than a preset request-interval threshold, determining that the cached data is non-hotspot cached data, and deleting the data determined to be non-hotspot cached data from the cache; and/or,
when the data state information includes the number of loads of the cached data and the time taken by each load of the cached data, if the number of loads of the cached data is greater than a preset load-count threshold and/or the load time of the cached data is greater than a preset load-time threshold, determining that the cached data is hotspot but time-controllable data, and deleting the data determined to be hotspot but time-controllable data from the cache; and/or,
when the data state information includes the number of times the cached data has been accessed within a preset duration, if that number is less than a preset access-count threshold, determining that the data is non-hotspot cached data, and deleting the data determined to be non-hotspot cached data from the cache.
Optionally, when the cache parameter information includes an expiration time, the method further comprises:
obtaining the expiration time of the cached data and the processing duration of loading the data from the database and caching it, and subtracting the processing duration from the expiration time to obtain an advance processing time;
at the obtained advance processing time, if no service is carrying out a load of the data, loading from the database the updated data corresponding to the cached data that will expire when the expiration time arrives, and caching it.
Optionally, the method further comprises: sorting the cached data according to the expiration time, and/or the load duration, and/or the request frequency of the data.
Compared with the related art, the technical solution of the present invention comprises: scanning a service access layer to identify interfaces that carry preset automatic-load information; loading from a database, according to preset cache parameter information, the data corresponding to the identified interfaces carrying the automatic-load information, and caching the data; and, when the data of an accessed service corresponds to an interface carrying the automatic-load information, reading the cached data to complete the service access; the cache parameter information comprising at least one of: preset data distinguishing identification information, caching condition information, and an expiration time. By caching data according to automatic-load information and cache parameter information, the embodiments of the present invention avoid the system avalanche caused by a large number of concurrent operations when a cache entry expires; further, by sorting the cached data and applying caching judgment processing, they improve the utilization of cached data and the efficiency of reading it.
Other aspects will become apparent upon reading and understanding the accompanying drawings and the detailed description.
Brief description of the drawings
Fig. 1 is a block diagram of the basic electrical structure of a server according to an embodiment of the present invention;
Fig. 2 is a flowchart of a method for loading cached data according to an embodiment of the present invention;
Fig. 3 is a flowchart of a method for loading cached data according to another embodiment of the present invention;
Fig. 4 is a structural block diagram of a device for loading cached data according to an embodiment of the present invention;
Fig. 5 is a flowchart of the method of application example one of the present invention;
Fig. 6 is a flowchart of the method of application example two of the present invention.
Specific embodiments
To make the objectives, technical solutions, and advantages of the present invention clearer, the embodiments of the present invention are described in detail below with reference to the accompanying drawings. It should be noted that, provided there is no conflict, the embodiments in this application and the features in those embodiments may be combined with one another in any way.
In the following description, suffixes such as "module", "component", or "unit" are used to denote elements only to facilitate the description of the invention and carry no specific meaning of their own; "module" and "component" may therefore be used interchangeably.
As shown in Fig. 1, the block diagram of the basic electrical structure of the server of an embodiment of the present invention comprises: an input/output (IO) bus, a processor 40, a memory 41, an internal memory 42, and a communication device 43, wherein:
the input/output (IO) bus is connected to the other components of its server (the processor 40, the memory 41, the internal memory 42, and the communication device 43) and provides transmission lines for them;
the processor 40 generally controls the overall operation of its server; for example, the processor 40 performs operations such as calculation and confirmation, and may be a central processing unit (CPU);
the communication device 43 generally includes one or more components that allow radio communication between its server and a wireless communication system or network;
the memory 41 stores software code that is readable and executable by the processor 40 and contains instructions for controlling the processor 40 to perform the functions described herein (i.e., software execution functions).
Based on the above electrical structure of the server, the method embodiments of the present invention are set forth.
Fig. 2 is a flowchart of a method for loading cached data according to an embodiment of the present invention. As shown in Fig. 2, the method comprises:
Step 200: scan a service access layer and identify interfaces that carry preset automatic-load information;
It should be noted that the automatic-load information may be an automatic-load tag, a mark, or other similar information capable of distinguishing such interfaces from other interfaces. The automatic-load information may be configured by persons skilled in the art according to an analysis of whether the data needs to be loaded automatically. In addition, scanning the service layer is a conventional operation performed on the service access layer.
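Purely as an illustrative sketch (the patent does not prescribe any concrete mechanism), the automatic-load information and the scan of the service access layer could be modeled in Python with a decorator that tags interfaces and a scan function that collects the tagged ones; every name below (`auto_load`, `AUTO_LOAD_REGISTRY`, `get_product_detail`) is hypothetical:

```python
# Hypothetical sketch: tag service-access-layer interfaces with
# "automatic-load information" and scan for the tagged ones.
AUTO_LOAD_REGISTRY = []

def auto_load(expire_seconds=None, cache_condition=None):
    """Decorator that attaches automatic-load information to an interface."""
    def wrap(fn):
        fn._auto_load = {"expire": expire_seconds, "condition": cache_condition}
        AUTO_LOAD_REGISTRY.append(fn)
        return fn
    return wrap

@auto_load(expire_seconds=300)
def get_product_detail(product_id):
    # An interface of the service access layer; the body is a stand-in.
    return {"id": product_id}

def scan_service_access_layer():
    """Step 200: identify interfaces carrying automatic-load information."""
    return [fn for fn in AUTO_LOAD_REGISTRY if hasattr(fn, "_auto_load")]
```

With this shape, the cache loader of step 201 would iterate over the scan result and pre-load each tagged interface's data.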
Step 201: load from a database, according to preset cache parameter information, the data corresponding to the identified interfaces carrying the automatic-load information, and cache the data;
Here, the cache parameter information comprises at least one of: preset data distinguishing identification information, caching condition information, and an expiration time; wherein the data distinguishing identification information is generated by combining interface parameters and/or by converting interface parameters with a preset method function.
Step 202: when the data of an accessed service corresponds to an interface carrying the automatic-load information, read the cached data to complete the service access;
It should be noted that the preset method function may be Hash, the MD5 message-digest algorithm, or any other function that can generate a unique identifier, code, or name. When the data distinguishing identification information is generated by combining interface parameters, the parameters of an interface are merged according to a preset sorting-and-merging rule so that each piece of cached data receives its own distinguishing identifier; the combined identifier may resemble the call numbers that a library uses to shelve books. Generating the identifier by combining interface parameters, or by converting them with a preset method function, gives the identifiers a consistent naming rule and helps operators recognize entries when analyzing the cached data. Embodiments of the present invention may also generate the data distinguishing identification information in other ways, provided that every identifier generated is unique.
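A minimal sketch of the two generation modes, under the assumption (made here for illustration only) that the interface name plus its sorted parameters forms the combined identifier; the patent itself only requires that each identifier be unique:

```python
import hashlib

def cache_key_by_combination(interface, **params):
    """Combine interface parameters under a fixed sort-and-merge rule."""
    merged = "&".join(f"{k}={params[k]}" for k in sorted(params))
    return f"{interface}?{merged}"

def cache_key_by_digest(interface, **params):
    """Convert the combined parameters with a preset method function (MD5 here)."""
    combined = cache_key_by_combination(interface, **params)
    return hashlib.md5(combined.encode()).hexdigest()
```

The combination form stays human-readable, which matches the note above about helping operators recognize entries; the digest form yields fixed-length keys.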
Optionally, when the data of the accessed service corresponds to an interface carrying the automatic-load information, if that data cannot be read from the cache, the method of this embodiment further comprises: loading the data corresponding to the accessed service from the database through the interface carrying the automatic-load information, and caching it.
It should be noted that failing to read the data of the accessed service includes the case where an error or failure occurs while reading the data from the cache; such read errors or failures can be detected with any related-art method for judging whether a data read succeeded.
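This fallback behavior can be sketched as a cache-aside read, assuming a dict-like cache and a hypothetical `loader` callable standing in for the interface carrying the automatic-load information:

```python
def read_with_fallback(cache, database, key, loader):
    """Serve from the cache; on a miss or read failure, load the data
    from the database and write the cache before returning."""
    try:
        value = cache.get(key)
    except Exception:          # a failed read is treated like a miss
        value = None
    if value is None:
        value = loader(database, key)   # load through the auto-load interface
        cache[key] = value              # cache the data for later accesses
    return value
```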
Optionally, the method of this embodiment further comprises:
when data is updated, if the updated data belongs to an interface carrying the automatic-load information, updating the corresponding data held in the cache.
It should be noted that a data update includes an update to service data, and an update to service data can be detected from the system parameters involved, for example by examining configuration logs, system logs, running logs, and other files that record data changes. In addition, the data-updating method may comprise: determining, with the related art, the storage path of the original data corresponding to the updated data; deleting the data on the determined storage path; and then loading the updated service data and writing it into the cache.
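A delete-then-reload sketch of this update path, with dict-like stand-ins for the cache and the database (all names hypothetical):

```python
def propagate_update(cache, database, key, new_value):
    """Drop the stale cache entry, persist the new value, then write the
    refreshed value back into the cache (delete, then load and cache)."""
    cache.pop(key, None)        # delete the data on the original storage path
    database[key] = new_value   # the service data itself is updated
    cache[key] = database[key]  # reload the updated data into the cache
    return cache[key]
```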
Optionally, the method of this embodiment further comprises:
establishing a message queue containing the data state information of all cached data;
reading, according to a preset takeout strategy, the data state information of one or more pieces of data from the established message queue, and performing caching judgment processing on the data according to the data state information read; here, the preset takeout strategy includes reading the data state information piece by piece in the order of the message queue.
The data state information is collected in advance and includes at least one of:
the time of the previous request for the cached data, and/or,
the number of times the cached data has been loaded and the time taken by each load of the cached data, and/or,
the number of times the cached data has been accessed within a preset duration.
It should be noted that the contents of the data state information can be chosen with reference to factors such as the system cache traffic and the service access speed; persons skilled in the art may add or delete items according to the usage scenarios of the embodiments of the present invention.
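One possible shape for such a message queue, sketched with a FIFO deque and a piece-by-piece takeout strategy; the record fields shown are examples only:

```python
from collections import deque

def build_state_queue(cache_stats):
    """Build a FIFO message queue holding one data-state record per cached entry."""
    return deque(cache_stats)

def take_out(queue, batch=1):
    """Preset takeout strategy: pop records one by one in queue order."""
    return [queue.popleft() for _ in range(min(batch, len(queue)))]
```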
Optionally, the caching judgment processing of the data comprises:
when the data state information includes the time of the previous request for the cached data, and the difference between the current system time and that previous request time is greater than a preset request-interval threshold, determining that the cached data is non-hotspot cached data, and deleting the data determined to be non-hotspot cached data from the cache; and/or,
when the data state information includes the number of loads of the cached data and the time taken by each load of the cached data, if the number of loads of the cached data is greater than a preset load-count threshold and/or the load time of the cached data is greater than a preset load-time threshold, determining that the cached data is hotspot but time-controllable data, and deleting the data determined to be hotspot but time-controllable data from the cache; and/or,
when the data state information includes the number of times the cached data has been accessed within a preset duration, if that number is less than a preset access-count threshold, determining that the data is non-hotspot cached data, and deleting the data determined to be non-hotspot cached data from the cache.
It should be noted that the time taken by each load of the cached data can be calculated from the total load time of the data and the number of loads. In addition, judging that the access count of cached data within the preset duration is below the preset access-count threshold may comprise: recording the time of the first access to the cached data, subtracting that time from the current system time to obtain the access duration, and, once the access duration reaches the preset duration, counting the number of accesses to the cached data. For example, the accesses within one hour of the first access are counted; if the access count is below the access-count threshold, for example 60, the cached data is determined to be non-hotspot cached data and is deleted from the cache. If timing does not start from the first access to the cached data, the access count of the cached data is instead tallied per preset duration.
In addition, in the embodiments of the present invention, reading the cached data state information from the message queue can be performed by multiple tasks in parallel, which improves the speed of the caching judgment processing.
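The three judgment rules above can be condensed into a single hypothetical function; the threshold values are illustrative, not taken from the patent:

```python
def judge_entry(state, now, interval_threshold=600.0,
                load_count_threshold=50, load_time_threshold=2.0,
                access_count_threshold=60):
    """Apply the three caching-judgment rules; return 'evict' or 'keep'."""
    # Rule 1: stale previous request -> non-hotspot cached data
    if "last_request" in state and now - state["last_request"] > interval_threshold:
        return "evict"
    # Rule 2: loaded too often or too slowly -> hotspot but time-controllable
    if state.get("load_count", 0) > load_count_threshold or \
       state.get("load_time", 0.0) > load_time_threshold:
        return "evict"
    # Rule 3: too few accesses within the preset duration -> non-hotspot
    if "access_count" in state and state["access_count"] < access_count_threshold:
        return "evict"
    return "keep"
```

Each parallel judgment task would call `judge_entry` on the state records it takes out of the message queue and delete the entries judged `evict`.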
Optionally, when the cache parameter information includes an expiration time, the method of this embodiment further comprises:
obtaining the expiration time of the cached data and the processing duration of loading the data from the database and caching it, and subtracting the processing duration from the expiration time to obtain an advance processing time;
at the obtained advance processing time, if no service is carrying out a load of the data, loading from the database the updated data corresponding to the cached data that will expire when the expiration time arrives, and caching it.
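The advance-processing computation can be sketched as follows; the idea is that the refresh starts `load_duration` before expiry so that the reload completes just as the old entry expires, and concurrent requests never find the cache empty. The helper names and the `loading` set used to detect an in-progress service load are assumptions:

```python
def advance_processing_time(expire_at, load_duration):
    """Start the refresh load_duration seconds before the expiration time."""
    return expire_at - load_duration

def refresh_if_due(cache, database, key, now, expire_at, load_duration, loading):
    """At the advance processing time, reload the expiring entry unless a
    service load for it is already in progress."""
    if now >= advance_processing_time(expire_at, load_duration) and key not in loading:
        cache[key] = database[key]   # updated data replaces the expiring entry
        return True
    return False
```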
Optionally, the method of this embodiment further comprises:
sorting the cached data according to the expiration time, and/or the load duration, and/or the request frequency of the data.
It should be noted that the sorting method is determined by persons skilled in the art through analysis of the system requirements. If the system requires fast request responses, data with a high request frequency can be cached at the front of the order for efficient reading; if a long load duration would delay the reading of other services' data, data with a long load duration can be cached toward the back of the order so that the reading of the other cached data is not affected.
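One illustrative sorting policy consistent with the note above — high request frequency toward the front, long load duration toward the back, earlier expiration breaking ties (the patent leaves the exact ordering to the implementer):

```python
def sort_cached_entries(entries):
    """Sort cache entries: front of the order for high request frequency,
    back for long load duration, with earlier expiration breaking ties."""
    return sorted(entries,
                  key=lambda e: (-e["request_freq"], e["load_duration"], e["expire_at"]))
```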
By caching data according to automatic-load information and cache parameter information, the method of this embodiment avoids the system avalanche caused by a large number of concurrent operations when a cache entry expires; further, by sorting the cached data and applying caching judgment processing, it improves the utilization of cached data and the efficiency of reading it.
Fig. 3 is a flowchart of a method for loading cached data according to another embodiment of the present invention. As shown in Fig. 3, the method comprises:
Step 300: scan a service access layer and identify interfaces that carry preset automatic-load information;
It should be noted that the automatic-load information may be an automatic-load tag, a mark, or other similar information capable of distinguishing such interfaces from other interfaces. The automatic-load information may be configured by persons skilled in the art according to an analysis of whether the data needs to be loaded automatically. In addition, scanning the service layer is a conventional operation performed on the service access layer.
Step 301: load from a database, according to preset cache parameter information, the data corresponding to the identified interfaces carrying the automatic-load information, and cache the data;
Here, the cache parameter information comprises at least one of: preset data distinguishing identification information, caching condition information, and an expiration time; wherein the data distinguishing identification information is generated by combining interface parameters and/or by converting interface parameters with a preset method function.
Step 302: establish a message queue containing the data state information of all cached data; read, according to a preset takeout strategy, the data state information of one or more pieces of data from the established message queue; and perform caching judgment processing on the data according to the data state information read;
Here, the data state information is collected in advance and includes at least one of:
the time of the previous request for the cached data, and/or,
the number of times the cached data has been loaded and the time taken by each load of the cached data, and/or,
the number of times the cached data has been accessed within a preset duration.
It should be noted that the contents of the data state information can be chosen with reference to factors such as the system cache traffic and the service access speed; persons skilled in the art may add or delete items according to the usage scenarios of the embodiments of the present invention.
Optionally, the caching judgment processing of the data comprises:
when the data state information includes the time of the previous request for the cached data, and the difference between the current system time and that previous request time is greater than a preset request-interval threshold, determining that the cached data is non-hotspot cached data, and deleting the data determined to be non-hotspot cached data from the cache; and/or,
when the data state information includes the number of loads of the cached data and the time taken by each load of the cached data, if the number of loads of the cached data is greater than a preset load-count threshold and/or the load time of the cached data is greater than a preset load-time threshold, determining that the cached data is hotspot but time-controllable data, and deleting the data determined to be hotspot but time-controllable data from the cache; and/or,
when the data state information includes the number of times the cached data has been accessed within a preset duration, if that number is less than a preset access-count threshold, determining that the data is non-hotspot cached data, and deleting the data determined to be non-hotspot cached data from the cache.
It should be noted that the time taken to load the cached data each time can be calculated from the total time spent loading the cached data and the number of loads. In addition, judging whether the number of times the cached data is accessed within the preset duration is less than the preset access count threshold may include: recording the time at which the cached data is first accessed, subtracting the first-access time from the current system time to obtain the access duration, and counting the accesses to the cached data once the access duration reaches the preset duration, for example counting the accesses within one hour of the first access. If the access count is less than the access count threshold, for example a threshold of 60, the cached data is determined to be non-hotspot cached data and the data determined to be non-hotspot cached data is removed from the cache. If timing does not start from the time the cached data is first accessed, the access count of the cached data is instead counted per preset duration.
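The two calculations in the note above (per-load time from a running total, and counting accesses inside a window anchored at the first access) can be sketched as follows; the function names and argument shapes are assumptions made for illustration.

```python
def average_load_time(total_load_time, load_count):
    """Per-load time derived from the total loading time and the
    number of loads, as the note above describes."""
    return total_load_time / load_count if load_count else 0.0

def accesses_in_window(first_access_time, access_times, window):
    """Count accesses that fall within `window` seconds of the
    first access to the cached data."""
    return sum(1 for t in access_times
               if 0 <= t - first_access_time <= window)
```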
In addition, in the embodiments of the present invention, the cached data state information can be read from the message queue by multiple tasks in parallel, which improves the processing speed of the caching judgment.
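The multi-task parallel reading just mentioned might, assuming an in-process queue and a thread pool (one possible realization among many), look like this sketch:

```python
import queue
from concurrent.futures import ThreadPoolExecutor

def drain_queue_in_parallel(state_queue, handler, workers=4):
    """Read state-info items off the queue with several tasks in
    parallel and apply the caching-judgment `handler` to each."""
    results = []

    def worker():
        out = []
        while True:
            try:
                item = state_queue.get_nowait()
            except queue.Empty:
                return out  # queue drained; this task is done
            out.append(handler(item))

    with ThreadPoolExecutor(max_workers=workers) as pool:
        for partial in pool.map(lambda _: worker(), range(workers)):
            results.extend(partial)
    return results
```

A production system would more likely consume from a persistent message queue, but the division of work among parallel consumers is the same idea.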
Step 303: sort the cached data according to the expiry time, and/or the load duration, and/or the request frequency of the data.
It should be noted that the sorting method is analyzed and determined by those skilled in the art according to system requirements. If the system requires fast request responses, data with a high request frequency can be cached at the front of the sorted order for efficient reading; if a long load duration would affect the reading of other business data, data with a longer load duration can be cached at the back of the sorted order to avoid affecting the reading of other cached data.
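One possible sort key matching the note above (frequent data first, slow-loading and soon-expiring data last) can be sketched as follows; the field names and the exact tie-breaking order are assumptions, since the description leaves the policy to the implementer.

```python
def sort_cache_entries(entries, now):
    """Order cached entries so that frequently requested data sits at
    the front and slow-to-load data sits toward the back (illustrative
    policy; real systems tune the key to their requirements)."""
    return sorted(
        entries,
        key=lambda e: (
            -e["request_frequency"],      # high frequency first
            e["load_duration"],           # then cheaper loads first
            e["expire_time"] - now,       # then sooner-expiring last among equals
        ),
    )
```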
Step 304: when the data of the accessed business is data corresponding to an interface that includes the automatic load information, read the cached data to complete the business access processing.
It should be noted that the preset method function may include hash functions, Message Digest Algorithm 5 (MD5), and other functions that can generate a unique identifier, code, or name. When interface parameters are combined, the data distinguishing identifier information may be generated by sorting and merging the parameters of an interface according to a set rule, producing a distinct identifier for each item of cached data; the combined information can resemble the serial numbers a library uses to catalogue books. Combining the interface parameters and then transforming the combination with the preset method function gives the generated data distinguishing identifier information a consistent naming rule, which helps users recognize the identifiers when analyzing the cached data. Embodiments of the present invention may also generate the data distinguishing identifier information in other ways, as long as each generated data distinguishing identifier is unique.
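As a concrete sketch of the scheme just described, interface parameters can be merged under a fixed sort rule and then passed through MD5 (one of the method functions the text names); the merge format and function name here are illustrative assumptions.

```python
import hashlib

def make_identifier(interface_name, params):
    """Merge interface parameters under a fixed sort rule, then hash
    the merged string with MD5 to obtain a unique, uniform-length
    data distinguishing identifier."""
    merged = interface_name + "?" + "&".join(
        f"{k}={params[k]}" for k in sorted(params)  # fixed sort rule
    )
    return hashlib.md5(merged.encode("utf-8")).hexdigest()
```

Sorting the parameters before merging makes the identifier independent of the order in which callers supply them, so the same interface call always maps to the same cache key.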
Step 305: when the data of the accessed business is data corresponding to an interface that includes the automatic load information, if the data of the accessed business cannot be read when reading the cached data, load from the database and cache the data of the interface that includes the automatic load information corresponding to the data of the accessed business.
It should be noted that failing to read the data of the accessed business here includes: an error or failure occurring when reading the data of the accessed business from the cache according to the embodiments of the present invention; read errors or failures can be detected by related-art methods for judging whether a data read succeeded.
Optionally, when the cached parameter information includes an expiry time, the method of the embodiments of the present invention further includes:
obtaining the expiry time of the cached data and the handling duration for loading the data from the database and caching it, and subtracting the handling duration from the expiry time to obtain an advance processing time;
when the advance processing time is reached, if no business data load is in progress, loading from the database and caching the updated data corresponding to the cached data that is about to expire, so that the refreshed copy is cached when the expiry time arrives.
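The advance-processing-time arithmetic above amounts to a simple subtraction plus a busy check; this sketch names the functions and arguments itself (they are not from the source) to make the timing relationship explicit.

```python
def advance_processing_time(expire_time, handling_duration):
    """Expiry time minus the measured load-and-cache duration: the
    moment a refresh must start so the new copy is ready exactly
    when the old copy expires."""
    return expire_time - handling_duration

def should_refresh(now, expire_time, handling_duration, busy):
    """Start the early refresh only once the advance processing time
    is reached and no other business data load is in progress."""
    return now >= advance_processing_time(expire_time, handling_duration) and not busy
```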
Optionally, the method of the embodiments of the present invention further includes:
when data is updated, if the updated data is data on an interface that includes the automatic load information, updating the data contained in the cache that corresponds to the updated business.
It should be noted that a data update here includes an update of business data, and whether business data has been updated can be determined from the system parameters involved when the system performs the update, for example by judging from configuration logs, system logs, running logs, and other file contents that record data changes. In addition, the data update method may include: determining, using the related art, the store path of the original data corresponding to the updated data, deleting the data on the determined store path of the original data, and then loading the updated business data and writing it into the cache.
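The delete-then-reload operation just described can be sketched as below, with the cache modeled as a plain dictionary; the loader callback and key shape are assumptions for the example.

```python
def update_cached_data(cache, key, load_from_database):
    """Delete the stale copy at its original location, then reload the
    updated business data from the database and write it back into
    the cache."""
    cache.pop(key, None)                  # delete data at the original store path
    cache[key] = load_from_database(key)  # reload the updated data and re-cache it
    return cache[key]
```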
Fig. 4 is a structural block diagram of a device for loading cached data according to an embodiment of the present invention. As shown in Fig. 4, the device comprises: a recognition unit, a cache unit, and a reading unit; wherein
the recognition unit is configured to scan the service access layer and identify interfaces that include preset automatic load information;
the cache unit is configured to load, from the database according to preset cached parameter information, the data corresponding to the identified interfaces that include the automatic load information, and cache it.
Optionally, the cache unit is further configured to, when the data of the accessed business is data corresponding to an interface that includes the automatic load information, if the data of the accessed business cannot be read when reading the cached data, load from the database and cache the data of the interface that includes the automatic load information corresponding to the data of the accessed business.
Here, failing to read the data of the accessed business when reading the cached data includes: an error or failure occurring when reading the data of the accessed business from the cache according to the embodiments of the present invention; read errors or failures can be detected by related-art methods for judging whether a data read succeeded.
Optionally, the cache unit is further configured to:
when the cached parameter information includes an expiry time, obtain the expiry time of the cached data and the handling duration for loading the data from the database and caching it, and subtract the handling duration from the expiry time to obtain an advance processing time;
when the advance processing time is reached, if no business data load is in progress, load from the database and cache the updated data corresponding to the cached data that is about to expire, so that the refreshed copy is cached when the expiry time arrives.
The reading unit is configured to, when the data of the accessed business is data corresponding to an interface that includes the automatic load information, read the cached data to complete the business access processing.
The cached parameter information includes: data distinguishing identifier information set by default, and/or caching condition information, and/or an expiry time.
The data distinguishing identifier information includes information generated by combining interface parameters and/or by transforming interface parameters with a preset method function.
It should be noted that the automatic load information can be an automatic load label, a mark, or similar information that distinguishes such an interface from other interfaces. The automatic load information can be configured according to an analysis, by those skilled in the art, of whether the data needs to be loaded automatically. In addition, the preset method function may include hash functions, Message Digest Algorithm 5 (MD5), and other functions that can generate a unique identifier, code, or name. When interface parameters are combined, the data distinguishing identifier information may be generated by sorting and merging the parameters of an interface according to a set rule, producing a distinct identifier for each item of cached data; the combined information can resemble the serial numbers a library uses to catalogue books. Combining the interface parameters and then transforming the combination with the preset method function gives the generated data distinguishing identifier information a consistent naming rule, which helps users recognize the identifiers when analyzing the cached data. Embodiments of the present invention may also generate the data distinguishing identifier information in other ways, as long as each generated data distinguishing identifier is unique.
Optionally, the device of the embodiments of the present invention further includes an updating unit,
the updating unit being configured to, when data is updated, if the updated data is data on an interface that includes the automatic load information, update the data contained in the cache that corresponds to the updated business.
It should be noted that a data update here includes an update of business data, and whether business data has been updated can be determined from the system parameters involved when the system performs the update, for example by judging from configuration logs, system logs, running logs, and other file contents that record data changes. In addition, the data update method may include: determining, using the related art, the store path of the original data corresponding to the updated data, deleting the data on the determined store path of the original data, and then loading the updated business data and writing it into the cache.
Optionally, the device of the embodiments of the present invention further includes a caching process unit, configured to establish a message queue containing the data state information of all cached data; and
to read, according to a preset take-out strategy, the data state information of one or more items of data from the established message queue, and perform the caching judgment on the data according to the read data state information;
the data state information being collected in advance and including at least one of:
the previous request time of the cached data; and/or
the load count of the cached data and the time taken to load the cached data each time; and/or
the number of times the cached data has been accessed within a preset duration.
It should be noted that the content of the data state information can be chosen with reference to the system cache traffic, the business access speed, and similar factors; those skilled in the art can add or delete items according to the usage scenario of the embodiments of the present invention.
Optionally, the caching process unit is specifically configured to:
establish a message queue containing the data state information of all cached data;
read, according to a preset take-out strategy, the data state information of one or more items of data from the established message queue;
when the read data state information includes the previous request time of the cached data, and the difference between the current system time and the previous request time of the cached data is greater than a preset request interval threshold, determine that the cached data is non-hotspot cached data, and remove the data determined to be non-hotspot cached data from the cache; and/or,
when the read data state information includes the load count of the cached data and the time taken to load the cached data each time, if the load count of the cached data is greater than a preset load count threshold and/or the load time of the cached data is greater than a preset load time threshold, determine that the cached data is hotspot but time-controllable data, and remove the data determined to be hotspot but time-controllable from the cache; and/or,
when the read data state information includes the number of times the cached data has been accessed within a preset duration, and that number is less than a preset access count threshold, determine that the data is non-hotspot cached data, and remove the data determined to be non-hotspot cached data from the cache.
It should be noted that the time taken to load the cached data each time can be calculated from the total time spent loading the cached data and the number of loads. In addition, judging whether the number of times the cached data is accessed within the preset duration is less than the preset access count threshold may include: recording the time at which the cached data is first accessed, subtracting the first-access time from the current system time to obtain the access duration, and counting the accesses to the cached data once the access duration reaches the preset duration, for example counting the accesses within one hour of the first access. If the access count is less than the access count threshold, for example a threshold of 60, the cached data is determined to be non-hotspot cached data and the data determined to be non-hotspot cached data is removed from the cache. If timing does not start from the time the cached data is first accessed, the access count of the cached data is instead counted per preset duration.
In addition, in the embodiments of the present invention, the cached data state information can be read from the message queue by multiple tasks in parallel, which improves the processing speed of the caching judgment.
The device of the embodiments of the present invention further includes a sequencing unit,
the sequencing unit being configured to sort the cached data according to the expiry time, and/or the load duration, and/or the request frequency of the data.
It should be noted that the sorting method is analyzed and determined by those skilled in the art according to system requirements. If the system requires fast request responses, data with a high request frequency can be cached at the front of the sorted order for efficient reading; if a long load duration would affect the reading of other business data, data with a longer load duration can be cached at the back of the sorted order to avoid affecting the reading of other cached data.
The device of the embodiments of the present invention can be set to work on a server, or can work after establishing a communication connection with a server.
Another embodiment of the present invention provides a device for loading cached data, comprising: a recognition unit, a cache unit, a reading unit, an updating unit, a caching process unit, and a sequencing unit; wherein
the recognition unit is configured to scan the service access layer and identify interfaces that include preset automatic load information;
the cache unit is configured to load, from the database according to preset cached parameter information, the data corresponding to the identified interfaces that include the automatic load information, and cache it;
the reading unit is configured to, when the data of the accessed business is data corresponding to an interface that includes the automatic load information, read the cached data to complete the business access processing;
the cached parameter information includes: data distinguishing identifier information set by default, and/or caching condition information, and/or an expiry time;
the data distinguishing identifier information includes information generated by combining interface parameters and/or by transforming interface parameters with a preset method function.
It should be noted that the automatic load information can be an automatic load label, a mark, or similar information that distinguishes such an interface from other interfaces. The automatic load information can be configured according to an analysis, by those skilled in the art, of whether the data needs to be loaded automatically. In addition, the preset method function may include hash functions, Message Digest Algorithm 5 (MD5), and other functions that can generate a unique identifier, code, or name. When interface parameters are combined, the data distinguishing identifier information may be generated by sorting and merging the parameters of an interface according to a set rule, producing a distinct identifier for each item of cached data; the combined information can resemble the serial numbers a library uses to catalogue books. Combining the interface parameters and then transforming the combination with the preset method function gives the generated data distinguishing identifier information a consistent naming rule, which helps users recognize the identifiers when analyzing the cached data. Embodiments of the present invention may also generate the data distinguishing identifier information in other ways, as long as each generated data distinguishing identifier is unique.
The updating unit is configured to, when data is updated, if the updated data is data on an interface that includes the automatic load information, update the data contained in the cache that corresponds to the updated business.
It should be noted that a data update here includes an update of business data, and whether business data has been updated can be determined from the system parameters involved when the system performs the update, for example by judging from configuration logs, system logs, running logs, and other file contents that record data changes. In addition, the data update method may include: determining, using the related art, the store path of the original data corresponding to the updated data, deleting the data on the determined store path of the original data, and then loading the updated business data and writing it into the cache.
The caching process unit is configured to establish a message queue containing the data state information of all cached data; and
to read, according to a preset take-out strategy, the data state information of one or more items of data from the established message queue, and perform the caching judgment on the data according to the read data state information;
the data state information being collected in advance and including at least one of:
the previous request time of the cached data; and/or
the load count of the cached data and the time taken to load the cached data each time; and/or
the number of times the cached data has been accessed within a preset duration.
It should be noted that the content of the data state information can be chosen with reference to the system cache traffic, the business access speed, and similar factors; those skilled in the art can add or delete items according to the usage scenario of the embodiments of the present invention.
Optionally, the caching process unit is specifically configured to:
establish a message queue containing the data state information of all cached data;
read, according to a preset take-out strategy, the data state information of one or more items of data from the established message queue;
when the read data state information includes the previous request time of the cached data, and the difference between the current system time and the previous request time of the cached data is greater than a preset request interval threshold, determine that the cached data is non-hotspot cached data, and remove the data determined to be non-hotspot cached data from the cache; and/or,
when the read data state information includes the load count of the cached data and the time taken to load the cached data each time, if the load count of the cached data is greater than a preset load count threshold and/or the load time of the cached data is greater than a preset load time threshold, determine that the cached data is hotspot but time-controllable data, and remove the data determined to be hotspot but time-controllable from the cache; and/or,
when the read data state information includes the number of times the cached data has been accessed within a preset duration, and that number is less than a preset access count threshold, determine that the data is non-hotspot cached data, and remove the data determined to be non-hotspot cached data from the cache.
It should be noted that the time taken to load the cached data each time can be calculated from the total time spent loading the cached data and the number of loads. In addition, judging whether the number of times the cached data is accessed within the preset duration is less than the preset access count threshold may include: recording the time at which the cached data is first accessed, subtracting the first-access time from the current system time to obtain the access duration, and counting the accesses to the cached data once the access duration reaches the preset duration, for example counting the accesses within one hour of the first access. If the access count is less than the access count threshold, for example a threshold of 60, the cached data is determined to be non-hotspot cached data and the data determined to be non-hotspot cached data is removed from the cache. If timing does not start from the time the cached data is first accessed, the access count of the cached data is instead counted per preset duration.
In addition, in the embodiments of the present invention, the cached data state information can be read from the message queue by multiple tasks in parallel, which improves the processing speed of the caching judgment.
The sequencing unit is configured to sort the cached data according to the expiry time, and/or the load duration, and/or the request frequency of the data.
It should be noted that the sorting method is analyzed and determined by those skilled in the art according to system requirements. If the system requires fast request responses, data with a high request frequency can be cached at the front of the sorted order for efficient reading; if a long load duration would affect the reading of other business data, data with a longer load duration can be cached at the back of the sorted order to avoid affecting the reading of other cached data.
Optionally, the cache unit is further configured to, when the data of the accessed business is data corresponding to an interface that includes the automatic load information, if the data of the accessed business cannot be read when reading the cached data, load from the database and cache the data of the interface that includes the automatic load information corresponding to the data of the accessed business.
Here, failing to read the data of the accessed business when reading the cached data includes: an error or failure occurring when reading the data of the accessed business from the cache according to the embodiments of the present invention; read errors or failures can be detected by related-art methods for judging whether a data read succeeded.
Optionally, the cache unit is further configured to:
when the cached parameter information includes an expiry time, obtain the expiry time of the cached data and the handling duration for loading the data from the database and caching it, and subtract the handling duration from the expiry time to obtain an advance processing time;
when the advance processing time is reached, if no business data load is in progress, load from the database and cache the updated data corresponding to the cached data that is about to expire, so that the refreshed copy is cached when the expiry time arrives.
The device of the embodiments of the present invention can be set to work on a server, or can work after establishing a communication connection with a server.
The method of the present invention is described in detail below through application examples, which are used only to illustrate embodiments of the present invention and are not intended to limit the protection scope of the present invention.
Application example 1
Fig. 5 is a flowchart of the method of application example 1 of the present invention. As shown in Fig. 5, the method comprises:
Step 501: load, from the database according to preset cached parameter information, the data corresponding to the identified interfaces that include the automatic load information, and cache it; here, the cached data can be written into a caching center.
In this application example, the cached parameter information includes: data distinguishing identifier information set by default, and/or caching condition information, and/or an expiry time; wherein the data distinguishing identifier information includes information generated by combining interface parameters and/or by transforming interface parameters with a preset method function.
Step 502: when the accessed business data is data corresponding to an interface that includes the automatic load information, judge whether the cached data can be read; if the cached data is read, execute step 5030; if the cached data cannot be read, execute step 5040.
Step 5040: load from the database and cache the data of the interface that includes the automatic load information corresponding to the data of the accessed business. After step 5040 is executed, this application example can complete the business access processing according to the cached data, that is, execute step 5030.
After the caching of the data is completed, this application example can return the result set of the data.
Processing the business and updating the data may include modifications made to product details by back-stage management, for example configuration parameter modifications, product type modifications, and the like.
It should be noted that when the cache is updated, the updated data can be synchronously written into the database for a corresponding update according to related-art processing methods.
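Steps 501 to 5040 of application example 1 amount to a read-through cache flow; the sketch below models the cache as a dictionary and the database as a callback, both assumptions made for illustration.

```python
def access_business_data(key, cache, load_from_database):
    """Steps 502-5040 as a read-through pattern: try the cache first;
    on a miss, load from the database, cache the result, then serve it."""
    value = cache.get(key)
    if value is None:                    # step 5040: cache could not be read
        value = load_from_database(key)  # load from the database
        cache[key] = value               # and cache the result
    return value                         # step 5030: serve from the cache
```

On the second access to the same key the database is no longer touched, which is the point of pre-caching data on interfaces marked with the automatic load information.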
Application example 2
This application example first needs to establish a message queue containing the data state information of all cached data. The data state information is collected in advance and comprises: the previous request time of the cached data, the load count of the cached data and the time taken to load the cached data each time, and/or the number of times the cached data has been accessed within a preset duration. This application example is illustrated by taking as an example a process that each time reads the first item of data state information in the message queue. Fig. 6 is a flowchart of the method of application example 2 of the present invention. As shown in Fig. 6, the method comprises:
Step 600: read the first item of data state information of the current queue from the message queue.
Step 601: obtain the previous request time of the cached data from the data state information, and read the current system time.
It should be noted that information such as the request time and the current system time can be obtained using related-art acquisition methods.
Step 602: when the difference between the current system time and the previous request time of the cached data is greater than a preset request interval threshold, determine that the cached data is non-hotspot cached data.
The request interval threshold can be set through analysis of parameters such as the business access timeliness requirements and the system performance.
Step 603: remove the data determined to be non-hotspot cached data from the cache.
Step 604: obtain, from the data state information, the number of times the cached data has been accessed within the preset duration.
Step 605: when the number of times the cached data has been accessed within the preset duration is less than a preset access count threshold, determine that the data is non-hotspot cached data, and handle the non-hotspot cached data according to step 603.
In this application example, the preset duration can be 1 hour and the access count threshold 60.
Step 606: obtain, from the data state information, the load count of the cached data and the time taken to load the cached data each time.
Step 607: if the load count of the cached data is greater than a preset load count threshold and/or the load time of the cached data is greater than a preset load time threshold, determine that the cached data is hotspot but time-controllable data.
In this application example, the load count threshold can be 100 and the load time threshold may range from 10 to 100 milliseconds; the actual values can be determined according to the number of requests.
Step 608: remove the data determined to be hotspot but time-controllable from the cache.
It should be noted that when the above steps are executed, the data state information is updated according to the content of the executed steps.
This application example further includes: sorting the cached data according to the expiry time, and/or the load duration, and/or the request frequency of the data.
It should be noted that the sorting process can be carried out by starting a separate process that sorts the message queue according to a set strategy. The sorting algorithm may include: placing cached data further forward the closer it is to its expiry time and/or the more time-consuming it is to load, and performing a reverse sort by request count: the more requests an item receives, the higher its frequency of use and the greater the possibility of it causing concurrent load.
Those of ordinary skill in the art will appreciate that all or some of the steps in the above method can be completed by a program instructing the related hardware (such as a processor), and the program can be stored in a computer-readable storage medium, such as a read-only memory, a magnetic disk, or an optical disc. Optionally, all or some of the steps of the above embodiments can also be implemented using one or more integrated circuits. Correspondingly, each module/unit in the above embodiments can be implemented in the form of hardware, for example by an integrated circuit realizing its corresponding function, or in the form of a software function module, for example by a processor executing a program/instruction stored in a memory to realize its corresponding function. The present invention is not limited to any particular combination of hardware and software.
Although the embodiments disclosed herein are as above, the content is only an embodiment adopted to facilitate understanding of the present invention and is not intended to limit the present invention. Any person skilled in the art to which the present invention pertains may make modifications and variations in the form and details of implementation without departing from the spirit and scope disclosed by the present invention, but the patent protection scope of the present invention shall still be subject to the scope defined by the appended claims.
Claims (9)
1. An apparatus for loading cached data, comprising a recognition unit, a cache unit, and a reading unit; wherein,
the recognition unit is configured to scan a service access layer and identify interfaces that contain preset automatic-load information, wherein the automatic-load information is used to determine whether data are to be loaded automatically;
the cache unit is configured to load from a database, according to preset cache parameter information, the data corresponding to the identified interfaces containing the automatic-load information, and to cache the loaded data;
the reading unit is configured to read the cached data when the data of an accessed service are the data corresponding to an interface containing the automatic-load information, thereby accomplishing the service access processing;
the cache parameter information includes an expiry time;
the cache unit is further configured to:
when the cache parameter information includes an expiry time, obtain the expiry time of the cached data, obtain the processing duration of loading the data from the database and caching them, and subtract the processing duration from the expiry time to obtain an advance processing time; and
at the obtained advance processing time, if no service data load is being performed, load from the database, and cache, the updated data corresponding to the cached data whose expiry time arrives.
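The advance-refresh computation recited in claim 1 can be sketched as follows; `FakeDb`, the field names, and the timing values are illustrative assumptions under this sketch, not part of the claimed apparatus:

```python
import time

class FakeDb:
    """Stand-in for the database; a real implementation is outside the claim."""
    def load(self):
        time.sleep(0.01)  # simulate a slow database read
        return "row"

def compute_advance_processing_time(expiry_time, db, cache):
    # Load the data from the database and cache them, measuring the
    # processing duration of the load-and-cache step.
    start = time.monotonic()
    cache["data"] = db.load()
    processing_duration = time.monotonic() - start
    # Expiry time minus processing duration = advance processing time:
    # starting a refresh at this moment lets the updated data be ready
    # just as the old data expire.
    return expiry_time - processing_duration

cache = {}
refresh_at = compute_advance_processing_time(100.0, FakeDb(), cache)
print(refresh_at < 100.0)  # the refresh moment precedes the expiry time
```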
2. The apparatus according to claim 1, wherein the cache unit is further configured to: when the data of the accessed service are the data corresponding to an interface containing the automatic-load information, if the data of the accessed service are not found when reading the cached data, load from the database the data corresponding to the interface, containing the automatic-load information, of the accessed service, and cache the loaded data.
3. The apparatus according to claim 1, further comprising an updating unit, wherein the updating unit is configured to: when data are updated, if the updated data are the data corresponding to an interface containing the automatic-load information, update the corresponding data of the updated service contained in the cache.
4. The apparatus according to any one of claims 1 to 3, further comprising a cache processing unit configured to:
establish a message queue containing the data state information of all cached data;
read, according to a preset retrieval strategy, the data state information of one or more data items from the established message queue, and perform caching judgment processing on the data according to the read data state information;
wherein the data state information is collected in advance and includes at least one of:
the previous request time of the cached data; and/or
the number of times the cached data have been loaded and the time taken by each load of the cached data; and/or
the number of times the cached data have been accessed within a preset duration.
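The message queue of state records described in claim 4 might be sketched as follows; the queue layout, record fields, and retrieval strategy are assumptions chosen for illustration:

```python
from collections import deque

state_queue = deque()  # message queue holding one state record per cached item

def enqueue_state(key, last_request_time, load_count, load_times, access_count):
    # The record mirrors the three kinds of state information in the claim:
    # previous request time, load count with per-load times, and access count.
    state_queue.append({
        "key": key,
        "last_request_time": last_request_time,
        "load_count": load_count,
        "load_times": load_times,
        "access_count": access_count,
    })

def take(n=1):
    # A simple preset retrieval strategy: read up to n records in FIFO order.
    return [state_queue.popleft() for _ in range(min(n, len(state_queue)))]

enqueue_state("user:1", 1000.0, 2, [0.05, 0.06], 17)
enqueue_state("user:2", 900.0, 9, [0.30], 1)
print([r["key"] for r in take(2)])  # → ['user:1', 'user:2']
```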
5. The apparatus according to claim 4, wherein the cache processing unit is specifically configured to:
establish a message queue containing the data state information of all cached data;
read, according to a preset retrieval strategy, the data state information of one or more data items from the established message queue;
when the read data state information includes the previous request time of the cached data, and the difference between the current system time and the previous request time of the cached data is greater than a preset request-interval threshold, determine that the cached data are non-hot cached data, and remove the data determined to be non-hot cached data from the cache; and/or
when the read data state information includes the number of times the cached data have been loaded and the time taken by each load of the cached data, if the number of loads of the cached data is greater than a preset load-count threshold and/or the load time of the cached data is greater than a preset load-time threshold, determine that the cached data are hot but time-controllable data, and remove the data determined to be hot but time-controllable from the cache; and/or
when the read data state information includes the number of times the cached data have been accessed within a preset duration, if the number of accesses of the cached data within the preset duration is less than a preset access-count threshold, determine that the data are non-hot cached data, and remove the data determined to be non-hot cached data from the cache.
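The three and/or eviction branches of claim 5 can be sketched as a single judgment function; all threshold values and field names below are invented for illustration and are not taken from the patent:

```python
def judge(record, now, *, request_interval=60.0, load_count_limit=5,
          load_time_limit=0.2, access_count_min=10):
    # Branch 1: previous request time too long ago -> non-hot cached data.
    if now - record["last_request_time"] > request_interval:
        return "evict: non-hot"
    # Branch 2: loaded too many times or too slowly -> hot but time-controllable.
    if (record["load_count"] > load_count_limit
            or max(record["load_times"]) > load_time_limit):
        return "evict: hot but time-controllable"
    # Branch 3: too few accesses within the preset duration -> non-hot.
    if record["access_count"] < access_count_min:
        return "evict: non-hot"
    return "keep"

rec = {"last_request_time": 100.0, "load_count": 2,
       "load_times": [0.05], "access_count": 50}
print(judge(rec, now=120.0))  # within all thresholds → 'keep'
```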
6. The apparatus according to any one of claims 1 to 3, further comprising a sorting unit, wherein the sorting unit is configured to sort the cached data according to the expiry time, and/or the load duration, and/or the data request frequency.
7. A method for loading cached data, comprising:
scanning a service access layer and identifying interfaces that contain preset automatic-load information;
loading from a database, according to preset cache parameter information, the data corresponding to the identified interfaces containing the automatic-load information, and caching the loaded data, wherein the automatic-load information is used to determine whether data are to be loaded automatically;
when the data of an accessed service are the data corresponding to an interface containing the automatic-load information, reading the cached data, thereby accomplishing the service access processing;
wherein the cache parameter information includes an expiry time;
obtaining the expiry time of the cached data, obtaining the processing duration of loading the data from the database and caching them, and subtracting the processing duration from the expiry time to obtain an advance processing time; and
at the obtained advance processing time, if no service data load is being performed, loading from the database, and caching, the updated data corresponding to the cached data whose expiry time arrives.
8. The method according to claim 7, further comprising:
establishing a message queue containing the data state information of all cached data;
reading, according to a preset retrieval strategy, the data state information of one or more data items from the established message queue, and performing caching judgment processing on the data according to the read data state information;
wherein the data state information is collected in advance and includes at least one of:
the previous request time of the cached data; and/or
the number of times the cached data have been loaded and the time taken by each load of the cached data; and/or
the number of times the cached data have been accessed within a preset duration.
9. The method according to claim 8, wherein performing the caching judgment processing on the data comprises:
when the data state information includes the previous request time of the cached data, and the difference between the current system time and the previous request time of the cached data is greater than a preset request-interval threshold, determining that the cached data are non-hot cached data, and removing the data determined to be non-hot cached data from the cache; and/or
when the data state information includes the number of times the cached data have been loaded and the time taken by each load of the cached data, if the number of loads of the cached data is greater than a preset load-count threshold and/or the load time of the cached data is greater than a preset load-time threshold, determining that the cached data are hot but time-controllable data, and removing the data determined to be hot but time-controllable from the cache; and/or
when the data state information includes the number of times the cached data have been accessed within a preset duration, if the number of accesses of the cached data within the preset duration is less than a preset access-count threshold, determining that the data are non-hot cached data, and removing the data determined to be non-hot cached data from the cache.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610324104.7A CN106021445B (en) | 2016-05-16 | 2016-05-16 | A method and apparatus for loading cached data |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106021445A CN106021445A (en) | 2016-10-12 |
CN106021445B true CN106021445B (en) | 2019-10-15 |
Family
ID=57097977
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610324104.7A Active CN106021445B (en) | 2016-05-16 | 2016-05-16 | It is a kind of to load data cached method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106021445B (en) |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106815287A (en) * | 2016-12-06 | 2017-06-09 | 中国银联股份有限公司 | A kind of buffer memory management method and device |
CN106843769B (en) * | 2017-01-23 | 2019-08-02 | 北京齐尔布莱特科技有限公司 | A kind of interface data caching method, device and calculate equipment |
CN106874124B (en) * | 2017-03-30 | 2023-04-14 | 光一科技股份有限公司 | SQLite rapid loading technology-based object-oriented electricity utilization information acquisition terminal |
CN107463598A (en) * | 2017-06-09 | 2017-12-12 | 中国邮政储蓄银行股份有限公司 | Distributed cache system |
CN107665235B (en) * | 2017-07-27 | 2020-06-30 | 深圳壹账通智能科技有限公司 | Cache processing method and device, computer equipment and storage medium |
CN108829743A (en) * | 2018-05-24 | 2018-11-16 | 平安科技(深圳)有限公司 | Data cached update method, device, computer equipment and storage medium |
CN110555744A (en) * | 2018-05-31 | 2019-12-10 | 阿里巴巴集团控股有限公司 | Service data processing method and system |
CN110895474A (en) * | 2018-08-24 | 2020-03-20 | 深圳市鸿合创新信息技术有限责任公司 | Terminal micro-service device and method and electronic equipment |
CN109597915B (en) * | 2018-09-18 | 2022-03-01 | 北京微播视界科技有限公司 | Access request processing method and device |
CN109471875B (en) * | 2018-09-25 | 2021-08-20 | 网宿科技股份有限公司 | Hot degree management method based on cache data, server and storage medium |
CN110109956B (en) * | 2019-03-21 | 2021-10-01 | 福建天泉教育科技有限公司 | Method and terminal for preventing cache from penetrating |
CN111984889A (en) * | 2020-02-21 | 2020-11-24 | 广东三维家信息科技有限公司 | Caching method and system |
CN111736769B (en) * | 2020-06-05 | 2022-07-26 | 苏州浪潮智能科技有限公司 | Method, device and medium for diluting cache space |
CN112115074A (en) * | 2020-09-02 | 2020-12-22 | 紫光云(南京)数字技术有限公司 | Method for realizing data resident memory by using automatic loading mechanism |
CN112559572A (en) * | 2020-12-22 | 2021-03-26 | 上海悦易网络信息技术有限公司 | Method and equipment for preheating data cache of Key-Value cache system |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102170479A (en) * | 2011-05-21 | 2011-08-31 | 成都市华为赛门铁克科技有限公司 | Updating method of Web buffer and updating device of Web buffer |
CN103488581A (en) * | 2013-09-04 | 2014-01-01 | 用友软件股份有限公司 | Data caching system and data caching method |
WO2014123127A1 (en) * | 2013-02-06 | 2014-08-14 | Square Enix Holdings Co., Ltd. | Image processing apparatus, method of controlling the same, program and storage medium |
CN105302493A (en) * | 2015-11-19 | 2016-02-03 | 浪潮(北京)电子信息产业有限公司 | Swap-in and swap-out control method and system for SSD cache in mixed storage array |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||