CN112367402A - Intelligent cache strategy storage method, device and equipment for real-time data application - Google Patents


Publication number
CN112367402A
CN112367402A
Authority
CN
China
Prior art keywords: cache, strategy, policy, current, candidate
Legal status: Pending
Application number
CN202011271600.3A
Other languages
Chinese (zh)
Inventor
朱承高
徐林
Current Assignee: Jinan Huaxin Suangu Information Technology Co ltd
Original Assignee
Jinan Huaxin Suangu Information Technology Co ltd
Application filed by Jinan Huaxin Suangu Information Technology Co ltd
Priority to CN202011271600.3A
Publication of CN112367402A

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00: Network arrangements or protocols for supporting network services or applications
    • H04L 67/50: Network services
    • H04L 67/56: Provisioning of proxy services
    • H04L 67/568: Storing data temporarily at an intermediate stage, e.g. caching
    • H04L 67/5682: Policies or rules for updating, deleting or replacing the stored data

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Memory System Of A Hierarchy Structure (AREA)

Abstract

The invention discloses an intelligent cache policy storage method, device and equipment for real-time data applications, relating to the field of storage. The method comprises the following steps: executing the current cache policy and periodically acquiring its cache performance parameter; when the cache performance parameter of the current cache policy falls below the corresponding preset evaluation value, or the execution time of the current cache policy reaches the preset cache policy optimization time, acquiring the data index values of all data stored in the current cache; while continuing to execute the current cache policy, simulating the running of the candidate cache policies in the cache policy set according to the data index values, and acquiring the cache performance parameters of the candidate cache policies while periodically acquiring that of the current cache policy; and if the cache performance parameter of at least one candidate cache policy exceeds the corresponding preset evaluation value, determining a replacement cache policy among the at least one candidate cache policy and updating the currently executed cache policy to the replacement cache policy.

Description

Intelligent cache strategy storage method, device and equipment for real-time data application
Technical Field
The invention relates to the field of storage, in particular to an intelligent cache strategy storage method, device and equipment for real-time data application.
Background
Computer systems are composed of multiple components whose hardware capabilities and I/O requirements vary widely. To bridge the data-transfer gap between hardware devices with different I/O performance, caches were introduced: data is kept in a faster storage medium, for example through pre-reading, to reduce the impact of I/O bottlenecks on the system and improve overall performance. The core problems are which data to store in the cache and, when the cache is full or nearly full, which data to delete from it, so as to improve the cache hit rate. Because the hit rate directly affects system performance, a series of cache policies have been devised to improve it, mainly including: policies based on when data is accessed, policies based on how often data is accessed, and policies based on both access time and frequency.
Policies based on data access time, such as LRU (Least Recently Used), maintain a linked list of cache blocks ordered by last use and replace the least recently used data first. Policies based on data access frequency, such as LFU (Least Frequently Used), are similar to LRU except that entries are ordered by frequency of use and the least frequently used data is replaced first. Policies based on both access time and frequency combine the two signals; LRU-K, for example, maintains a queue recording the access history of all cached data and only places data into the cache once it has been accessed K times. When data needs to be evicted, LRU-K evicts the entry whose K-th most recent access lies furthest in the past; when K = 1, it degenerates to LRU. These policies are used in a variety of settings; each has its own strengths, weaknesses, and suitable application scenarios, and no single policy outperforms all others in every scenario.
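As an illustrative sketch (not part of the patent text), the LRU mechanism described above can be implemented with an ordered map standing in for the linked list; the class and method names here are hypothetical:

```python
from collections import OrderedDict

class LRUCache:
    """Least Recently Used: evict the entry whose last access is oldest."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()  # iteration order doubles as the recency list

    def get(self, key):
        if key not in self.data:
            return None  # cache miss
        self.data.move_to_end(key)  # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        elif len(self.data) >= self.capacity:
            self.data.popitem(last=False)  # evict the least recently used entry
        self.data[key] = value
```

An LFU variant would differ only in ordering entries by access count instead of recency.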
However, because a storage system holds many data types and users and upper-layer applications access stored data in different patterns, it is difficult to fix a single caching policy for the storage system. Common practice is for a storage system administrator to choose among the caching policies the system supports based on an understanding of the application scenario, and to switch policies as the system operates. This approach, however, demands considerable expertise from the administrator, is error-prone, and cannot switch cache policies promptly when the storage system's access pattern changes.
Disclosure of Invention
The invention provides an intelligent cache policy storage method, device and equipment for real-time data applications, which automatically select the optimal policy according to changes in the user's business process, avoiding manual selection of a cache policy and improving the cache hit rate.
In order to achieve the purpose, the technical scheme of the invention is as follows:
the application discloses an intelligent cache strategy storage method for real-time data application, which comprises the following steps:
executing the current cache strategy, and periodically acquiring the cache performance parameters of the current cache strategy by taking preset time as a period; wherein the current cache policy refers to a cache policy currently executed by the storage system;
when the cache performance parameter of the current cache strategy is smaller than the corresponding preset evaluation value or the execution time of the current cache strategy reaches the preset cache strategy optimization time, acquiring the data index values of all data stored in the current cache;
while executing the current cache policy, simulating the running of the candidate cache policies in the cache policy set according to the data index values, and periodically acquiring the cache performance parameters of the current cache policy and of the candidate cache policies with the preset time as the period; wherein at least two cache policies are preset in the cache policy set, and a candidate cache policy is any cache policy in the set other than the one currently executed by the storage system;
if the cache performance parameter of at least one candidate cache strategy is larger than the corresponding preset evaluation value, determining a replacement cache strategy in the at least one candidate cache strategy, and updating the currently executed cache strategy into the replacement cache strategy.
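The selection rule in the steps above can be sketched as a small helper; the function name, the dict-of-hit-rates interface, and the use of hit rate as the cache performance parameter are illustrative assumptions:

```python
def select_replacement(current_hit_rate, candidate_hit_rates, preset_evaluation):
    """Pick a replacement cache policy: a candidate qualifies only if its
    cache performance parameter (here, hit rate) exceeds the preset
    evaluation value; among qualifiers, take the best, and switch only if
    it also beats the currently executed policy."""
    eligible = {name: rate for name, rate in candidate_hit_rates.items()
                if rate > preset_evaluation}
    if not eligible:
        return None  # keep executing the current cache policy
    best = max(eligible, key=eligible.get)
    return best if eligible[best] > current_hit_rate else None
```

If `None` is returned, the current policy keeps running, matching the fallback branch described below.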
Optionally, the method further comprises:
if no candidate cache policy's cache performance parameter is greater than its corresponding preset evaluation value, continuing to execute the current cache policy, simulating the running of the candidate cache policies in the cache policy set according to the data index values, and periodically acquiring the cache performance parameters of the current cache policy and of the candidate cache policies with the preset time as the period, until a replacement cache policy is determined or the running time of the candidate cache policies reaches a preset time threshold.
Optionally, the method further comprises:
and if no replacement cache policy is determined and the running time of the candidate cache policies reaches the preset time threshold, continuing to execute the current cache policy and adjusting the corresponding preset evaluation value.
Optionally, the determining a replacement caching policy in the at least one candidate caching policy includes:
and determining, as the replacement cache policy, the candidate cache policy among the at least one candidate cache policy whose cache performance parameter shows a significant advantage over that of the current cache policy.
Optionally, the simulating the running of the candidate cache policies in the cache policy set according to the data index values includes:
simulating the running of the candidate cache policies according to the data index values, storing only the data index value when a candidate cache policy executes a storing step;
and deleting the stored data index value when the candidate cache policy executes a deleting step.
Optionally, before the executing the current caching policy and periodically obtaining the caching performance parameter of the current caching policy with a preset time as a period, the method further includes:
determining a current cache strategy in the cache strategy set, and counting information required by each cache strategy recorded in the cache strategy set;
the executing the current caching policy includes:
and executing the current cache strategy and collecting information required by each cache strategy.
Optionally, the information required by the caching policy includes: data access time, data access frequency.
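The information listed above (data access time and data access frequency) could be gathered by a small per-key recorder like the following; the class name and fields are illustrative:

```python
import time
from collections import defaultdict

class AccessStats:
    """Per-key statistics that the cache policies in the set may need:
    last access time (for LRU-style policies) and access count
    (for LFU-style policies)."""
    def __init__(self):
        self.last_access = {}              # key -> timestamp of most recent access
        self.frequency = defaultdict(int)  # key -> number of accesses so far

    def record(self, key, now=None):
        self.last_access[key] = time.time() if now is None else now
        self.frequency[key] += 1
```

Collecting both signals once, during execution of the current policy, lets every candidate policy be simulated from the same record.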
Optionally, the method further comprises:
and when the cache performance parameter of the current cache strategy is larger than the corresponding preset evaluation value and the execution time of the current cache strategy does not reach the preset cache strategy optimization time, continuing to execute the current cache strategy.
Further, an embodiment of the present invention further provides a device for storing an intelligent cache policy for real-time data application, where the device includes:
the execution unit is used for executing the current cache strategy and periodically acquiring the cache performance parameters of the current cache strategy by taking preset time as a period; wherein the current cache policy refers to a cache policy currently executed by the storage system;
the obtaining unit is used for obtaining data index values of all data stored in the current cache when the cache performance parameter of the current cache strategy is smaller than the corresponding preset evaluation value or the execution time of the current cache strategy reaches the preset cache strategy optimization time;
the execution unit is further configured to, while executing the current cache policy, simulate the running of the candidate cache policies in the cache policy set according to the data index values, and acquire the cache performance parameters of the candidate cache policies while periodically acquiring those of the current cache policy with the preset time as the period; wherein at least two cache policies are preset in the cache policy set, and a candidate cache policy is any cache policy in the set other than the one currently executed by the storage system;
and the processing unit is used for determining a replacement cache policy in the at least one candidate cache policy and updating the currently executed cache policy into the replacement cache policy if the cache performance parameter of the at least one candidate cache policy is greater than the corresponding preset evaluation value.
Optionally, the execution unit is further configured to, if no candidate cache policy's cache performance parameter is greater than its corresponding preset evaluation value, continue to execute the current cache policy, simulate the running of the candidate cache policies in the cache policy set according to the data index values, and acquire the cache performance parameters of the candidate cache policies while periodically acquiring those of the current cache policy with the preset time as the period, until a replacement cache policy is determined or the running time of the candidate cache policies reaches a preset time threshold.
Optionally, the executing unit is further configured to, if the replacement cache policy is not determined and the running time of the candidate cache policy reaches the preset time threshold, continue to execute the current cache policy and adjust the preset evaluation value corresponding thereto.
Optionally, the processing unit is specifically configured to determine, as the replacement cache policy, the candidate cache policy among the at least one candidate cache policy whose cache performance parameter shows a significant advantage over that of the current cache policy.
Optionally, the execution unit is specifically configured to simulate the running of the candidate cache policies in the cache policy set according to the data index values, storing only the data index value when a candidate cache policy executes a storing step, and deleting the stored data index value when the candidate cache policy executes a deleting step.
Optionally, the processing unit is further configured to determine a current cache policy in the cache policy set, and count information required by each cache policy recorded in the cache policy set.
The execution unit is specifically configured to execute the current cache policy and acquire information required by each cache policy.
Optionally, the information required by the caching policy includes: data access time, data access frequency.
Optionally, the executing unit is further configured to continue executing the current cache policy when the cache performance parameter of the current cache policy is greater than the corresponding preset evaluation value and the execution time of the current cache policy does not reach the preset cache policy optimization time.
Further, the present application also provides an electronic device, including:
a memory having a computer program stored thereon;
and the processor is used for executing the computer program in the memory to realize the intelligent cache policy storage method for the real-time data application in the embodiment.
Through the above technical scheme, the application discloses an intelligent cache policy storage method, device and equipment for real-time data applications, comprising the steps already set out: executing the current cache policy and periodically acquiring its cache performance parameter; acquiring the data index values of all cached data when that parameter falls below the corresponding preset evaluation value or the policy's execution time reaches the preset cache policy optimization time; simulating the candidate cache policies in the cache policy set according to the data index values while continuing to execute and monitor the current policy; and, if the cache performance parameter of at least one candidate cache policy exceeds its corresponding preset evaluation value, determining a replacement cache policy among those candidates and updating the currently executed cache policy to it.
Therefore, in the present application, the current cache policy is executed; when its cache performance parameter falls below the preset evaluation value, or its execution time reaches the preset cache policy optimization time, the data index values of all data stored in the current cache are obtained; the candidate cache policies are simulated according to those index values and their cache performance parameters are obtained; and when the cache performance parameter of a candidate cache policy exceeds its corresponding preset evaluation value, a replacement cache policy can be determined among such candidates and the currently executed cache policy updated to it. In other words, the optimal policy can be selected automatically as the user's business process changes, avoiding manual selection of a cache policy and improving the cache hit rate.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in their description are briefly introduced below. It is obvious that the following drawings show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from the structures shown without creative effort.
Fig. 1 is a schematic flowchart of an intelligent cache policy storage method for real-time data application according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of another intelligent cache policy storage method for real-time data application according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of an apparatus for storing an intelligent cache policy for real-time data application according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of another electronic device according to an embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that all the directional indications (such as up, down, left, right, front, and back) in the embodiments of the present invention are only used to explain the relative position relationship between the components, the motion situation, and the like in a specific posture (as shown in the drawings), and if the specific posture is changed, the directional indication is changed accordingly.
In addition, the descriptions related to "first", "second", etc. in the present invention are only for descriptive purposes and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "a plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
In the present invention, unless otherwise expressly stated or limited, the terms "connected," "secured," and the like are to be construed broadly, and for example, "secured" may be a fixed connection, a removable connection, or an integral part; can be mechanically or electrically connected; they may be directly connected or indirectly connected through intervening media, or they may be connected internally or in any other suitable relationship, unless expressly stated otherwise. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to specific situations.
In addition, the technical solutions in the embodiments of the present invention may be combined with each other, but it must be based on the realization of those skilled in the art, and when the technical solutions are contradictory or cannot be realized, such a combination of technical solutions should not be considered to exist, and is not within the protection scope of the present invention.
As shown in fig. 1, the present invention provides an intelligent cache policy storage method for real-time data application, including:
step S101, executing the current cache strategy, and periodically acquiring the cache performance parameters of the current cache strategy by taking preset time as a period.
The current cache policy refers to a cache policy currently executed by the storage system.
Specifically, the user may preset the cache policies that the storage system can support according to actual needs, forming a cache policy set. From this set, one cache policy may be selected as the current cache policy according to actual requirements, or simply chosen arbitrarily. Once determined, the current cache policy is executed. Because it is necessary to monitor whether the current cache policy remains optimal, its cache performance parameter can be monitored in real time, and a cache policy optimization time can be set so that the currently executed policy is re-evaluated once per interval. While the execution time of the current cache policy has not yet reached the optimization time, whether it is the optimal policy can be judged periodically by sampling its cache performance parameter with the preset time as the period: the parameter is monitored in real time during execution and read off directly whenever a sampling cycle is reached.
It should be noted that the cache performance parameter refers to any parameter that reflects how well a cache policy performs; it may be, for example, the cache hit rate.
For example, assume that three cache policies (cache policy 1, cache policy 2, and cache policy 3) are preset in the cache policy set, that the current cache policy is cache policy 1, that the user sets the preset time to t1, and that the cache performance parameter is the cache hit rate. Then cache policy 1 is executed, and its cache hit rate is acquired periodically with t1 as the period.
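Periodic sampling of the hit rate, as in the example above, can be sketched as a counter that is read and reset once per period t1; the class and method names are illustrative:

```python
class HitRateMonitor:
    """Tracks cache hits and misses in real time; sample() is called once
    per preset period to read the hit rate for that period."""
    def __init__(self):
        self.hits = 0
        self.total = 0

    def record(self, hit):
        self.total += 1
        if hit:
            self.hits += 1

    def sample(self):
        rate = self.hits / self.total if self.total else 0.0
        self.hits = self.total = 0  # start a fresh measurement period
        return rate
```

One monitor per policy (current and candidates) allows their hit rates to be compared at each sampling cycle.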
It should be noted that, the cache policy optimization time is preset by the user according to actual needs.
Step S102, when the cache performance parameter of the current cache strategy is smaller than the corresponding preset evaluation value or the execution time of the current cache strategy reaches the preset cache strategy optimization time, acquiring the data index values of all data stored in the current cache.
Specifically, after the cache performance parameter of the current cache policy is obtained, it can be used to detect whether the current cache policy is optimal: the acquired parameter is compared with the corresponding preset evaluation value, and if it is smaller, the hit rate of the current cache policy is low and the cache policy set can be checked for a policy that outperforms the one currently executed. At that point, the data index values of all stored data may be read from the current cache.
It should be noted that the preset evaluation value corresponding to the cache performance parameter of the current cache policy is preset according to an actual requirement.
Alternatively, when the execution time of the current cache policy reaches the preset cache policy optimization time, the cache policy set likewise needs to be checked for a policy better than the one currently executed, and the data index values of all stored data may be obtained from the current cache. Optionally, one copy of the data index values may be obtained for each candidate cache policy in the set.
It should be noted that data is generally stored in a cache as Key-Value pairs, where the Key is a unique identifier supplied by the application layer to distinguish the Value, i.e., the data index value. Further, to speed up lookup and reduce the memory footprint of the data index values, the cache internally computes the MD5 digest of the Key and stores the digest in the data index structure. For each candidate cache policy, only the data index values are stored, not the actual data, so the memory occupied is small and memory space is saved.
It should be noted that the MD5 calculation is a common calculation method in the prior art, and is not described in detail in the present invention.
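A minimal sketch of the digest-based index value described above, using Python's standard hashlib (the function name is illustrative):

```python
import hashlib

def data_index_value(key):
    """Reduce an application-layer Key to a fixed-size MD5 digest so the
    candidate caches store a small index value instead of the real data."""
    return hashlib.md5(key.encode("utf-8")).digest()  # always 16 bytes
```

Because the digest has fixed size, the per-entry memory cost of each simulated cache is constant regardless of Key length.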
Continuing the example above, after the cache hit rate of cache policy 1 is obtained, it is compared with its corresponding preset evaluation value 1 (i.e., the preset cache performance index 1). If the hit rate is smaller than preset evaluation value 1, the hit rate of cache policy 1 is low, and the data index values of all data stored in the current cache can be obtained. Alternatively, if the execution time of cache policy 1 reaches the preset cache policy optimization time, the periodic policy re-evaluation is due, and the data index values of all data stored in the current cache likewise need to be acquired.
Step S103, while executing the current cache policy, simulating the running of the candidate cache policies in the cache policy set according to the data index values, and periodically acquiring the cache performance parameters of the current cache policy and of the candidate cache policies with the preset time as the period.
At least two caching strategies are preset in the caching strategy set. The candidate caching strategy refers to a caching strategy in the caching strategy set except the caching strategy currently executed by the storage system.
Specifically, when the user presets the cache policy set, at least two cache policies are configured, such as LRU and LFU. When determining whether the set contains a policy better than the one currently executed, the current cache policy continues to run while the candidate cache policies are simulated alongside it. Because a copy of the data index values is obtained for each candidate cache policy, each simulated policy can realize a virtual cache by applying its own caching mechanism starting from its copy of the index values. Moreover, the simulated policies only model the storage system's input/output scenario and do not actually take part in the system's I/O: if the data index value of the requested data is found in the virtual cache, it counts as a cache hit (the virtual cache holds only index values, no actual data); if it is not found, it counts as a miss.
Further, simulating the running of the candidate cache policies in the cache policy set according to the data index values includes:
simulating the running of the candidate cache policies according to the data index values, storing only the data index value when a candidate cache policy executes a storing step, and deleting the stored data index value when it executes a deleting step.
In this embodiment, because each simulated candidate cache policy only models the storage system's I/O and caches only data index values rather than actual data, a storing step adds to the candidate policy's cache nothing but the index value of the data to be stored. When that cache is full and stored data must be deleted, the candidate policy's caching mechanism determines which data to evict, and the deletion is completed simply by removing the corresponding index value. In other words, the virtual cache is updated exactly as it would be if the candidate policy were actually executed; the only difference is that simulation stores and deletes index values alone, with no real data involved.
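Putting the pieces together, a shadow (virtual) cache for one candidate policy, here an LRU candidate as an illustrative assumption, stores only index values and counts hits and misses; all names are hypothetical:

```python
from collections import OrderedDict

class ShadowLRU:
    """Virtual cache for a simulated LRU candidate policy: it mirrors the
    policy's store/delete decisions but keeps only data index values
    (e.g. key digests), never the actual data."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.index = OrderedDict()  # digest -> None (no payloads)
        self.hits = 0
        self.accesses = 0

    def access(self, digest):
        self.accesses += 1
        if digest in self.index:
            self.hits += 1                      # virtual cache hit
            self.index.move_to_end(digest)
        else:                                   # virtual cache miss
            if len(self.index) >= self.capacity:
                self.index.popitem(last=False)  # simulated delete: drop digest only
            self.index[digest] = None           # simulated store: digest only

    def hit_rate(self):
        return self.hits / self.accesses if self.accesses else 0.0
```

Feeding every real access (already generated by the user application) through one shadow cache per candidate policy yields the per-candidate hit rates compared at each sampling period.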
It should be noted that while the candidate caching policies run in simulation, the current caching policy continues to execute at the same time, so all data input and output is genuinely generated by the user application. The data index values are already produced during execution of the current caching policy and do not need to be generated separately for the simulation.
When the cache performance parameters of the current caching policy are acquired periodically, with the preset time as the period, the cache performance parameters of the simulated candidate caching policies are acquired at the same time. That is, the performance of the currently executed policy and of each simulated candidate policy is evaluated at the same intervals, so that their performance can be compared according to the cache performance parameters of each policy.
For example, when the cache performance parameter is a cache hit rate, the cache hit rates of the current cache policy and each candidate cache policy may be periodically calculated.
As described above, while caching policy 1 continues to execute, caching policies 2 and 3 are run in simulation according to the data index values. The caches corresponding to policies 2 and 3 initially hold only the data index values obtained in the preceding steps, and each cache is updated starting from those values. Whenever a simulated policy updates its cache, only the stored data index values change. For example, if caching policy 2 needs to store data into its cache according to its own caching mechanism, only the data index value of that data is stored, not the data itself; if caching policy 3 needs to delete data from its cache according to its own caching mechanism, the corresponding data index value is removed.
In addition, the cache performance of policies 2 and 3 must be monitored while they run in simulation. Therefore, whenever the cache hit rate of caching policy 1 is acquired, the hit rates of policies 2 and 3 are acquired as well; that is, the hit rates of caching policies 1, 2, and 3 are all obtained periodically with t1 as the period.
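The periodic evaluation described above can be sketched as follows. This is a hypothetical Python sketch; it assumes that each policy, the current one and each simulated candidate, keeps a pair of hit/access counters that are read out and reset once per period t1.

```python
def periodic_hit_rates(policy_counters):
    """Compute each policy's cache hit rate for the period just ended,
    then reset the counters so the next period t1 starts fresh.

    policy_counters: dict mapping policy name -> [hits, accesses]
    """
    rates = {}
    for name, counter in policy_counters.items():
        hits, accesses = counter
        rates[name] = hits / accesses if accesses else 0.0
        counter[0] = counter[1] = 0  # reset for the next period
    return rates
```

In the embodiment the period t1 is the same for the current policy and all candidates, so a single call evaluates all of them at once.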
Step S104, if the cache performance parameter of at least one candidate cache strategy is larger than the corresponding preset evaluation value, determining a replacement cache strategy in the at least one candidate cache strategy, and updating the currently executed cache strategy into the replacement cache strategy.
Specifically, after the cache performance parameters of the candidate caching policies are obtained, each is compared with its corresponding preset evaluation value. If the cache performance parameter of at least one candidate policy is larger than its preset evaluation value, the optimal candidate among those exceeding their evaluation values is selected as the replacement caching policy, and the currently executed caching policy is updated to the replacement policy.
It should be noted that each cache policy is preset with a corresponding evaluation value, and the preset evaluation values corresponding to different cache policies may be different or the same.
Further, to avoid frequent replacement of the caching policy, the cache performance parameters of the candidate policies may also be compared with that of the current policy. If no candidate's cache performance parameter is significantly better than the current policy's, the caching policy is not replaced. If a candidate's cache performance parameter is significantly better than the current policy's, that candidate may be determined as the replacement caching policy, and the currently executed policy updated accordingly.
At this time, determining a replacement caching policy from the at least one candidate caching policy includes:
determining, as the replacement caching policy, the candidate caching policy whose cache performance parameter has a significant advantage over the cache performance parameter of the current caching policy.
That is to say, after determining the candidate caching policies whose cache performance parameters exceed their corresponding preset evaluation values, those parameters are compared with the cache performance parameter of the current policy; if one of them has a significant advantage over the current policy's parameter, the corresponding candidate is determined as the replacement caching policy.
Parameter A having a significant advantage over parameter B means that A's advantage over B exceeds the significance threshold. The significance threshold is preset according to actual requirements, for example a confidence level of 95% or 99%.
It should be noted that whether parameter A has a significant advantage over parameter B can be determined by performing a significance test, similar to an A/B test. An A/B test typically uses two (or more) groups: group A and group B. The first group is the control group, and the second group changes some factor under study. The purpose of the A/B test is to determine whether the new design or process changes the target metric in a statistically significant way.
When the sample size is large, a two-sample, one-tailed z-test may be used; for smaller samples, a t-test may be used instead. The test checks whether the cache performance parameters of the two caching policies, such as their cache hit rates, differ significantly in the positive direction.
Both the two-sample one-tailed z-test and the t-test are well-established test methods in the prior art and are not described further here.
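For reference, a two-sample one-tailed z-test on two cache hit rates can be sketched with the standard pooled-proportion formula, using only the Python standard library. The function name and the default significance level of 0.05 are assumptions; the patent leaves the exact threshold to actual requirements.

```python
import math

def one_tailed_z_test(hits_a, n_a, hits_b, n_b, alpha=0.05):
    """Two-sample, one-tailed z-test for proportions (e.g. cache hit rates).

    Tests H1: hit rate A > hit rate B. Returns (z, p_value, significant).
    """
    p_a, p_b = hits_a / n_a, hits_b / n_b
    pooled = (hits_a + hits_b) / (n_a + n_b)              # pooled proportion
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Upper-tail p-value from the standard normal CDF, expressed via erf.
    p_value = 1 - 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return z, p_value, p_value < alpha
```

For example, a candidate policy with 900 hits in 1000 accesses against a current policy with 850 hits in 1000 accesses yields z above 3, so the candidate's advantage would count as significant at the 95% confidence level.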
Further, if no candidate caching policy's cache performance parameter has a significant advantage over the cache performance parameter of the current caching policy, it is determined that the caching policy is not replaced and the current policy continues to execute; if the simulation running time of the candidate policies has not yet reached the preset time threshold, the candidate policies also continue to run in simulation alongside the current policy.
It should be noted that the preset time threshold is a time threshold preset according to actual requirements.
In this process, the actual performance of each candidate caching policy can be fully assessed by simulating the real-time client business scenario before a new caching policy is put into use, which avoids the problem of selecting the wrong policy through prediction errors, as can happen when caching policies are switched automatically based on a prediction model. The working mechanism of a simulated candidate policy is identical to that of the actual usage scenario, which makes the approach convenient to implement, and user-defined storage caching policies are also supported. Because the simulation virtualizes the cache using only data index values, its additional overhead is small. Moreover, during the simulation, as soon as a candidate policy is found to be significantly better than the current policy, or the simulation running time exceeds the preset time threshold, the simulation terminates automatically, saving resources.
As described in the above example, after the cache hit rates of the cache policies 2 and 3 are obtained, the cache hit rates of the cache policies 2 and 3 are respectively compared with their respective corresponding preset evaluation values, that is, the cache hit rate of the cache policy 2 is compared with its corresponding preset evaluation value 2, and the cache hit rate of the cache policy 3 is compared with its corresponding preset evaluation value 3. If the cache hit rate of the cache policy 2 is not greater than the corresponding preset evaluation value 2, it indicates that the cache policy 2 has a low hit rate and cannot be used as a replacement cache policy. If the cache hit rate of the cache policy 3 is greater than the corresponding preset evaluation value 3, and the cache hit rate of the cache policy 3 has a significant advantage compared with the cache hit rate of the cache policy 1, the cache policy 3 may be determined as a replacement cache policy, and the currently executed cache policy may be updated to the cache policy 3.
Further, as shown in fig. 2, the method further includes:
step S105, if the cache performance parameters of the candidate cache policies are not larger than the corresponding preset evaluation values, continuing to execute the current cache policy, simulating and operating the candidate cache policies in the cache policy set according to the data index values, and acquiring the cache performance parameters of the candidate cache policies while periodically acquiring the cache performance parameters of the current cache policy by taking preset time as a period until it is determined that the replacement cache policy or the operating time of the candidate cache policy reaches a preset time threshold.
Specifically, when none of the candidates' cache performance parameters exceeds its preset evaluation value, none of the candidate policies performs particularly well. The current caching policy must then continue to execute, the candidates continue to run in simulation, and both continue to be evaluated periodically. In other words, the process jumps back to step S103 and repeats it until a replacement caching policy is determined or the candidates' running time reaches the preset time threshold. Once a replacement policy is determined, the currently executed policy may be updated to it. If no replacement policy is determined but the candidates' running time reaches the preset time threshold, the process no longer jumps to step S103 and instead proceeds to step S106 below.
In order to prevent the above steps from looping indefinitely, the simulation running time of the candidate caching policies is limited by a preset time threshold: even if no replacement caching policy has been determined, once the candidates' running time reaches the threshold, execution proceeds to step S106 described below instead of jumping back to step S103.
As described in the above example, if the cache hit rate of caching policy 2 is smaller than its preset evaluation value 2, and the cache hit rate of caching policy 3 is smaller than its preset evaluation value 3, step S103 continues to be performed: caching policy 1 continues to execute, caching policies 2 and 3 run in simulation, and the hit rates of policies 1, 2, and 3 are obtained periodically with t1 as the period, until a replacement caching policy is determined or the running time of the candidate policies reaches the preset time threshold, at which point the simulation terminates automatically. Once a replacement caching policy is determined, the currently executed policy may be replaced with it directly and step S101 executed again. If instead the simulation running time of caching policies 2 and 3 reaches the preset time threshold, the following steps continue to be executed.
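The control flow of steps S103 to S106 can be condensed into a sketch like the following. All callables here (`evaluate`, `is_significantly_better`) are assumptions standing in for the storage system's real interfaces; the patent does not prescribe this API.

```python
import time

def optimize_caching_policy(current, candidates, evaluate, thresholds,
                            period_t1, time_limit, is_significantly_better):
    """Sketch of the optimization loop (steps S103-S106).

    evaluate(policy)   -> cache performance parameter for the last period.
    thresholds[policy] -> that policy's preset evaluation value.
    Returns the replacement policy, or None if the time threshold expires.
    """
    start = time.monotonic()
    while time.monotonic() - start < time_limit:    # preset time threshold
        time.sleep(period_t1)                       # one evaluation period t1
        current_perf = evaluate(current)
        # Candidates whose parameter exceeds their preset evaluation value
        passing = [c for c in candidates if evaluate(c) > thresholds[c]]
        # Among those, require a significant advantage over the current policy
        better = [c for c in passing
                  if is_significantly_better(evaluate(c), current_perf)]
        if better:
            return max(better, key=evaluate)        # replacement caching policy
    return None  # step S106: keep current policy, lower its evaluation value
```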
Further, referring to fig. 2, the method further includes:
and S106, if the replacement cache strategy is not determined and the running time of the candidate cache strategy reaches a preset time threshold, continuing to execute the current cache strategy and adjusting a corresponding preset evaluation value.
Specifically, in the process of executing the above steps, if the replacement cache policy is not determined and the running time of the candidate cache policy reaches the preset time threshold, it indicates that the cache performance in the candidate cache policy is not better than that of the current cache policy, at this time, the current cache policy is continuously executed, and the preset evaluation value corresponding to the current cache policy is reduced, that is, the cache performance index is reduced.
It should be noted that the evaluation value corresponding to the cache performance parameter of the cache policy is a cache performance index, and the value may be dynamically adjusted.
It should be noted that the cache performance index tracks the current best performance, and the periodic caching policy optimization keeps that best value up to date. When the cache performance parameter of the current caching policy is high, the cache performance index is large. When the storage system's access pattern changes and the current policy's cache performance parameter drops, the storage system automatically triggers the caching policy optimization mechanism, finds the optimal policy in the caching policy set, and updates the corresponding preset evaluation value, i.e. the cache performance index, which may increase or decrease.
As described in the above example, if it is not determined to replace the cache policy and the running time of the candidate cache policy reaches the preset time threshold, it indicates that the cache hit rate of each cache policy in the cache policy set is low, at this time, the cache policy 1 is continuously executed, the preset evaluation value 1 corresponding to the cache policy 1 is adjusted, and the preset evaluation value 1 is reduced.
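A minimal sketch of lowering the preset evaluation value, as in step S106: the patent only states that the value is reduced, so the multiplicative decay factor and the floor used here are assumptions.

```python
def adjust_evaluation_value(current_value, decay=0.9, floor=0.0):
    """Reduce a policy's preset evaluation value (cache performance index)
    when no candidate beats the current policy within the time threshold.
    decay and floor are illustrative assumptions, not values from the patent."""
    return max(current_value * decay, floor)
```

Because the index is adjusted dynamically, a later optimization round can raise it again once a better-performing policy is found.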
Further, referring to fig. 2, before step S101, the method further includes:
and S107, determining the current cache strategy in the cache strategy set, and counting the information required by each cache strategy recorded in the cache strategy set.
It should be noted that the information required by a caching policy refers to the information the policy must use in order to add data to the cache or delete data from it, that is, to update the data in the cache.
Optionally, the information required by the caching policy includes: data access time, data access frequency.
Specifically, after the user determines the cache policy set of the storage system, the cache policy currently required to be executed may be selected from the cache policy set, that is, the current cache policy is selected. At this time, the user may specify a cache policy as the current cache policy according to actual requirements, and of course, the storage system may also select the current cache policy from the cache policy set by itself, which is not limited in the present application. After the current cache strategy is determined, information required when each cache strategy in the cache strategy set is updated in the cache can be counted, so that each cache strategy can be updated conveniently in the following process.
At this time, the executing the current caching policy in step S101 includes: and executing the current cache strategy and collecting information required by each cache strategy.
That is, while the storage system executes the current caching policy, the information required by each caching policy, such as data access time and data access frequency, may be collected in real time. Only the information that the candidate storage caching policies must use needs to be collected; no extra information is gathered, which saves storage system resources.
As described in the above example, before the storage system operates, a user may set a caching policy set, and the storage system determines the caching policy 1 as the current caching policy in the caching policy set. Because the information required by updating the cache of each cache strategy is different, the information required by each cache strategy needs to be collected in real time so as to update the cache. At this time, when the cache policy 1 is executed, information required by each cache policy needs to be collected.
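The real-time collection of the information each policy needs can be sketched as follows; the class name is an assumption. Only last access time (useful to LRU-style policies) and access frequency (useful to LFU-style policies) are recorded, matching the "no extra information" principle above.

```python
import time
from collections import defaultdict

class AccessInfoCollector:
    """Collects, in real time, the information the caching policies require:
    last access time and access frequency per data index value."""

    def __init__(self):
        self.last_access_time = {}                 # index value -> timestamp
        self.access_frequency = defaultdict(int)   # index value -> access count

    def record(self, index_value):
        """Called on every data access while the current policy executes."""
        self.last_access_time[index_value] = time.monotonic()
        self.access_frequency[index_value] += 1
```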
Further, referring to fig. 2, the method further includes:
and step S108, when the cache performance parameter of the current cache strategy is larger than the corresponding preset evaluation value and the execution time of the current cache strategy does not reach the preset cache strategy optimization time, continuing to execute the current cache strategy.
Specifically, when the cache performance parameter of the current cache policy is greater than the corresponding preset evaluation value, it indicates that the hit rate of the current cache policy is relatively high, and the cache policy does not need to be replaced.
As described in the above example, when the cache hit rate of the cache policy 1 is greater than the preset evaluation value 1, it is indicated that the hit rate of the cache policy 1 is higher, and the cache policy 1 is continuously executed without updating the cache policy.
In the present application, the current caching policy is executed; when its cache performance parameter falls below the preset evaluation value, or its execution time reaches the preset caching policy optimization time, the data index values of all data stored in the current cache are acquired, the candidate caching policies are run in simulation according to those index values, and their cache performance parameters are acquired. When the cache performance parameter of a candidate caching policy exceeds its corresponding preset evaluation value, a replacement caching policy can be determined among such candidates, and the currently executed caching policy is updated to the replacement policy. That is to say, the present application automatically selects the optimal policy as the user's business processes change, which avoids manual selection of the caching policy and improves the cache hit rate.
As shown in fig. 3, the present application discloses an apparatus 300 for storing an intelligent cache policy for real-time data application, including:
the execution unit 301 is configured to execute the current cache policy, and periodically obtain the cache performance parameter of the current cache policy by using a preset time as a period.
Wherein the current caching policy refers to a caching policy currently executed by the storage system.
An obtaining unit 302, configured to obtain data index values of all data stored in the current cache if the cache performance parameter of the current cache policy is smaller than the corresponding preset evaluation value or the execution time of the current cache policy reaches a preset cache policy optimization time.
The execution unit 301 is further configured to, while executing the current cache policy, simulate to run a candidate cache policy in the cache policy set according to the data index value, and obtain the cache performance parameter of the candidate cache policy while periodically obtaining the cache performance parameter of the current cache policy with a preset time as a period.
At least two caching strategies are preset in the caching strategy set. The candidate caching strategy refers to a caching strategy in the caching strategy set except the caching strategy currently executed by the storage system.
Specifically, the executing unit 301 is specifically configured to simulate a candidate cache policy in the running cache policy set according to the data index value, and only store the data index value when the candidate cache policy executes the storing step;
and when the candidate cache strategy executes the deleting step, deleting the stored data index value.
The processing unit 303 is configured to determine a replacement cache policy in the at least one candidate cache policy if the cache performance parameter of the at least one candidate cache policy is greater than the preset evaluation value corresponding to the candidate cache policy, and update a currently executed cache policy to the replacement cache policy.
Specifically, the processing unit 303 is specifically configured to determine, as a replacement cache policy, a candidate cache policy in which a significant advantage exists between a cache performance parameter and a cache performance parameter of the current cache policy in the at least one candidate cache policy.
Further, the executing unit 301 is further configured to, if none of the candidate caching policies' cache performance parameters exceeds its corresponding preset evaluation value, continue executing the current caching policy, run the candidate caching policies in simulation according to the data index values, and acquire the candidates' cache performance parameters while periodically acquiring those of the current policy with the preset time as the period, until a replacement caching policy is determined or the running time of the candidate caching policies reaches the preset time threshold.
Further, the executing unit 301 is further configured to, if the replacement cache policy is not determined and the running time of the candidate cache policy reaches the preset time threshold, continue to execute the current cache policy and adjust the preset evaluation value corresponding to the current cache policy.
Further, the processing unit 303 is further configured to determine a current caching policy in the caching policy set, and count information required by each caching policy recorded in the caching policy set.
At this time, the execution unit 301 is specifically configured to execute the current cache policy and collect information required by each cache policy.
Optionally, the information required by the caching policy includes: data access time, data access frequency.
Further, the executing unit 301 is further configured to continue to execute the current cache policy when the cache performance parameter of the current cache policy is greater than the corresponding preset evaluation value and the execution time of the current cache policy does not reach the preset cache policy optimization time.
In the present application, the current caching policy is executed; when its cache performance parameter falls below the preset evaluation value, or its execution time reaches the preset caching policy optimization time, the data index values of all data stored in the current cache are acquired, the candidate caching policies are run in simulation according to those index values, and their cache performance parameters are acquired. When the cache performance parameter of a candidate caching policy exceeds its corresponding preset evaluation value, a replacement caching policy can be determined among such candidates, and the currently executed caching policy is updated to the replacement policy. That is to say, the present application automatically selects the optimal policy as the user's business processes change; the caching policy is switched automatically, without the involvement of a storage system administrator, which avoids manual selection of the caching policy and improves the cache hit rate.
Fig. 4 is a block diagram illustrating an electronic device 400 according to an example embodiment. As shown in fig. 4, the electronic device 400 may include: a processor 401 and a memory 402. The electronic device 400 may also include one or more of a multimedia component 403, an input/output (I/O) interface 404, and a communications component 405.
The processor 401 is configured to control the overall operation of the electronic device 400, so as to complete all or part of the steps in the above-mentioned intelligent cache policy storage method for real-time data application. The memory 402 is used to store various types of data to support operation at the electronic device 400, such as instructions for any application or method operating on the electronic device 400 and application-related data, such as contact data, transmitted and received messages, pictures, audio, video, and so forth. The memory 402 may be implemented by any type of volatile or non-volatile memory device or combination thereof, such as Static Random Access Memory (SRAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Erasable Programmable Read-Only Memory (EPROM), Programmable Read-Only Memory (PROM), Read-Only Memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk. The multimedia components 403 may include a screen and an audio component, where the screen may be, for example, a touch screen, and the audio component is used for outputting and/or inputting audio signals. For example, the audio component may include a microphone for receiving external audio signals; the received audio signal may further be stored in the memory 402 or transmitted through the communication component 405. The audio component also includes at least one speaker for outputting audio signals. The I/O interface 404 provides an interface between the processor 401 and other interface modules, such as a keyboard, mouse, or buttons, which may be virtual or physical. The communication component 405 is used for wired or wireless communication between the electronic device 400 and other devices. The wireless communication may be, for example, Wi-Fi, Bluetooth, Near Field Communication (NFC), 2G, 3G, 4G, NB-IoT, eMTC, or 5G, or a combination of one or more of them, which is not limited herein.
The corresponding communication component 405 may therefore include: Wi-Fi module, Bluetooth module, NFC module, etc.
In an exemplary embodiment, the electronic Device 400 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the above-mentioned smart cache policy storage method for real-time data applications.
In another exemplary embodiment, a computer readable storage medium is also provided, which includes program instructions, which when executed by a processor, implement the steps of the above-mentioned real-time data application-oriented intelligent cache policy storage method. For example, the computer readable storage medium may be the memory 402 comprising program instructions executable by the processor 401 of the electronic device 400 to perform the intelligent cache policy storage method for real-time data applications described above.
Fig. 5 is a block diagram illustrating an electronic device 500 in accordance with an example embodiment. For example, the electronic device 500 may be provided as a server. Referring to fig. 5, the electronic device 500 includes a processor 510, which may be one or more in number, and a memory 520 for storing computer programs executable by the processor 510. The computer program stored in memory 520 may include one or more modules that each correspond to a set of instructions. Further, the processor 510 may be configured to execute the computer program to perform the above-described intelligent cache policy storage method for real-time data applications.
Additionally, the electronic device 500 may also include a power component 530 and a communication component 540, the power component 530 may be configured to perform power management of the electronic device 500, and the communication component 540 may be configured to enable communication of the electronic device 500, e.g., wired or wireless communication. In addition, the electronic device 500 may also include input/output (I/O) interfaces 550. The electronic device 500 may operate based on an operating system stored in the memory 520, such as Windows Server, Mac OS XTM, UnixTM, LinuxTM, and the like.
In another exemplary embodiment, a computer readable storage medium is also provided, which includes program instructions, which when executed by a processor, implement the steps of the above-mentioned real-time data application-oriented intelligent cache policy storage method. For example, the computer readable storage medium may be the memory 520 comprising program instructions executable by the processor 510 of the electronic device 500 to perform the intelligent cache policy storage method for real-time data applications described above.
In another exemplary embodiment, a computer program product is also provided, which comprises a computer program executable by a programmable apparatus, the computer program having code portions for performing the above-mentioned intelligent cache policy storage method for real-time data application when being executed by the programmable apparatus.
The above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention, and all modifications and equivalents of the present invention, which are made by the contents of the present specification and the accompanying drawings, or directly/indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. An intelligent cache policy storage method for real-time data applications, characterized by comprising:
executing a current cache policy, and periodically acquiring cache performance parameters of the current cache policy at a preset time interval; wherein the current cache policy refers to the cache policy currently executed by a storage system;
when a cache performance parameter of the current cache policy is smaller than a corresponding preset evaluation value, or when the execution time of the current cache policy reaches a preset cache-policy optimization time, acquiring the data index values of all data stored in the current cache;
while executing the current cache policy, simulating the running of candidate cache policies in a cache policy set according to the data index values, and acquiring cache performance parameters of the candidate cache policies each time the cache performance parameters of the current cache policy are periodically acquired at the preset time interval; wherein at least two cache policies are preset in the cache policy set, and a candidate cache policy refers to any cache policy in the set other than the one currently executed by the storage system;
if the cache performance parameter of at least one candidate cache policy is greater than the corresponding preset evaluation value, determining a replacement cache policy among the at least one candidate cache policy, and updating the currently executed cache policy to the replacement cache policy.
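The decision logic of claim 1 (together with the fall-through case of claim 8) can be read as a small pure function. The sketch below is illustrative, not code from the patent: it assumes the cache performance parameter is a hit rate in [0, 1], uses a single shared evaluation value for all policies (the patent allows a distinct value per policy), and invents the names `decide`, `eval_threshold`, and `optimize_after_s` for the example.

```python
def decide(current_hit_rate, elapsed_s, candidate_hit_rates,
           eval_threshold, optimize_after_s):
    """Return the name of a replacement cache policy, or None to keep
    the current one.

    current_hit_rate    -- measured hit rate of the current policy
    elapsed_s           -- seconds the current policy has been running
    candidate_hit_rates -- {policy_name: simulated hit rate}
    """
    # Claim 8: while the current policy still meets its evaluation value
    # and the optimization time has not been reached, keep executing it.
    if current_hit_rate >= eval_threshold and elapsed_s < optimize_after_s:
        return None
    # Claim 1: among candidates beating the evaluation value, pick the best.
    winners = {name: rate for name, rate in candidate_hit_rates.items()
               if rate > eval_threshold}
    if not winners:
        return None  # claims 2-3: keep the current policy for now
    return max(winners, key=winners.get)
```

A monitoring loop would call this once per preset period, feeding it the latest measured and simulated hit rates.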
2. The method of claim 1, further comprising:
if the cache performance parameters of the candidate cache policies are not greater than their corresponding preset evaluation values, continuing to execute the current cache policy, continuing to simulate the running of the candidate cache policies in the cache policy set according to the data index values, and continuing to periodically acquire the cache performance parameters of the current cache policy and of the candidate cache policies at the preset time interval, until either a replacement cache policy is determined or the running time of the candidate cache policies reaches a preset time threshold.
3. The method of claim 2, further comprising:
if no replacement cache policy has been determined when the running time of the candidate cache policies reaches the preset time threshold, continuing to execute the current cache policy and adjusting the corresponding preset evaluation value.
4. The method of claim 1, wherein the determining a replacement cache policy among the at least one candidate cache policy comprises:
determining, as the replacement cache policy, a candidate cache policy among the at least one candidate cache policy whose cache performance parameter holds a significant advantage over the cache performance parameter of the current cache policy.
5. The method of claim 1, wherein simulating the running of a candidate cache policy in the cache policy set according to the data index values comprises:
simulating the running of the candidate cache policy according to the data index values, wherein only the data index value is stored when the candidate cache policy executes a storing step;
and the stored data index value is deleted when the candidate cache policy executes a deleting step.
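Claim 5's key-only simulation can be sketched as a shadow cache that mirrors an eviction policy while storing nothing but data index values, so candidates can be evaluated without duplicating cached payloads. The `ShadowLRU` class and the choice of LRU as the simulated policy are assumptions made for this illustration; the patent does not name a specific eviction policy.

```python
from collections import OrderedDict

class ShadowLRU:
    """Key-only shadow cache: the storing step keeps only the data index
    value (the key), and the deleting step drops only that key. No payload
    data is ever copied into the simulation."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.keys = OrderedDict()  # index value -> None (no data stored)

    def access(self, key):
        """Simulate one access; return True on a simulated cache hit."""
        if key in self.keys:
            self.keys.move_to_end(key)     # refresh recency on a hit
            return True
        if len(self.keys) >= self.capacity:
            self.keys.popitem(last=False)  # deleting step: evict oldest key
        self.keys[key] = None              # storing step: keep the key only
        return False
```

Replaying the recorded access trace through one such shadow instance per candidate policy yields the simulated hit rates used for comparison.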
6. The method of claim 1, wherein before executing the current cache policy and periodically acquiring its cache performance parameters at the preset time interval, the method further comprises:
determining the current cache policy within the cache policy set, and enumerating the information required by each cache policy recorded in the cache policy set;
and wherein executing the current cache policy comprises:
executing the current cache policy and collecting the information required by each cache policy.
7. The method of claim 6, wherein the information required by a cache policy comprises: data access time and data access frequency.
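The per-policy bookkeeping named in claim 7 — data access time and data access frequency — might be collected as follows. The class and field names are invented for this sketch; last-access times are the input an LRU-style policy needs, while access counts serve an LFU-style policy.

```python
import time
from collections import defaultdict

class AccessStats:
    """Records, per data index value, the last access time and the
    cumulative access count, so every policy in the set can be fed the
    information it requires from a single collection pass."""

    def __init__(self, clock=time.monotonic):
        self.clock = clock                 # injectable for testing
        self.last_access = {}              # key -> last access timestamp
        self.frequency = defaultdict(int)  # key -> access count

    def record(self, key):
        """Call on every cache access of the given data index value."""
        self.last_access[key] = self.clock()
        self.frequency[key] += 1
```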
8. The method of claim 1, further comprising:
when the cache performance parameter of the current cache policy is greater than the corresponding preset evaluation value and the execution time of the current cache policy has not reached the preset cache-policy optimization time, continuing to execute the current cache policy.
9. An intelligent cache policy storage apparatus for real-time data applications, characterized by comprising:
an execution unit, configured to execute a current cache policy and periodically acquire cache performance parameters of the current cache policy at a preset time interval; wherein the current cache policy refers to the cache policy currently executed by a storage system;
an obtaining unit, configured to acquire the data index values of all data stored in the current cache when a cache performance parameter of the current cache policy is smaller than a corresponding preset evaluation value or when the execution time of the current cache policy reaches a preset cache-policy optimization time;
the execution unit being further configured to, while executing the current cache policy, simulate the running of candidate cache policies in a cache policy set according to the data index values, and acquire cache performance parameters of the candidate cache policies each time the cache performance parameters of the current cache policy are periodically acquired at the preset time interval; wherein at least two cache policies are preset in the cache policy set, and a candidate cache policy refers to any cache policy in the set other than the one currently executed by the storage system;
and a processing unit, configured to, if the cache performance parameter of at least one candidate cache policy is greater than the corresponding preset evaluation value, determine a replacement cache policy among the at least one candidate cache policy and update the currently executed cache policy to the replacement cache policy.
10. An electronic device, comprising:
a memory having a computer program stored thereon;
a processor for executing the computer program in the memory to carry out the steps of the method of any one of claims 1 to 8.
CN202011271600.3A 2020-11-13 2020-11-13 Intelligent cache strategy storage method, device and equipment for real-time data application Pending CN112367402A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011271600.3A CN112367402A (en) 2020-11-13 2020-11-13 Intelligent cache strategy storage method, device and equipment for real-time data application


Publications (1)

Publication Number Publication Date
CN112367402A true CN112367402A (en) 2021-02-12

Family

ID=74515566

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011271600.3A Pending CN112367402A (en) 2020-11-13 2020-11-13 Intelligent cache strategy storage method, device and equipment for real-time data application

Country Status (1)

Country Link
CN (1) CN112367402A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116107926A * 2023-02-03 2023-05-12 摩尔线程智能科技(北京)有限责任公司 Cache replacement policy management method, device, equipment, medium and program product
CN116107926B * 2023-02-03 2024-01-23 摩尔线程智能科技(北京)有限责任公司 Cache replacement policy management method, device, equipment, medium and program product

Similar Documents

Publication Publication Date Title
CN110751275B (en) Graph training system, data access method and device, electronic device and storage medium
CN111881133B (en) Storage bucket management method and device, computer equipment and readable storage medium
US9372898B2 (en) Enabling event prediction as an on-device service for mobile interaction
CN107943718B (en) Method and device for cleaning cache file
CN111375200B (en) Method and system for intelligently configuring game resources, computer storage medium and equipment
CN105376335A (en) Method and device for collection data uploading
CN106155750A (en) The loading method of a kind of resource file and device
WO2014183514A1 (en) Method, device, and computer storage medium for hierarchical storage
US11294805B2 (en) Fast and safe storage space reclamation for a data storage system
CN112583904A (en) File uploading method, device, equipment and storage medium
CN107704507B (en) Database processing method and device
CN118227595B (en) Data classification and storage method and device based on edge enabling
CN110413228A (en) A kind of mapping table management method, system and electronic equipment and storage medium
US10346281B2 (en) Obtaining and analyzing a reduced metric data set
CN112367402A (en) Intelligent cache strategy storage method, device and equipment for real-time data application
CN117235088B (en) Cache updating method, device, equipment, medium and platform of storage system
CN116756190A (en) Data cache management method, device, terminal equipment and storage medium
Pan et al. Penalty-and locality-aware memory allocation in Redis using enhanced AET
CN110704773B (en) Abnormal behavior detection method and system based on frequent behavior sequence mode
US11379375B1 (en) System and method for cache management
WO2024017177A1 (en) Method and apparatus for executing service, storage medium and device
CN112799910A (en) Hierarchical monitoring method and device
CN111078418B (en) Operation synchronization method, device, electronic equipment and computer readable storage medium
CN105094986B (en) A kind of prediction technique and device of the burst access behavior towards storage system
CN114253458A (en) Method, device and equipment for processing page fault exception of memory and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination