CN112764681A - Cache elimination method and device with weight judgment function and computer equipment - Google Patents


Info

Publication number
CN112764681A
Authority
CN
China
Prior art keywords
cache
data
weight
cached
low
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110080339.7A
Other languages
Chinese (zh)
Other versions
CN112764681B (en)
Inventor
郭浩
Current Assignee
Shanghai Qiniu Information Technology Co ltd
Original Assignee
Shanghai Qiniu Information Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Qiniu Information Technology Co ltd
Priority to CN202110080339.7A
Publication of CN112764681A
Application granted
Publication of CN112764681B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/06: Digital input from, or digital output to, record carriers, e.g. RAID, emulated record carriers or networked record carriers
    • G06F 3/0601: Interfaces specially adapted for storage systems
    • G06F 3/0602: Interfaces specially adapted for storage systems specifically adapted to achieve a particular effect
    • G06F 3/0604: Improving or facilitating administration, e.g. storage management
    • G06F 3/0628: Interfaces specially adapted for storage systems making use of a particular technique
    • G06F 3/0629: Configuration or reconfiguration of storage systems
    • G06F 3/0646: Horizontal data movement in storage systems, i.e. moving data in between storage devices or systems
    • G06F 3/0647: Migration mechanisms
    • G06F 3/0649: Lifecycle management
    • G06F 3/0655: Vertical data movement, i.e. input-output transfer; data movement between one or more hosts and one or more storage devices
    • G06F 3/0656: Data buffering arrangements
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Memory System Of A Hierarchy Structure (AREA)

Abstract

The application relates to a cache eviction method with weight judgment, an apparatus, and computer equipment. The method comprises: obtaining cache structure data to be inserted; judging, based on that data, whether pre-stored cached data is hit; if so, judging whether the weight of the hit cached data is greater than a preset first standard weight; if so, inserting the hit cached data into the cache high-level region of the basic cache data linked list structure and generating a low-weight eviction instruction; and removing the tail-most cached data of the cache low-level region based on the low-weight eviction instruction. On one hand, data is cached according to its weight, so high-weight data stays cached for a long time; on the other hand, the LRU principle is retained, so cached data that is accessed again stays cached longer. Together these achieve fine-grained control of the cache and efficient caching.

Description

Cache elimination method and device with weight judgment function and computer equipment
Technical Field
The present application relates to the field of computer technologies, and in particular to a cache eviction method with weight determination, an apparatus, and a computer device.
Background
There are many common cache eviction methods. For example, the invention patent with application number CN201610720506.9 discloses a method and device for adjusting cache eviction strategies in real time: it samples services under different cache eviction strategies and counts, in real time, the cache hit rate of each strategy; it calculates a switching-overhead factor for switching the current strategy to another according to the hit rates; and it switches the current cache eviction strategy when the switching-overhead factor is below a preset threshold.
Although the solution disclosed in that patent can dynamically tune the characteristic parameters of the cache algorithm through real-time hit-rate feedback, thereby improving the adaptability of the algorithm, raising the hit rate, and improving overall system performance, it still has obvious drawbacks: for example, it cannot keep cached data under long-term, efficient cache management.
Therefore, current cache eviction methods on the market suffer from short cache lifetimes and cannot achieve efficient cache management.
Disclosure of Invention
In view of the foregoing, it is desirable to provide a cache eviction method with weight determination, an apparatus, and a computer device that can cache data and achieve long-term, efficient cache management.
The technical scheme of the invention is as follows:
a cache elimination method with weight judgment comprises the following steps:
step S100: obtaining cache structure data to be inserted;
step S200: judging whether the cached data in a pre-stored basic cache data linked list structure is hit or not based on the obtained cache structure data to be inserted; the basic cache data linked list structure comprises a cache high-level area and a cache low-level area, wherein cached data are cached in the cache high-level area and the cache low-level area in sequence, and each cached data comprises a cached data identifier and a cached data weight;
step S300: if so, judging whether the weight of cached data included in the cached data hit by the data to be inserted into the cache structure is greater than a preset first standard weight;
step S400: if so, inserting the cache data hit by the data to be inserted into the cache structure into a cache high-level area in the basic cache data linked list structure, and generating a low-weight elimination instruction;
step S500: and removing the tail-most cached data of the cache low-level region in the basic cache data linked list structure from the basic cache data linked list structure based on the low-weight elimination instruction.
Specifically, after step S300 (if so, judging whether the weight of the hit cached data is greater than the preset first standard weight), the method further comprises the following steps:
Step S310: if not, generating a cache low-level-region placement instruction;
Step S320: placing the hit cached data in the cache low-level region of the basic cache data linked list structure according to the cache low-level-region placement instruction.
Specifically, after step S200 (judging, based on the obtained cache structure data to be inserted, whether cached data in the pre-stored basic cache data linked list structure is hit), the method further comprises the following steps:
Step S210: if not, customizing a current data weight for the cache structure data to be inserted;
Step S220: inserting the cache structure data to be inserted, carrying the defined current data weight, into the cache low-level region of the basic cache data linked list structure.
Specifically, the cache low-level region of the basic cache data linked list structure comprises a first preset specific number of low-level-region cache bits arranged in sequence;
after step S320 (placing the hit cached data in the cache low-level region according to the cache low-level-region placement instruction), the method further comprises the following steps:
Step S321: placing the hit cached data in the low-level-region cache bit at the first position of the cache low-level region according to the cache low-level-region placement instruction;
Step S322: after that placement, judging whether the number of cache bits occupied by cached data in the cache low-level region exceeds the first preset specific number;
Step S323: if so, removing the cached data in the last low-level-region cache bit of the cache low-level region from the cache low-level region.
Specifically, the cache high-level region of the basic cache data linked list structure comprises a second preset specific number of high-level-region cache bits arranged in sequence;
step S400 (if so, inserting the hit cached data into the cache high-level region of the basic cache data linked list structure) specifically comprises:
inserting the hit cached data into the high-level-region cache bit at the first position of the cache high-level region.
A cache eviction apparatus with weight determination, the apparatus comprising:
a cache-structure-data acquisition module, configured to obtain cache structure data to be inserted;
a first judgment module, configured to judge, based on the obtained cache structure data to be inserted, whether cached data in a pre-stored basic cache data linked list structure is hit; the basic cache data linked list structure comprises a cache high-level region and a cache low-level region, cached data is cached in sequence in both regions, and each item of cached data comprises a cached-data identifier and a cached-data weight;
a second judgment module, configured to judge, if the first judgment is yes, whether the weight of the hit cached data is greater than a preset first standard weight;
a data insertion module, configured to insert, if the second judgment is yes, the hit cached data into the cache high-level region of the basic cache data linked list structure and generate a low-weight eviction instruction;
and a redundant-data removal module, configured to remove the tail-most cached data of the cache low-level region from the basic cache data linked list structure based on the low-weight eviction instruction.
Specifically, the second judgment module further comprises:
a low-level-region placement-instruction generation module, configured to generate a cache low-level-region placement instruction if the judgment is negative;
and a cache low-level-region data placement module, configured to place the hit cached data in the cache low-level region of the basic cache data linked list structure according to the cache low-level-region placement instruction.
Specifically, the first judgment module further comprises:
a current-data-weight customization module, configured to customize a current data weight for the cache structure data to be inserted if the judgment is negative;
and a new-data insertion module, configured to insert the cache structure data to be inserted, carrying the defined current data weight, into the cache low-level region of the basic cache data linked list structure.
A computer device includes a memory and a processor, where the memory stores a computer program, and the processor implements the steps of the above cache eviction method with weight determination when executing the computer program.
A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements the steps of the above-mentioned cache eviction method with weight determination.
The invention has the following technical effects:
according to the cache elimination method with the weight judgment, the device and the computer equipment, the cache structure data to be inserted are obtained firstly; judging whether the cached data in the pre-stored basic cache data linked list structure is hit or not based on the obtained cache structure data to be inserted; the basic cache data linked list structure comprises a cache high-level area and a cache low-level area, wherein cached data are cached in the cache high-level area and the cache low-level area in sequence, and each cached data comprises a cached data identifier and a cached data weight; if so, judging whether the weight of cached data included in the cached data hit by the data to be inserted into the cache structure is greater than a preset first standard weight; if so, inserting the cache data hit by the data to be inserted into the cache structure into a cache high-level area in the basic cache data linked list structure, and generating a low-weight elimination instruction; and finally, removing the latest cached data of a cached lower-level area in the basic cache data linked list structure from the basic cache data linked list structure based on the low-weight elimination instruction, caching the data to be cached through the weight, caching the data with high weight for a longer time through weight setting on the one hand, and giving consideration to an LRU algorithm on the other hand, caching the cached data for a longer time when the cached data is accessed again, realizing fine control of the cache, and realizing high-efficiency cache.
Drawings
FIG. 1 is a flow chart illustrating a cache eviction method with weight determination in an embodiment;
FIG. 2 is a diagram illustrating stored data in a base cache data linked list structure, according to an embodiment;
FIG. 3 is a diagram illustrating stored data in a base cache data linked list structure in another embodiment;
FIG. 4 is a diagram illustrating stored data in a base cache data linked list structure after new data is inserted in an embodiment;
FIG. 5 is a block diagram of an embodiment of a cache eviction apparatus with weight determination;
FIG. 6 is a diagram illustrating an internal structure of a computer device according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
In one embodiment, as shown in fig. 1, a method for cache eviction with weight determination is provided, the method comprising:
step S100: obtaining cache structure data to be inserted;
specifically, the cache structure data to be inserted is data to be cached. Specifically, the data to be inserted into the cache structure has two cases, one is new data, and the other is data that has already been cached.
Step S200: judging, based on the obtained cache structure data to be inserted, whether cached data in a pre-stored basic cache data linked list structure is hit; the basic cache data linked list structure comprises a cache high-level region and a cache low-level region, cached data is cached in sequence in both regions, and each item of cached data comprises a cached-data identifier and a cached-data weight;
specifically, the basic cache data linked list structure is preset in this step; before step S200, the method therefore further includes:
step S201: and acquiring a pre-stored basic cache data linked list structure, wherein the basic cache data linked list structure is realized based on a double structure of a bidirectional linked list and a hash table.
Specifically, basing the pre-stored basic cache data linked list structure on a double structure of a doubly linked list and a hash table makes full use of the linked list for convenient insertion and removal of nodes, and of the hash table for quickly checking whether a node hits the cache table, thereby improving the efficiency of cache management.
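As an illustration of this double structure (a sketch of the general technique, not code from the patent), a minimal doubly linked list paired with a hash table might look like:

```python
class Node:
    """One cached entry: an identifier plus its weight, linked both ways."""
    __slots__ = ("key", "weight", "prev", "next")

    def __init__(self, key, weight):
        self.key, self.weight = key, weight
        self.prev = self.next = None


class LinkedHash:
    """Doubly linked list plus hash table: the dict answers 'is this key
    cached?' in O(1), and the prev/next links allow O(1) unlinking when
    an entry is evicted or moved between regions."""

    def __init__(self):
        self.table = {}            # hash-table side: key -> Node
        self.head = self.tail = None

    def push_front(self, key, weight):
        # Insert a new node at the head of the list (most recent position).
        node = Node(key, weight)
        node.next = self.head
        if self.head is not None:
            self.head.prev = node
        self.head = node
        if self.tail is None:
            self.tail = node
        self.table[key] = node
        return node

    def unlink(self, node):
        # Remove a node from both the list and the table in O(1).
        if node.prev is not None:
            node.prev.next = node.next
        else:
            self.head = node.next
        if node.next is not None:
            node.next.prev = node.prev
        else:
            self.tail = node.prev
        del self.table[node.key]
```

A cache region is then built from these two operations: hit checks go through `table`, and evictions unlink the tail node.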
Step S202: segmenting the basic cache data linked list structure and dividing the basic cache data linked list structure into a cache high-level area and a cache low-level area;
in this embodiment, the cache high-level region and the cache low-level region are as shown in fig. 2, that is, in this embodiment, the basic cache data linked list structure is divided into two segments, which are the cache high-level region and the cache low-level region respectively.
Specifically, as shown in fig. 2, the regions where a8, s9, d7, f6, g8 and h9 are located are the cache higher level regions. The regions where J3, k5, l2, q1, w3 and e4 are located are the cache low-level regions.
Further, in both the cache high-level region and the cache low-level region, the letters a, s, d, f, g, and h are the cached-data identifiers, and the accompanying numbers 8, 9, 7, 6, 8, and 9 are the cached-data weights.
Step S203: and storing data into the cache high-level area and the cache low-level area, wherein the stored data are the cached data, and each cached data comprises a cached data identifier and a cached data weight.
Further, in this step, whether cached data in the pre-stored basic cache data linked list structure is hit is judged based on the obtained cache structure data to be inserted, as follows:
FIG. 3 shows the stored data in the basic cache data linked list structure.
The data in FIG. 3 arose as follows:
when data y with weight 8 is newly inserted, the entry (y, 8) is inserted at the head of the cache low-level region, yielding the state shown in fig. 3.
Next, as shown in fig. 4, if the cache structure data to be inserted is obtained again and is also (y, 8), it is judged, based on the obtained (y, 8), whether cached data in the basic cache data linked list structure shown in fig. 3 is hit. The method then proceeds to step S300.
Step S300: if so, judging whether the weight of the hit cached data is greater than a preset first standard weight;
specifically, "if so" here means it has been determined that the obtained cache structure data to be inserted hits cached data in the pre-stored basic cache data linked list structure.
The preset first standard weight is set in advance. Data whose weight exceeds it is considered high-weight and needs to be cached for a long time, i.e. placed in the cache high-level region.
In this step, the obtained cache structure data (y, 8) has been determined to be already-cached data, so it is judged whether its cached-data weight is greater than the preset first standard weight.
Step S400: if so, inserting the hit cached data into the cache high-level region of the basic cache data linked list structure and generating a low-weight eviction instruction;
specifically, the weight of the hit cached data has been determined to be greater than the preset first standard weight, so the hit cached data must be inserted into the cache high-level region; that is, (y, 8) is inserted into the cache high-level region of the basic cache data linked list structure.
Further, in this embodiment, (y, 8) is inserted at the head of line of the cache high-level region, where the head of line is the first cache bit of that region.
Step S500: removing the tail-most cached data of the cache low-level region from the basic cache data linked list structure based on the low-weight eviction instruction.
In summary, the invention first obtains cache structure data to be inserted; judges, based on it, whether cached data in the pre-stored basic cache data linked list structure is hit; if so, judges whether the weight of the hit cached data is greater than the preset first standard weight; if so, inserts the hit cached data into the cache high-level region and generates a low-weight eviction instruction; and finally removes the tail-most cached data of the cache low-level region based on that instruction. On one hand, data is cached according to its weight, so high-weight data is cached for a long time; on the other hand, the LRU principle is retained, so cached data that is accessed again is cached longer. This achieves fine-grained control of the cache and efficient caching.
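Steps S100 through S500 can be condensed into a runnable sketch. This is an illustrative reading of the patent, not the inventors' code: `OrderedDict` stands in for the linked-list-plus-hash-table structure, and the low-region capacity of 6 and standard weight of 5 are assumptions taken from the embodiment's figures.

```python
from collections import OrderedDict


class WeightedSegmentedCache:
    """Two-segment cache: a high-level region for re-accessed high-weight
    entries and a low-level region for new or low-weight entries."""

    def __init__(self, low_capacity=6, standard_weight=5):
        self.high = OrderedDict()          # cache high-level region
        self.low = OrderedDict()           # cache low-level region
        self.low_capacity = low_capacity   # "first preset specific number"
        self.standard_weight = standard_weight

    def insert(self, key, weight):
        # Step S200: does the incoming data hit already-cached data?
        if key in self.high:
            cached_weight = self.high.pop(key)
        elif key in self.low:
            cached_weight = self.low.pop(key)
        else:
            # Steps S210/S220: new data goes to the head of the low region.
            self.low[key] = weight
            self.low.move_to_end(key, last=False)
            if len(self.low) > self.low_capacity:
                self.low.popitem(last=True)     # drop the queue tail
            return
        # Step S300: compare the hit entry's weight with the standard weight.
        if cached_weight > self.standard_weight:
            # Step S400: promote the hit entry to the head of the high region.
            self.high[key] = cached_weight
            self.high.move_to_end(key, last=False)
            # Step S500: the low-weight eviction instruction removes the
            # tail-most entry of the low region (if it holds any data).
            if self.low:
                self.low.popitem(last=True)
        else:
            # Steps S310/S320: re-place at the head of the low region.
            self.low[key] = cached_weight
            self.low.move_to_end(key, last=False)
```

With this sketch, inserting (y, 8) twice lands y in the high-level region, while (x, 1) inserted twice stays in the low-level region.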
In one embodiment, after step S300 (if so, judging whether the weight of the hit cached data is greater than the preset first standard weight), the method further comprises the following steps:
Step S310: if not, generating a cache low-level-region placement instruction;
specifically, the weight of the hit cached data has been determined not to be greater than the preset first standard weight. The data therefore does not need long-term caching, so the cache low-level-region placement instruction is generated.
Step S320: placing the hit cached data in the cache low-level region of the basic cache data linked list structure according to the cache low-level-region placement instruction.
Specifically, although the cache structure data to be inserted does not need long-term caching, it has occurred again; the hit cached data is therefore placed back into the cache low-level region.
Further, it is placed at the head of the cache low-level region, i.e. the first cache bit of that region.
In one embodiment, after step S200 (judging, based on the obtained cache structure data to be inserted, whether cached data in the pre-stored basic cache data linked list structure is hit), the method further comprises the following steps:
Step S210: if not, customizing a current data weight for the cache structure data to be inserted;
specifically, "if not" means the obtained cache structure data to be inserted does not hit any cached data in the pre-stored basic cache data linked list structure: it is new data that must be given a weight and cached.
Further, before caching, a current data weight is customized for the cache structure data to be inserted. For example, the time t spent obtaining the data can serve as its weight: the longer the acquisition time, the higher the cost of obtaining the data, the higher its weight, and the longer it should stay cached.
Step S220: inserting the cache structure data to be inserted, carrying the defined current data weight, into the cache low-level region of the basic cache data linked list structure.
Specifically, in this step, the data is inserted at the head of line of the cache low-level region of the basic cache data linked list structure.
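The acquisition-time weighting suggested above might be sketched as follows; the bucketing granularity and the 1 to 9 weight range are illustrative assumptions, not part of the patent.

```python
import time


def fetch_with_weight(key, fetch_fn):
    """Fetch a value and derive its cache weight from the time the fetch
    took: costlier-to-obtain data earns a higher weight, and therefore a
    longer cache life."""
    start = time.monotonic()
    value = fetch_fn(key)
    elapsed = time.monotonic() - start
    # Bucket elapsed time into a weight between 1 and 9 (0.1 s per step).
    weight = min(9, max(1, int(elapsed * 10)))
    return value, weight
```

The returned weight can then be passed along with the key when inserting into the cache low-level region.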
In one embodiment, the cache low-level region of the basic cache data linked list structure comprises a first preset specific number of low-level-region cache bits arranged in sequence;
specifically, in this embodiment, the first preset specific number is 6.
After step S320 (placing the hit cached data in the cache low-level region according to the cache low-level-region placement instruction), the method further comprises the following steps:
Step S321: placing the hit cached data in the low-level-region cache bit at the first position of the cache low-level region according to the cache low-level-region placement instruction;
specifically, as shown in fig. 4, (x, 1) is inserted into the lower-level-section cache bit at the first position in the cache lower-level section when (x, 1) is newly inserted.
Step S322: after the cached data hit by the data to be inserted into the cache structure are arranged in the cache bit of the lower-level area at the first position in the cache lower-level area, judging whether the number of cache bits occupied by the cached data in the cache lower-level area exceeds the first preset specific number;
specifically, when (x, 1) is newly inserted, after (x, 1) is inserted into a lower-level region cache bit at a first position in a cache lower-level region, it is necessary to determine whether the number of cache bits occupied by cached data in the cache lower-level region exceeds the first preset specific number.
In this step, it is determined whether the number of cache bits occupied by the cached data in the cache lower-level region exceeds 6.
Step S323: if so, removing the cached data in the last low-level-region cache bit of the cache low-level region from the cache low-level region.
Specifically, in this embodiment the low-level-region cache bits were already full, so the last bit, i.e. the queue tail (e, 4) in fig. 2, is removed; after removal, the state becomes that shown in fig. 4.
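Steps S321 to S323 amount to a bounded queue. A sketch using `deque` (capacity 6 as in this embodiment; the tuples reproduce the figures' example data):

```python
from collections import deque

LOW_CAPACITY = 6  # the "first preset specific number" of this embodiment


def insert_low(low_region, entry):
    """Step S321: place the entry at the first low-level-region cache bit;
    steps S322/S323: if capacity is now exceeded, remove and return the
    occupant of the last cache bit (the queue tail)."""
    low_region.appendleft(entry)
    if len(low_region) > LOW_CAPACITY:
        return low_region.pop()
    return None


# The low-level region of fig. 2, head first:
low = deque([("j", 3), ("k", 5), ("l", 2), ("q", 1), ("w", 3), ("e", 4)])
evicted = insert_low(low, ("x", 1))   # (e, 4) at the queue tail is evicted
```

After the call, (x, 1) occupies the first cache bit and the region again holds exactly six entries, matching the transition from fig. 2 to fig. 4.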
In one embodiment, the cache high-level region in the basic cache data linked list structure comprises a second preset specific number of high-level region cache bits, and the high-level region cache bits are arranged in sequence;
step S400: if so, inserting the cached data hit by the cache structure data to be inserted into the cache high-level region in the basic cache data linked list structure, specifically comprises:
if so, inserting the cached data hit by the cache structure data to be inserted into the high-level region cache bit at the first position in the cache high-level region.
Specifically, the cached data hit by the cache structure data to be inserted is placed in the high-level region cache bit at the first position in the cache high-level region. That is, high-weight data is promoted to the high-level region only when the cache is hit again, so the cache high-level region holds data that is both high-weight and hot, i.e. data that needs to be cached for a longer time. Efficient cache management is thereby achieved.
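The promotion rule just described — a repeat hit on data whose weight exceeds the first standard weight moves it to the head of the high-level region and triggers the low-weight elimination of step S500 — can be sketched as follows. The threshold value, the capacity, and all names are assumptions for illustration, not values from the patent:

```python
from collections import deque

FIRST_STANDARD_WEIGHT = 3  # assumed value of the preset first standard weight
HIGH_CAPACITY = 6          # assumed second preset specific number

def on_repeat_hit(high_region, low_region, item):
    """Promote a repeat-hit (identifier, weight) entry whose weight exceeds
    the first standard weight to the high-level region cache bit at the
    first position, then carry out the low-weight elimination instruction
    by dropping the tail of the low-level region."""
    ident, weight = item
    if weight <= FIRST_STANDARD_WEIGHT:
        return None                 # low-weight data stays in the low-level region
    if item in low_region:
        low_region.remove(item)     # the promoted entry leaves the low-level region
    high_region.appendleft(item)    # first position of the high-level region
    if len(high_region) > HIGH_CAPACITY:
        high_region.pop()
    return low_region.pop() if low_region else None

high, low = deque(), deque([("a", 2), ("x", 5), ("b", 1)])
evicted = on_repeat_hit(high, low, ("x", 5))
print(high[0])  # ('x', 5)
print(evicted)  # ('b', 1)
```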
In one embodiment, as shown in fig. 5, a cache eviction apparatus with weight determination is provided, the apparatus comprising:
the cache structure data acquisition module is used for acquiring cache structure data to be inserted;
the first judgment module is used for judging, based on the acquired cache structure data to be inserted, whether cached data in a pre-stored basic cache data linked list structure is hit; the basic cache data linked list structure comprises a cache high-level region and a cache low-level region, cached data is cached in sequence in the cache high-level region and the cache low-level region, and each piece of cached data comprises a cached data identifier and a cached data weight;
the second judgment module is used for judging, if the hit judgment is affirmative, whether the cached data weight included in the cached data hit by the cache structure data to be inserted is greater than a preset first standard weight;
the data insertion module is used for, if the weight judgment is affirmative, inserting the cached data hit by the cache structure data to be inserted into the cache high-level region in the basic cache data linked list structure and generating a low-weight elimination instruction;
and the redundant data removal module is used for removing, based on the low-weight elimination instruction, the cached data at the very tail of the cache low-level region in the basic cache data linked list structure from the basic cache data linked list structure.
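Taken together, the modules above map onto a single insertion flow. The class below is a hedged sketch of that flow under assumed names, capacities, and threshold; an `OrderedDict` stands in for each region rather than the patent's linked-list-plus-hash-table structure:

```python
from collections import OrderedDict

class WeightedTwoRegionCache:
    """Illustrative sketch: a cache high-level region and low-level region,
    each an ordered identifier -> weight map whose front is the first cache bit."""

    def __init__(self, low_cap=6, high_cap=6, standard_weight=3):
        self.low = OrderedDict()
        self.high = OrderedDict()
        self.low_cap, self.high_cap = low_cap, high_cap
        self.standard_weight = standard_weight  # assumed first standard weight

    def insert(self, ident, weight):
        if ident in self.high:                         # hit in the high-level region
            self.high.move_to_end(ident, last=False)
        elif ident in self.low:                        # hit: second judgment module
            w = self.low[ident]
            if w > self.standard_weight:               # data insertion module:
                del self.low[ident]                    # promote to the high-level region
                self.high[ident] = w
                self.high.move_to_end(ident, last=False)
                self._trim(self.high, self.high_cap)
                self._evict_low_tail()                 # redundant data removal module
            else:                                      # low-level region arrangement
                self.low.move_to_end(ident, last=False)
        else:                                          # miss: new data insertion
            self.low[ident] = weight
            self.low.move_to_end(ident, last=False)
            self._trim(self.low, self.low_cap)

    def _trim(self, region, cap):
        while len(region) > cap:
            region.popitem(last=True)                  # drop the last cache bit

    def _evict_low_tail(self):
        if self.low:                                   # low-weight elimination
            self.low.popitem(last=True)

cache = WeightedTwoRegionCache()
cache.insert("x", 5)          # miss: enters the low-level region
cache.insert("x", 5)          # repeat hit, weight 5 > 3: promoted
print("x" in cache.high)      # True
```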
In one embodiment, the second determining module further comprises:
the low-level area arrangement instruction generation module is used for generating a cache low-level area arrangement instruction if the judgment result is negative;
and the cache low-level region data arrangement module is used for arranging, according to the cache low-level region arrangement instruction, the cached data hit by the cache structure data to be inserted in the cache low-level region in the basic cache data linked list structure.
In one embodiment, the first determining module further comprises:
the current data weight self-defining module is used for self-defining the current data weight of the cache structure data to be inserted if the judgment result is negative;
and the new data inserting module is used for inserting the cache structure data to be inserted with the defined current data weight into a cache low-level region in a basic cache data linked list structure.
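The patent leaves open how the current data weight is self-defined. One illustrative possibility — purely an assumption, not taken from the source — is to derive the weight from object size before the new data insertion module places the entry at the head of the cache low-level region:

```python
def define_weight(payload):
    """Illustrative only: the patent does not specify the weight function;
    here a larger payload gets a higher weight (one point per KiB, min 1)."""
    return max(1, len(payload) // 1024)

def insert_new(low_region, ident, payload):
    """Attach a self-defined weight to a missed entry and place it in the
    first cache bit of the cache low-level region."""
    weight = define_weight(payload)
    low_region.insert(0, (ident, weight))
    return weight

low = []
w = insert_new(low, "obj-1", b"\x00" * 4096)
print(low[0])  # ('obj-1', 4)
```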
In one embodiment, the cache low-level region data placement module further comprises the following modules:
the cache structure data hit module is used for placing, according to the cache low-level region arrangement instruction, the cached data hit by the cache structure data to be inserted in the low-level region cache bit at the first position in the cache low-level region;
a cache bit data judging module, configured to judge whether a number of cache bits occupied by cached data in the cache low-level region exceeds the first preset specific number after the cached data hit by the data to be inserted into the cache structure is placed in a cache bit of the low-level region at a first position in the cache low-level region;
and the cache low-level region end data removal module is used for removing, if the number of occupied cache bits exceeds the first preset specific number, the cached data in the last low-level region cache bit in the cache low-level region from the cache low-level region.
In one embodiment, the data insertion module is further configured to perform the steps of:
if so, inserting the cached data hit by the cache structure data to be inserted into the high-level region cache bit at the first position in the cache high-level region.
In one embodiment, the first determining module is further configured to perform the following steps:
and acquiring a pre-stored basic cache data linked list structure, wherein the basic cache data linked list structure is implemented as a dual structure based on a doubly linked list and a hash table;
segmenting the basic cache data linked list structure into a cache high-level region and a cache low-level region;
and storing data into the cache high-level region and the cache low-level region, wherein the stored data is the cached data, and each piece of cached data comprises a cached data identifier and a cached data weight.
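The dual doubly-linked-list-plus-hash-table structure named above gives O(1) hit lookup via the hash table and O(1) repositioning and tail removal via the list. Below is a minimal sketch of one region; the class and field names are assumptions for illustration:

```python
class Node:
    __slots__ = ("ident", "weight", "prev", "next")
    def __init__(self, ident, weight):
        self.ident, self.weight = ident, weight
        self.prev = self.next = None

class LinkedRegion:
    """One cache region: a doubly linked list with sentinel head/tail nodes,
    indexed by a hash table for O(1) hit lookup by identifier."""

    def __init__(self):
        self.index = {}  # hash table: identifier -> Node
        self.head, self.tail = Node(None, 0), Node(None, 0)
        self.head.next, self.tail.prev = self.tail, self.head

    def push_front(self, ident, weight):
        """Place an entry in the cache bit at the first position."""
        node = Node(ident, weight)
        node.prev, node.next = self.head, self.head.next
        self.head.next.prev = node
        self.head.next = node
        self.index[ident] = node

    def pop_tail(self):
        """Remove and return the entry in the last cache bit, if any."""
        node = self.tail.prev
        if node is self.head:
            return None
        node.prev.next, self.tail.prev = self.tail, node.prev
        del self.index[node.ident]
        return (node.ident, node.weight)

region = LinkedRegion()
region.push_front("a", 2)
region.push_front("b", 5)
print(region.pop_tail())  # ('a', 2)
```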
In one embodiment, as shown in fig. 6, a computer device includes a memory and a processor, the memory stores a computer program, and the processor implements the steps of the above-mentioned cache eviction method with weight determination when executing the computer program.
A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements the steps of the above-mentioned cache eviction method with weight determination.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing relevant hardware. The computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus DRAM (RDRAM), and direct Rambus DRAM (DRDRAM).
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of these technical features are described, but any such combination should be considered as falling within the scope of this specification as long as it contains no contradiction.
The above embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not therefore be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. A cache elimination method with weight judgment is characterized by comprising the following steps:
step S100: obtaining cache structure data to be inserted;
step S200: judging whether the cached data in a pre-stored basic cache data linked list structure is hit or not based on the obtained cache structure data to be inserted; the basic cache data linked list structure comprises a cache high-level area and a cache low-level area, wherein cached data are cached in the cache high-level area and the cache low-level area in sequence, and each cached data comprises a cached data identifier and a cached data weight;
step S300: if so, judging whether the weight of cached data included in the cached data hit by the data to be inserted into the cache structure is greater than a preset first standard weight;
step S400: if so, inserting the cache data hit by the data to be inserted into the cache structure into a cache high-level area in the basic cache data linked list structure, and generating a low-weight elimination instruction;
step S500: and removing the tail-most cached data of the cache low-level region in the basic cache data linked list structure from the basic cache data linked list structure based on the low-weight elimination instruction.
2. The method for cache eviction with weight determination as claimed in claim 1, wherein the step S300: if so, judging whether the weight of cached data included in the cached data hit by the data to be inserted into the cache structure is greater than a preset first standard weight; then also comprises the following steps:
step S310: if not, generating a cache low-level region arrangement instruction;
step S320: and according to the cache low-level region arrangement instruction, arranging the cache data hit by the data to be inserted into the cache structure in a cache low-level region in a basic cache data linked list structure.
3. The method for cache eviction with weight determination as claimed in claim 1, wherein the step S200: judging whether the cached data in a pre-stored basic cache data linked list structure is hit or not based on the obtained cache structure data to be inserted; then, the method further comprises the following steps:
step S210: if not, self-defining the current data weight of the cache structure data to be inserted;
step S220: and inserting the data of the cache structure to be inserted with the defined current data weight into a cache lower-level region in a basic cache data linked list structure.
4. The method of claim 3, wherein the cache low-level regions in the underlying cache data linked list structure comprise a first predetermined specific number of low-level region cache bits, and wherein the low-level region cache bits are arranged in sequence;
step S320: according to the cache low-level region arrangement instruction, arranging the cache data hit by the data to be inserted into the cache structure in a cache low-level region in a basic cache data linked list structure; then also comprises the following steps:
step S321: according to the cache low-level region arrangement instruction, arranging the cached data hit by the data to be inserted into the cache structure at a low-level region cache bit at a first position in the cache low-level region;
step S322: after the cached data hit by the data to be inserted into the cache structure are arranged in the cache bit of the lower-level area at the first position in the cache lower-level area, judging whether the number of cache bits occupied by the cached data in the cache lower-level area exceeds the first preset specific number;
step S323: if so, removing the cached data in the last low-level region cache bit in the cache low-level region from the cache low-level region.
5. The method for cache elimination with weight judgment as claimed in claim 1, wherein the cache high-level region in the basic cache data linked list structure comprises a second preset specific number of high-level region cache bits, and the high-level region cache bits are arranged in sequence;
step S400: if so, inserting the cached data hit by the cache structure data to be inserted into the cache high-level region in the basic cache data linked list structure, specifically comprises:
if so, inserting the cached data hit by the cache structure data to be inserted into the high-level region cache bit at the first position in the cache high-level region.
6. A cache elimination device with weight judgment is characterized by comprising:
the cache structure data acquisition module is used for acquiring cache structure data to be inserted;
the first judgment module is used for judging whether cached data in a prestored basic cache data linked list structure are hit or not based on the acquired to-be-inserted cache structure data; the basic cache data linked list structure comprises a cache high-level area and a cache low-level area, wherein cached data are cached in the cache high-level area and the cache low-level area in sequence, and each cached data comprises a cached data identifier and a cached data weight;
the second judgment module is used for judging whether the weight of cached data included in the cached data hit by the cache structure data to be inserted is greater than a preset first standard weight or not if the judgment is yes;
the data inserting module is used for inserting the cache data hit by the cache structure data to be inserted into the cache high-level region in the basic cache data linked list structure and generating a low-weight eliminating instruction if the judgment result is yes;
and the redundant data removing module is used for removing the cached data at the tail end of the cache low-level region in the basic cache data linked list structure from the basic cache data linked list structure based on the low-weight elimination instruction.
7. The cache eviction device with weight determination of claim 6, wherein the second determination module further comprises:
the low-level area arrangement instruction generation module is used for generating a cache low-level area arrangement instruction if the judgment result is negative;
and the cache low-level region data arrangement module is used for arranging the cache data hit by the data to be inserted into the cache structure in a cache low-level region in a basic cache data linked list structure according to the cache low-level region arrangement instruction.
8. The cache eviction device with weight determination of claim 6, wherein the first determining module further comprises:
the current data weight self-defining module is used for self-defining the current data weight of the cache structure data to be inserted if the judgment result is negative;
and the new data inserting module is used for inserting the cache structure data to be inserted with the defined current data weight into a cache low-level region in a basic cache data linked list structure.
9. A computer device comprising a memory and a processor, the memory storing a computer program, wherein the processor implements the steps of the method of any one of claims 1 to 5 when executing the computer program.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 5.
CN202110080339.7A 2021-01-21 2021-01-21 Cache elimination method and device with weight judgment and computer equipment Active CN112764681B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110080339.7A CN112764681B (en) 2021-01-21 2021-01-21 Cache elimination method and device with weight judgment and computer equipment

Publications (2)

Publication Number Publication Date
CN112764681A true CN112764681A (en) 2021-05-07
CN112764681B CN112764681B (en) 2024-02-13

Family

ID=75703582


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114153760A (en) * 2021-12-02 2022-03-08 北京乐讯科技有限公司 Method, system and storage medium for eliminating healthy value storage cache based on weight

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5822759A (en) * 1996-11-22 1998-10-13 Versant Object Technology Cache system
CN105975402A (en) * 2016-04-28 2016-09-28 华中科技大学 Caching method and system for eliminated data perception in hybrid memory environment
CN108334460A (en) * 2017-05-25 2018-07-27 中兴通讯股份有限公司 data cache method and device
CN111159066A (en) * 2020-01-07 2020-05-15 杭州电子科技大学 Dynamically-adjusted cache data management and elimination method
CN111708720A (en) * 2020-08-20 2020-09-25 北京思明启创科技有限公司 Data caching method, device, equipment and medium
CN111722797A (en) * 2020-05-18 2020-09-29 西安交通大学 SSD and HA-SMR hybrid storage system oriented data management method, storage medium and device
CN111930316A (en) * 2020-09-09 2020-11-13 上海七牛信息技术有限公司 Cache read-write system and method for content distribution network

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
詹玲 等: "SHCA:基于RAID的两级缓存算法设计与实现", 小型微型计算机系统, no. 05, 31 May 2017 (2017-05-31), pages 1152 - 1157 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant