CN112764681B - Cache elimination method and device with weight judgment and computer equipment - Google Patents


Info

Publication number
CN112764681B
CN112764681B
Authority
CN
China
Prior art keywords
cache
data
low
level region
weight
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110080339.7A
Other languages
Chinese (zh)
Other versions
CN112764681A (en)
Inventor
Guo Hao (郭浩)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Qiniu Information Technology Co ltd
Original Assignee
Shanghai Qiniu Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Qiniu Information Technology Co ltd filed Critical Shanghai Qiniu Information Technology Co ltd
Priority to CN202110080339.7A priority Critical patent/CN112764681B/en
Publication of CN112764681A publication Critical patent/CN112764681A/en
Application granted granted Critical
Publication of CN112764681B publication Critical patent/CN112764681B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/06 - Digital input from, or digital output to, record carriers, e.g. RAID, emulated record carriers or networked record carriers
    • G06F3/0601 - Interfaces specially adapted for storage systems
    • G06F3/0602 - Interfaces specially adapted for storage systems specifically adapted to achieve a particular effect
    • G06F3/0604 - Improving or facilitating administration, e.g. storage management
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/06 - Digital input from, or digital output to, record carriers, e.g. RAID, emulated record carriers or networked record carriers
    • G06F3/0601 - Interfaces specially adapted for storage systems
    • G06F3/0628 - Interfaces specially adapted for storage systems making use of a particular technique
    • G06F3/0629 - Configuration or reconfiguration of storage systems
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/06 - Digital input from, or digital output to, record carriers, e.g. RAID, emulated record carriers or networked record carriers
    • G06F3/0601 - Interfaces specially adapted for storage systems
    • G06F3/0628 - Interfaces specially adapted for storage systems making use of a particular technique
    • G06F3/0646 - Horizontal data movement in storage systems, i.e. moving data in between storage devices or systems
    • G06F3/0647 - Migration mechanisms
    • G06F3/0649 - Lifecycle management
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/06 - Digital input from, or digital output to, record carriers, e.g. RAID, emulated record carriers or networked record carriers
    • G06F3/0601 - Interfaces specially adapted for storage systems
    • G06F3/0628 - Interfaces specially adapted for storage systems making use of a particular technique
    • G06F3/0655 - Vertical data movement, i.e. input-output transfer; data movement between one or more hosts and one or more storage devices
    • G06F3/0656 - Data buffering arrangements
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 - Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Memory System Of A Hierarchy Structure (AREA)

Abstract

The application relates to a cache elimination method with weight judgment, together with a corresponding device and computer equipment. The method comprises: obtaining data to be inserted into the cache; judging, based on that data, whether pre-stored cached data is hit; if so, judging whether the weight carried by the hit cached data is larger than a preset first standard weight; if so, inserting the hit cached data into the cache high-level region of the basic cache data linked list structure and generating a low-weight elimination instruction; and, based on that instruction, removing the cached data at the end of the cache low-level region. On the one hand, the invention caches data according to weight, so that high-weight data stays cached for a long time; on the other hand, it also takes the LRU algorithm into account, so that cached data that is accessed again is kept even longer. Fine-grained cache control and efficient caching are thereby achieved.

Description

Cache elimination method and device with weight judgment and computer equipment
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a method, an apparatus, and a computer device for cache elimination with weight determination.
Background
Many cache elimination methods are known. For example, the invention patent with application number CN201610720506.9 discloses a method and device for real-time adjustment of a cache elimination strategy: the service is sampled under different cache elimination strategies, and the cache hit rate of each strategy is counted in real time; a switching overhead factor for switching from the current cache elimination strategy to another is calculated from the hit rates; and when the switching overhead factor is smaller than a preset threshold, the current cache elimination strategy is switched.
Although the technical scheme disclosed in the above patent document can dynamically adjust the characteristic parameters of the cache algorithm by feeding back the real-time hit rate of the cached data, thereby improving the adaptability of the algorithm, raising the cache hit rate and improving overall system performance, it still has significant drawbacks: for example, it cannot keep cached data under long-term, efficient cache management.
Therefore, the cache elimination methods currently on the market suffer from short cache lifetimes and an inability to realize efficient cache management.
Disclosure of Invention
In view of the foregoing, it is desirable to provide a cache elimination method, apparatus and computer device with weight judgment that can cache data and realize long-term, efficient cache management.
The technical scheme of the invention is as follows:
a cache elimination method with weight judgment comprises the following steps:
step S100: obtaining cache structure data to be inserted;
step S200: judging whether cached data in a pre-stored basic cache data linked list structure is hit or not based on the acquired data to be inserted into the cache structure; the basic cache data chain table structure comprises a cache high-level region and a cache low-level region, wherein the cache high-level region and the cache low-level region are respectively and sequentially cached data, and each cached data comprises a cached data identifier and a cached data weight;
step S300: if yes, judging whether cached data weight included in the cached data hit to be inserted into the cache structure data is larger than a preset first standard weight or not;
step S400: if yes, inserting the cached data to be inserted into the cache structure data hit into a cache high-level area in the basic cache data linked list structure, and generating a low-weight elimination instruction;
step S500: and removing the buffered data at the end of the buffer low-level zone in the basic buffer data linked list structure from the basic buffer data linked list structure based on the low-weight elimination instruction.
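The five steps above can be sketched in Python. This is a minimal illustration, not the patent's implementation: the class name `WeightedCache`, the capacity of 6 and the standard weight of 5 are assumed values chosen for the example (the patent fixes only the first preset specific number of 6 in a later embodiment).

```python
from collections import OrderedDict

class WeightedCache:
    """Sketch of the two-segment cache: a high-level region for re-hit,
    high-weight data, and a low-level region for everything else."""

    def __init__(self, low_capacity=6, standard_weight=5):
        # The front of each OrderedDict plays the role of the queue head.
        self.low = OrderedDict()
        self.high = OrderedDict()
        self.low_capacity = low_capacity
        self.standard_weight = standard_weight  # the "preset first standard weight"

    def insert(self, key, weight):
        # Step S200: judge whether the incoming datum hits cached data.
        if key in self.high or key in self.low:
            cached_weight = (self.high.pop(key) if key in self.high
                             else self.low.pop(key))
            # Step S300: compare the cached weight with the standard weight.
            if cached_weight > self.standard_weight:
                # Step S400: insert at the head of the cache high-level region.
                self.high[key] = cached_weight
                self.high.move_to_end(key, last=False)
                # Step S500: remove the datum at the end of the low-level region.
                if self.low:
                    self.low.popitem(last=True)
            else:
                # Low weight: place back at the head of the low-level region.
                self.low[key] = cached_weight
                self.low.move_to_end(key, last=False)
        else:
            # Miss: the new datum enters at the head of the low-level region.
            self.low[key] = weight
            self.low.move_to_end(key, last=False)
            if len(self.low) > self.low_capacity:
                self.low.popitem(last=True)
```

A new datum thus always starts in the low-level region; only a re-hit with sufficient weight promotes it to the high-level region.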
Specifically, after step S300 (judging whether the weight of the hit cached data is larger than the preset first standard weight), the method further comprises:
step S310: if not, generating a cache low-level region placement instruction;
step S320: according to the cache low-level region placement instruction, placing the hit cached data in the cache low-level region of the basic cache data linked list structure.
Specifically, after step S200 (judging whether cached data in the pre-stored basic cache data linked list structure is hit), the method further comprises:
step S210: if not, custom-defining the current data weight of the data to be inserted;
step S220: inserting the data to be inserted, with its defined current data weight, into the cache low-level region of the basic cache data linked list structure.
Specifically, the cache low-level region of the basic cache data linked list structure comprises a first preset specific number of low-level region cache bits, arranged in sequence;
after step S320 (placing the hit cached data in the cache low-level region according to the cache low-level region placement instruction), the method further comprises:
step S321: according to the cache low-level region placement instruction, placing the hit cached data in the low-level region cache bit at the first position of the cache low-level region;
step S322: after that placement, judging whether the number of cache bits occupied by cached data in the cache low-level region exceeds the first preset specific number;
step S323: if so, removing the cached data in the last low-level region cache bit from the cache low-level region.
Specifically, the cache high-level region of the basic cache data linked list structure comprises a second preset specific number of high-level region cache bits, arranged in sequence;
step S400 (inserting the hit cached data into the cache high-level region of the basic cache data linked list structure) specifically comprises:
if so, inserting the hit cached data into the high-level region cache bit at the first position of the cache high-level region.
A cache elimination device with weight judgment, the device comprising:
a cache structure data acquisition module, used for obtaining data to be inserted into the cache;
a first judging module, used for judging, based on the obtained data to be inserted, whether cached data in a pre-stored basic cache data linked list structure is hit; the basic cache data linked list structure comprises a cache high-level region and a cache low-level region, each holding cached data in sequence, and each cached datum comprises a cached data identifier and a cached data weight;
a second judging module, used for judging, if the first judgment is positive, whether the weight of the hit cached data is larger than a preset first standard weight;
a data insertion module, used for inserting, if the second judgment is positive, the hit cached data into the cache high-level region of the basic cache data linked list structure and generating a low-weight elimination instruction;
and a redundant data removal module, used for removing the cached data at the end of the cache low-level region of the basic cache data linked list structure from that structure, based on the low-weight elimination instruction.
Specifically, the second judging module further includes:
a low-level region placement instruction generation module, used for generating a cache low-level region placement instruction if the judgment is negative;
and a cache low-level region data placement module, used for placing the hit cached data in the cache low-level region of the basic cache data linked list structure according to the placement instruction.
Specifically, the first judging module further includes:
a current data weight self-defining module, used for custom-defining the current data weight of the data to be inserted if the judgment is negative;
and a new data insertion module, used for inserting the data to be inserted, with its defined current data weight, into the cache low-level region of the basic cache data linked list structure.
A computer device comprising a memory and a processor, said memory storing a computer program, said processor implementing the steps of the above-described cache elimination method with weight determination when executing said computer program.
A computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of the above-described cache elimination method with weight determination.
The invention has the following technical effects:
the method, the device and the computer equipment for eliminating the cache with weight judgment firstly acquire the data of the structure to be inserted into the cache; judging whether cached data in a pre-stored basic cache data linked list structure is hit or not based on the acquired data to be inserted into the cache structure; the basic cache data chain table structure comprises a cache high-level region and a cache low-level region, wherein the cache high-level region and the cache low-level region are respectively and sequentially cached data, and each cached data comprises a cached data identifier and a cached data weight; if yes, judging whether cached data weight included in the cached data hit by the cache structure data to be inserted is larger than a preset first standard weight; then if yes, inserting the cached data hit by the data to be inserted into the cache structure into a cache high-level area in the basic cache data linked list structure, and generating a low-weight elimination instruction; and finally, removing the buffered data of the end of the buffer low-level region in the basic buffer data chain table structure from the basic buffer data chain table structure based on the low-weight elimination instruction, on one hand, buffering the data to be buffered through weight, and on the other hand, realizing the data buffering of high weight for a long time through weight setting, and on the other hand, also taking into account the LRU algorithm, buffering the buffered data for a longer time when the buffered data is accessed again, thereby realizing the fine control of buffering and realizing high-efficiency buffering.
Drawings
FIG. 1 is a flow chart of a method for cache elimination with weight determination in one embodiment;
FIG. 2 is a diagram of stored data in a basic cache data linked list structure in one embodiment;
FIG. 3 is a schematic diagram of stored data in a basic cache data linked list structure according to another embodiment;
FIG. 4 is a diagram of data already stored in a basic cache data linked list structure after a new insertion of data in one embodiment;
FIG. 5 is a block diagram of a cache elimination apparatus with weight determination according to an embodiment;
fig. 6 is an internal structural diagram of a computer device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
In one embodiment, as shown in fig. 1, there is provided a cache elimination method with weight judgment, the method including:
step S100: obtaining cache structure data to be inserted;
specifically, the data to be inserted into the cache structure is data to be cached. Specifically, there are two cases where the cache structure data to be inserted is new data, and the other is already cached data.
Step S200: judging, based on the obtained data to be inserted, whether cached data in a pre-stored basic cache data linked list structure is hit; the basic cache data linked list structure comprises a cache high-level region and a cache low-level region, each holding cached data in sequence, and each cached datum comprises a cached data identifier and a cached data weight;
Specifically, in this step the basic cache data linked list structure has been preset. Before step S200, the method further comprises:
step S201: acquiring a pre-stored basic cache data linked list structure, wherein the structure is realized on the basis of a dual structure combining a doubly linked list and a hash table.
Specifically, because the pre-stored basic cache data linked list structure is based on the dual structure of a doubly linked list and a hash table, the doubly linked list makes it convenient to insert and remove nodes, while the hash table makes it convenient to check whether a lookup hits the cache table, thereby improving cache management efficiency.
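The dual structure described above can be sketched as follows. The class and method names are illustrative assumptions; the point is that the doubly linked list gives O(1) insertion and removal at either end, while the hash table (a Python `dict` here) gives O(1) hit lookup.

```python
class Node:
    """One cached datum: identifier, weight, and linked-list pointers."""
    __slots__ = ("key", "weight", "prev", "next")

    def __init__(self, key, weight):
        self.key, self.weight = key, weight
        self.prev = self.next = None

class LinkedRegion:
    """One cache region: a doubly linked list for O(1) insert/remove at
    either end, plus a hash table mapping identifier -> node for O(1)
    hit checks."""

    def __init__(self):
        self.head = Node(None, None)   # sentinel before the queue head
        self.tail = Node(None, None)   # sentinel after the queue end
        self.head.next, self.tail.prev = self.tail, self.head
        self.index = {}                # key -> Node (the hash table)

    def push_front(self, key, weight):
        node = Node(key, weight)
        node.prev, node.next = self.head, self.head.next
        self.head.next.prev = node
        self.head.next = node
        self.index[key] = node

    def remove(self, key):
        node = self.index.pop(key)
        node.prev.next, node.next.prev = node.next, node.prev
        return node.weight

    def pop_back(self):
        """Evict the datum at the end of the region, if any."""
        if self.tail.prev is self.head:
            return None
        return self.remove(self.tail.prev.key)
```

Both the cache high-level region and the cache low-level region can be instances of such a structure, with the hit check performed against both hash tables.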
Step S202: segmenting the basic cache data linked list structure into a cache high-level region and a cache low-level region;
In this embodiment, the cache high-level region and the cache low-level region are shown in fig. 2; that is, the basic cache data linked list structure is divided into two segments, the cache high-level region and the cache low-level region.
Specifically, as shown in fig. 2, the area holding a8, s9, d7, f6, g8 and h9 is the cache high-level region, and the area holding j3, k5, l2, q1, w3 and e4 is the cache low-level region.
Further, in these two regions the letters a, s, d, f, g and h are cached data identifiers, and the trailing numbers 8, 9, 7, 6, 8 and 9 are the corresponding cached data weights.
Step S203: storing data into the cache high-level region and the cache low-level region, the stored data being the cached data, each cached datum comprising a cached data identifier and a cached data weight.
Further, in this step it is judged, based on the obtained data to be inserted, whether the cached data in the pre-stored basic cache data linked list structure is hit, as follows:
Fig. 3 shows the stored data in the basic cache data linked list structure.
The data in fig. 3 arises as follows: when the datum y, with weight 8, is newly inserted, (y, 8) is placed at the queue head of the cache low-level region, giving the state shown in fig. 3.
Next, as shown in fig. 4, if the data to be inserted is again (y, 8), it is judged, based on the obtained (y, 8), whether the cached data in the basic cache data linked list structure of fig. 3 is hit. Step S300 is then performed.
Step S300: if so, judging whether the weight of the hit cached data is larger than a preset first standard weight;
Specifically, "if so" here means that it has been determined, based on the obtained data to be inserted, that cached data in the pre-stored basic cache data linked list structure is hit.
The preset first standard weight is set in a customized way in advance. A weight larger than this standard weight indicates high-weight data that needs to be cached for a long time, i.e. data that should be placed in the cache high-level region.
In this step, having determined that the data to be inserted, (y, 8), is already cached data, it is then judged whether the cached weight of (y, 8) is larger than the preset first standard weight.
Step S400: if so, inserting the hit cached data into the cache high-level region of the basic cache data linked list structure, and generating a low-weight elimination instruction;
Specifically, "if so" here means that the weight of the hit cached data has been determined to be larger than the preset first standard weight, so the hit cached data must now be inserted into the cache high-level region of the basic cache data linked list structure; that is, (y, 8) is inserted into the cache high-level region.
Further, in this embodiment, (y, 8) is inserted at the queue head of the cache high-level region of the basic cache data linked list structure, the queue head being the first cache bit of that region.
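The promotion of (y, 8) can be traced concretely. The pre-promotion region contents and the standard weight of 5 below are assumed for illustration (the patent does not state the concrete threshold, and the exact low-region contents after fig. 3 are not spelled out in this excerpt).

```python
STANDARD_WEIGHT = 5  # assumed value for the preset first standard weight

# Assumed state after fig. 3: (y, 8) sits at the head of the low-level region.
low = [("y", 8), ("j", 3), ("k", 5), ("l", 2), ("q", 1), ("w", 3), ("e", 4)]
high = [("a", 8), ("s", 9), ("d", 7), ("f", 6), ("g", 8), ("h", 9)]

incoming = ("y", 8)  # the same datum arrives again and hits the cache
key, weight = incoming
hit = any(k == key for k, _ in low + high)
if hit and weight > STANDARD_WEIGHT:
    low = [(k, w) for k, w in low if k != key]  # take it out of the low region
    high = [incoming] + high                    # step S400: insert at the high-region head
    low = low[:-1]                              # step S500: drop the low-region tail
```

After the trace, (y, 8) heads the cache high-level region and the low-weight elimination instruction has removed the datum at the end of the cache low-level region.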
Step S500: removing the cached data at the end of the cache low-level region of the basic cache data linked list structure from that structure, based on the low-weight elimination instruction.
Specifically, the method first obtains the data to be inserted into the cache; judges, based on that data, whether cached data in the pre-stored basic cache data linked list structure is hit, the structure comprising a cache high-level region and a cache low-level region that each hold cached data in sequence, each cached datum comprising an identifier and a weight; if so, judges whether the weight of the hit cached data is larger than a preset first standard weight; if so, inserts the hit cached data into the cache high-level region and generates a low-weight elimination instruction; and finally removes the cached data at the end of the cache low-level region, based on that instruction. On the one hand, data is cached according to weight, so that high-weight data stays cached for a long time; on the other hand, the LRU algorithm is also taken into account, so that cached data that is accessed again is kept even longer. Fine-grained cache control and efficient caching are thereby achieved.
In one embodiment, after step S300 (judging whether the weight of the hit cached data is larger than the preset first standard weight), the method further comprises:
step S310: if not, generating a cache low-level region placement instruction;
Specifically, when it is determined that the weight of the hit cached data is not larger than the preset first standard weight, the data to be inserted does not need long-term caching, so a cache low-level region placement instruction is generated.
Step S320: according to the cache low-level region placement instruction, placing the hit cached data in the cache low-level region of the basic cache data linked list structure.
Specifically, when the data to be inserted does not need long-term caching but has appeared again, the hit cached data is placed in the cache low-level region of the basic cache data linked list structure.
Further, the hit cached data is placed at the queue head of the cache low-level region, i.e. the first cache bit of that region.
In one embodiment, after step S200 (judging whether cached data in the pre-stored basic cache data linked list structure is hit), the method further comprises:
step S210: if not, custom-defining the current data weight of the data to be inserted;
Specifically, "if not" means that the data to be inserted is not among the cached data in the basic cache data linked list structure; it is new data, and its weight must be defined before it is cached.
Further, before caching, the current data weight of the data to be inserted must be custom-defined. For example, the time t needed to acquire the datum may serve as the weight: the longer the acquisition time, the higher the weight, because the cost of acquiring the datum is higher and it should therefore be cached longer.
Step S220: inserting the data to be inserted, with its defined current data weight, into the cache low-level region of the basic cache data linked list structure.
Specifically, in this step the data to be inserted, whose current data weight has been defined, is inserted at the queue head of the cache low-level region of the basic cache data linked list structure.
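The acquisition-time weighting example above can be sketched as follows. The function name, the millisecond scaling and the 1..9 clamp are illustrative assumptions; the patent only says that a longer acquisition time should yield a higher weight.

```python
import time

def fetch_with_weight(key, fetch_fn):
    """Fetch a datum and derive its current data weight from the fetch
    cost: the longer the acquisition time t, the higher the weight.
    The 1..9 scale is an assumption made for this sketch."""
    start = time.monotonic()
    value = fetch_fn(key)
    elapsed = time.monotonic() - start          # acquisition time t, in seconds
    weight = min(9, 1 + int(elapsed * 1000))    # clamp the cost into 1..9
    return value, weight
```

The returned pair (value, weight) would then be inserted at the queue head of the cache low-level region as described in step S220.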
In one embodiment, the cache low-level region of the basic cache data linked list structure comprises a first preset specific number of low-level region cache bits, arranged in sequence;
Specifically, in this embodiment, the first preset specific number is 6.
After step S320 (placing the hit cached data in the cache low-level region according to the placement instruction), the method further comprises:
step S321: according to the cache low-level region placement instruction, placing the hit cached data in the low-level region cache bit at the first position of the cache low-level region;
Specifically, as shown in fig. 4, when (x, 1) is newly inserted, (x, 1) is placed in the low-level region cache bit at the first position of the cache low-level region.
Step S322: after that placement, judging whether the number of cache bits occupied by cached data in the cache low-level region exceeds the first preset specific number;
Specifically, after (x, 1) is placed in the first low-level region cache bit, it must be judged whether the number of cache bits occupied by cached data in the cache low-level region exceeds the first preset specific number.
In this step, it is judged whether the number of cache bits occupied by cached data in the cache low-level region exceeds 6.
Step S323: if so, removing the cached data in the last low-level region cache bit from the cache low-level region.
Specifically, in this embodiment the low-level region cache bits are full, so the datum in the last bit, i.e. the tail (e, 4) of fig. 2, is removed, yielding the state shown in fig. 4.
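Steps S321 to S323 can be traced with the fig. 2 low-level region contents. This is a sketch: the helper name `insert_low` is assumed, and a `deque` stands in for the region's cache bits, with the front of the deque as the queue head.

```python
from collections import deque

FIRST_PRESET_NUMBER = 6  # the embodiment's first preset specific number

# Cache low-level region from fig. 2, queue head first.
low = deque([("j", 3), ("k", 5), ("l", 2), ("q", 1), ("w", 3), ("e", 4)])

def insert_low(region, item, capacity=FIRST_PRESET_NUMBER):
    region.appendleft(item)        # step S321: occupy the first cache bit
    if len(region) > capacity:     # step S322: check the occupied bit count
        return region.pop()        # step S323: evict the last cache bit
    return None

evicted = insert_low(low, ("x", 1))
```

Inserting (x, 1) pushes the region over its 6-bit capacity, so the tail datum (e, 4) is evicted, matching the fig. 2 to fig. 4 transition described above.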
In one embodiment, the cache high-level region in the basic cache data linked list structure comprises a second preset specific number of high-level region cache bits, and the high-level region cache bits are sequentially arranged;
step S400: if yes, inserting the cached data hit by the cache structure data to be inserted into the cache high-level region in the basic cache data linked list structure, which specifically comprises the following step:
and if yes, inserting the cached data hit by the cache structure data to be inserted into the high-level region cache bit at the first position in the cache high-level region.
Specifically, the cached data hit by the cache structure data to be inserted is placed in the high-level region cache bit at the first position in the cache high-level region. In other words, data enters the high-level region only when it is hit again in the cache and carries a high weight, so the cache high-level region holds the high-weight, hot data, i.e. the data that needs to be cached for a longer time. Efficient cache management is thereby achieved.
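As a hedged sketch of this re-hit rule (the function name and the concrete threshold value are assumptions, not taken from the patent text), promotion to the high-level region can be expressed as:

```python
from collections import OrderedDict

STANDARD_WEIGHT = 3  # "preset first standard weight" (example value, assumed)

def on_hit(high: OrderedDict, low: OrderedDict, key):
    """Re-hit handling: promote the entry to the first high-level region
    cache bit only if its weight exceeds the standard weight; otherwise
    refresh it at the first position of the low-level region."""
    weight = low.get(key, high.get(key))
    if weight is None:
        return False                       # not a hit at all
    if weight > STANDARD_WEIGHT:           # step S300 yes-branch
        low.pop(key, None)
        high[key] = weight
        high.move_to_end(key, last=False)  # step S400: head of high region
        return True
    if key in low:                         # low-weight hit stays in the
        low.move_to_end(key, last=False)   # low-level region (steps S310/S320)
    return False
```

Only data that is both re-hit and above the weight threshold crosses into the high-level region, which is what keeps that region populated with hot, high-weight entries.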
In one embodiment, as shown in fig. 5, a cache elimination device with weight judgment is provided, where the device includes:
the cache structure data acquisition module is used for acquiring cache structure data to be inserted;
the first judging module is used for judging, based on the acquired cache structure data to be inserted, whether cached data in a pre-stored basic cache data linked list structure is hit; the basic cache data linked list structure comprises a cache high-level region and a cache low-level region, the cache high-level region and the cache low-level region each store cached data in sequence, and each piece of cached data comprises a cached data identifier and a cached data weight;
the second judging module is used for judging, if the first judgment is yes, whether the cached data weight included in the cached data hit by the cache structure data to be inserted is greater than a preset first standard weight;
the data insertion module is used for, if the second judgment is yes, inserting the cached data hit by the cache structure data to be inserted into the cache high-level region in the basic cache data linked list structure and generating a low-weight elimination instruction;
and the redundant data removing module is used for removing the cached data at the end of the cache low-level region in the basic cache data linked list structure from the basic cache data linked list structure based on the low-weight elimination instruction.
In one embodiment, the second determining module further includes:
the low-level region placement instruction generation module is used for generating a cache low-level region placement instruction if the judgment is no;
and the cache low-level region data placement module is used for placing the cached data hit by the cache structure data to be inserted in the cache low-level region in the basic cache data linked list structure according to the cache low-level region placement instruction.
In one embodiment, the first determining module further includes:
the current data weight customization module is used for customizing the current data weight of the cache structure data to be inserted if the judgment is no;
and the new data insertion module is used for inserting the cache structure data to be inserted, with its current data weight customized, into the cache low-level region in the basic cache data linked list structure.
In one embodiment, the cache low-level region data placement module further includes the following modules:
the low-level region cache bit data placement module is used for placing the cached data hit by the cache structure data to be inserted in the low-level region cache bit at the first position in the cache low-level region according to the cache low-level region placement instruction;
the cache bit data judging module is used for judging, after the cached data hit by the cache structure data to be inserted has been placed in the low-level region cache bit at the first position in the cache low-level region, whether the number of cache bits occupied by the cached data in the cache low-level region exceeds the first preset specific number;
and the cache low-level region end data removing module is used for removing the cached data in the last low-level region cache bit in the cache low-level region from the cache low-level region if the judgment is yes.
In one embodiment, the data insertion module is further configured to perform the steps of:
and if yes, inserting the cached data hit by the cache structure data to be inserted into the high-level region cache bit at the first position in the cache high-level region.
In one embodiment, the first judging module is further configured to perform the following steps:
and acquiring a pre-stored basic cache data linked list structure, wherein the basic cache data linked list structure is implemented as a dual structure combining a doubly linked list and a hash table;
segmenting the basic cache data linked list structure into a cache high-level region and a cache low-level region;
and storing data into the cache high-level region and the cache low-level region, wherein the stored data are the cached data, and each piece of cached data comprises a cached data identifier and a cached data weight.
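Putting the pieces together, the segmented structure and access flow described above can be sketched compactly. A Python `OrderedDict` is itself backed by a hash table over a doubly linked list, which approximates the "dual structure" without hand-rolled node pointers. The class name and parameters are illustrative assumptions, and the capacities and weight threshold are taken from this embodiment's example values, not mandated by the claims.

```python
from collections import OrderedDict

class SegmentedCache:
    """Two-region cache sketch: the high-level region holds re-hit,
    high-weight (hot) data; the low-level region holds new and
    low-weight data. Head of each region is its first cache bit."""

    def __init__(self, low_bits=6, high_bits=6, standard_weight=3):
        self.low = OrderedDict()    # cache low-level region
        self.high = OrderedDict()   # cache high-level region
        self.low_bits = low_bits
        self.high_bits = high_bits
        self.standard_weight = standard_weight

    def access(self, key, weight=1):
        hit = key in self.low or key in self.high           # step S200
        if hit:
            w = self.low.pop(key, None)
            if w is None:
                w = self.high.pop(key)
            if w > self.standard_weight:                    # steps S300/S400
                self.high[key] = w
                self.high.move_to_end(key, last=False)
                if len(self.high) > self.high_bits:
                    self.high.popitem(last=True)
                if self.low:                                # step S500: low-weight
                    self.low.popitem(last=True)             # elimination at low tail
                return
            # low-weight hit: back to the head of the low region (S310/S320)
        else:
            w = weight                                      # step S210: custom weight
        self.low[key] = w                                   # steps S220/S321
        self.low.move_to_end(key, last=False)
        if len(self.low) > self.low_bits:                   # steps S322/S323
            self.low.popitem(last=True)
```

A miss lands the entry at the head of the low-level region; a re-hit either refreshes it there or, when its weight exceeds the threshold, promotes it to the high-level region while the low-level tail is eliminated.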
In one embodiment, as shown in fig. 6, a computer device includes a memory and a processor, where the memory stores a computer program, and the processor implements the steps of the cache elimination method with weight determination described above when executing the computer program.
A computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of the above-described cache elimination method with weight determination.
Those skilled in the art will appreciate that implementing all or part of the above described methods may be accomplished by way of a computer program stored on a non-transitory computer readable storage medium which, when executed, may comprise the steps of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in the various embodiments provided herein may include non-volatile and/or volatile memory. The non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM), among others.
The technical features of the above embodiments may be arbitrarily combined, and all possible combinations of the technical features in the above embodiments are not described for brevity of description, however, as long as there is no contradiction between the combinations of the technical features, they should be considered as the scope of the description.
The above examples merely represent several embodiments of the present application; their descriptions are relatively specific and detailed, but they are not therefore to be construed as limiting the scope of the invention. It should be noted that various modifications and improvements could be made by those of ordinary skill in the art without departing from the concept of the present application, and these all fall within the scope of protection of the present application. Accordingly, the scope of protection of the present application shall be determined by the appended claims.

Claims (10)

1. A cache elimination method with weight judgment, characterized by comprising the following steps:
step S100: obtaining cache structure data to be inserted;
step S200: judging, based on the acquired cache structure data to be inserted, whether cached data in a pre-stored basic cache data linked list structure is hit; the basic cache data linked list structure comprises a cache high-level region and a cache low-level region, the cache high-level region and the cache low-level region each store cached data in sequence, and each piece of cached data comprises a cached data identifier and a cached data weight;
step S300: if yes, judging whether the cached data weight included in the cached data hit by the cache structure data to be inserted is greater than a preset first standard weight;
step S400: if yes, inserting the cached data hit by the cache structure data to be inserted into the cache high-level region in the basic cache data linked list structure, and generating a low-weight elimination instruction;
step S500: removing the cached data at the end of the cache low-level region in the basic cache data linked list structure from the basic cache data linked list structure based on the low-weight elimination instruction.
2. The cache elimination method with weight judgment according to claim 1, characterized in that, after step S300 (if yes, judging whether the cached data weight included in the cached data hit by the cache structure data to be inserted is greater than a preset first standard weight), the method further comprises:
step S310: if no, generating a cache low-level region placement instruction;
step S320: placing, according to the cache low-level region placement instruction, the cached data hit by the cache structure data to be inserted in the cache low-level region in the basic cache data linked list structure.
3. The cache elimination method with weight judgment according to claim 1, characterized in that, after step S200 (judging, based on the acquired cache structure data to be inserted, whether cached data in a pre-stored basic cache data linked list structure is hit), the method further comprises:
step S210: if no, customizing the current data weight of the cache structure data to be inserted;
step S220: inserting the cache structure data to be inserted, with its current data weight customized, into the cache low-level region in the basic cache data linked list structure.
4. The cache elimination method with weight judgment according to claim 3, characterized in that the cache low-level region in the basic cache data linked list structure comprises a first preset specific number of low-level region cache bits, and the low-level region cache bits are sequentially arranged;
after step S320 (placing, according to the cache low-level region placement instruction, the cached data hit by the cache structure data to be inserted in the cache low-level region in the basic cache data linked list structure), the method further comprises:
step S321: placing, according to the cache low-level region placement instruction, the cached data hit by the cache structure data to be inserted in the low-level region cache bit at the first position in the cache low-level region;
step S322: after the cached data hit by the cache structure data to be inserted has been placed in the low-level region cache bit at the first position in the cache low-level region, judging whether the number of cache bits occupied by the cached data in the cache low-level region exceeds the first preset specific number;
step S323: if yes, removing the cached data in the last low-level region cache bit in the cache low-level region from the cache low-level region.
5. The cache elimination method with weight judgment according to claim 1, characterized in that the cache high-level region in the basic cache data linked list structure comprises a second preset specific number of high-level region cache bits, and the high-level region cache bits are sequentially arranged;
step S400 (if yes, inserting the cached data hit by the cache structure data to be inserted into the cache high-level region in the basic cache data linked list structure) specifically comprises:
and if yes, inserting the cached data hit by the cache structure data to be inserted into the high-level region cache bit at the first position in the cache high-level region.
6. A cache elimination device with weight judgment, characterized in that the device comprises:
the cache structure data acquisition module is used for acquiring cache structure data to be inserted;
the first judging module is used for judging, based on the acquired cache structure data to be inserted, whether cached data in a pre-stored basic cache data linked list structure is hit; the basic cache data linked list structure comprises a cache high-level region and a cache low-level region, the cache high-level region and the cache low-level region each store cached data in sequence, and each piece of cached data comprises a cached data identifier and a cached data weight;
the second judging module is used for judging, if the first judgment is yes, whether the cached data weight included in the cached data hit by the cache structure data to be inserted is greater than a preset first standard weight;
the data insertion module is used for, if the second judgment is yes, inserting the cached data hit by the cache structure data to be inserted into the cache high-level region in the basic cache data linked list structure and generating a low-weight elimination instruction;
and the redundant data removing module is used for removing the cached data at the end of the cache low-level region in the basic cache data linked list structure from the basic cache data linked list structure based on the low-weight elimination instruction.
7. The cache elimination device with weight judgment according to claim 6, wherein the second judgment module further comprises:
the low-level region placement instruction generation module is used for generating a cache low-level region placement instruction if the judgment is no;
and the cache low-level region data placement module is used for placing the cached data hit by the cache structure data to be inserted in the cache low-level region in the basic cache data linked list structure according to the cache low-level region placement instruction.
8. The cache elimination device with weight judgment according to claim 6, wherein the first judgment module further comprises:
the current data weight customization module is used for customizing the current data weight of the cache structure data to be inserted if the judgment is no;
and the new data insertion module is used for inserting the cache structure data to be inserted, with its current data weight customized, into the cache low-level region in the basic cache data linked list structure.
9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the method of any one of claims 1 to 5 when the computer program is executed.
10. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 5.
CN202110080339.7A 2021-01-21 2021-01-21 Cache elimination method and device with weight judgment and computer equipment Active CN112764681B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110080339.7A CN112764681B (en) 2021-01-21 2021-01-21 Cache elimination method and device with weight judgment and computer equipment


Publications (2)

Publication Number Publication Date
CN112764681A CN112764681A (en) 2021-05-07
CN112764681B true CN112764681B (en) 2024-02-13

Family

ID=75703582

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110080339.7A Active CN112764681B (en) 2021-01-21 2021-01-21 Cache elimination method and device with weight judgment and computer equipment

Country Status (1)

Country Link
CN (1) CN112764681B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114153760B (en) * 2021-12-02 2022-07-29 北京乐讯科技有限公司 Method, system and storage medium for eliminating healthy value storage cache based on weight

Citations (7)

Publication number Priority date Publication date Assignee Title
US5822759A (en) * 1996-11-22 1998-10-13 Versant Object Technology Cache system
CN105975402A (en) * 2016-04-28 2016-09-28 华中科技大学 Caching method and system for eliminated data perception in hybrid memory environment
CN108334460A (en) * 2017-05-25 2018-07-27 中兴通讯股份有限公司 data cache method and device
CN111159066A (en) * 2020-01-07 2020-05-15 杭州电子科技大学 Dynamically-adjusted cache data management and elimination method
CN111708720A (en) * 2020-08-20 2020-09-25 北京思明启创科技有限公司 Data caching method, device, equipment and medium
CN111722797A (en) * 2020-05-18 2020-09-29 西安交通大学 SSD and HA-SMR hybrid storage system oriented data management method, storage medium and device
CN111930316A (en) * 2020-09-09 2020-11-13 上海七牛信息技术有限公司 Cache read-write system and method for content distribution network


Non-Patent Citations (1)

Title
SHCA: Design and Implementation of a Two-Level Cache Algorithm Based on RAID; Zhan Ling et al.; Journal of Chinese Computer Systems; 2017-05-31 (05); 1152-1157 *

Also Published As

Publication number Publication date
CN112764681A (en) 2021-05-07

Similar Documents

Publication Publication Date Title
CN103257831B (en) The read/writing control method of memorizer and the memorizer of correspondence
CN112764681B (en) Cache elimination method and device with weight judgment and computer equipment
US7793071B2 (en) Method and system for reducing cache conflicts
CN107368437B (en) Last-level cache management method and system
EP2851810A1 (en) Management of a memory
CN111966299A (en) Wear leveling method and device for Nand Flash
CN112183746A (en) Neural network pruning method, system and device for sensitivity analysis and reinforcement learning
CN112528098A (en) Data query method, system, electronic equipment and storage medium
CN108829616A (en) A kind of data cached management method, device, computer equipment and storage medium
CN117273165A (en) Network model fine-tuning method, system and equipment suitable for community scene
CN115905323B (en) Searching method, device, equipment and medium suitable for various searching strategies
CN113268440B (en) Cache elimination method and system
US20190377674A1 (en) Data storage device with wear range optimization
CN112306417B (en) Method for shortening power-on recovery time of SSD
CN110502457B (en) Metadata storage method and device
CN113434438B (en) Method for prolonging FLASH write-in life of smart card
CN113742304B (en) Data storage method of hybrid cloud
CN115203072A (en) File pre-reading cache allocation method and device based on access heat
CN111723834B (en) Voice deep learning training method and device
CN110688084A (en) First-in first-out FLASH data storage method, system and terminal
CN113608679A (en) File storage method, device, equipment and computer readable storage medium
CN102130963A (en) Method, device and system for storing file in client network
CN113608675B (en) RAID data IO processing method and device, computer equipment and medium
CN110442854B (en) Report generation method and device, computer equipment and readable storage medium
CN110889524A (en) Power failure reservation application processing method and device, computer equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant