CN116257181A - Cache data cleaning method and device, computing equipment and storage medium

Info

Publication number
CN116257181A
Authority
CN
China
Prior art keywords
cache
request
data
processing
source data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310007415.0A
Other languages
Chinese (zh)
Inventor
张�杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hainan Chezhiyi Communication Information Technology Co ltd
Original Assignee
Hainan Chezhiyi Communication Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hainan Chezhiyi Communication Information Technology Co ltd filed Critical Hainan Chezhiyi Communication Information Technology Co ltd
Priority to CN202310007415.0A priority Critical patent/CN116257181A/en
Publication of CN116257181A publication Critical patent/CN116257181A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/06: Digital input from, or digital output to, record carriers, e.g. RAID, emulated record carriers or networked record carriers
    • G06F 3/0601: Interfaces specially adapted for storage systems
    • G06F 3/0628: Interfaces specially adapted for storage systems making use of a particular technique
    • G06F 3/0646: Horizontal data movement in storage systems, i.e. moving data in between storage devices or systems
    • G06F 3/0652: Erasing, e.g. deleting, data cleaning, moving of data to a wastebasket
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 12/00: Accessing, addressing or allocating within memory systems or architectures
    • G06F 12/02: Addressing or allocation; Relocation
    • G06F 12/08: Addressing or allocation; Relocation in hierarchically structured memory systems, e.g. virtual memory systems
    • G06F 12/0802: Addressing of a memory level in which the access to the desired data or data block requires associative addressing means, e.g. caches
    • G06F 12/0891: Addressing of a memory level in which the access to the desired data or data block requires associative addressing means, e.g. caches, using clearing, invalidating or resetting means
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention discloses a cache data cleaning method and apparatus, a computing device, and a storage medium, relating to the technical field of caching. The method is executed in a computing device and comprises: in response to a request for source data, determining whether the request carries a cache-cleaning flag; if it does, recording the cache-cleaning flag in the context used to process the request; and, for each processing function with a cache annotation that is invoked while processing the request, deleting the cached data corresponding to the source data and then executing the processing function. According to the technical scheme of the invention, the cached data of all source data on the request link can be cleaned and updated automatically, and the cache-cleaning process becomes simpler and more efficient.

Description

Cache data cleaning method and device, computing equipment and storage medium
Technical Field
The present invention relates to the field of cache technologies, and in particular, to a method, an apparatus, a computing device, and a storage medium for cleaning cache data.
Background
Caching is a primary means of improving the performance of a web site or application, since it avoids repeatedly creating, processing, and transmitting the same data. However, because cached data is only a snapshot of the actual data, it becomes stale as soon as the data source is modified. The cached data therefore needs to be cleaned and updated.
Currently, cleaning and updating caches is difficult for the following reasons: (1) cached content is usually scattered, and a single page or interface request from a user may consume several cached results; for a vehicle model list, for example, the model names and guide prices may belong to one cache item while price-reduction data belongs to another; (2) caching modes vary, such as caching in a distributed system like Redis or caching in application memory; (3) with the rise of micro-service architectures, caches for the same piece of data may exist in different micro-services.
The existing way of cleaning and updating caches is usually to write cache-cleanup code (for example, around methods carrying an @Cache annotation) at every place in the Java code where it is needed. This approach is easy to understand but cumbersome to implement: whenever the source data is modified, cleanup and update code has to be written by hand at each site, which is inefficient. In a system with complex call chains the logic becomes very convoluted; each newly added cache item requires its own cleanup code, and each newly added application requires its own cleanup code as well. This creates a great deal of repetitive work, and the large volume of code increases the likelihood of bugs.
A cache data cleaning method is therefore needed to solve the problems described above.
Disclosure of Invention
Accordingly, the present invention provides a method and apparatus for cleaning cache data, which solve or at least alleviate the above-mentioned problems.
According to one aspect of the present invention, there is provided a cache data cleaning method, executed in a computing device, the method comprising: in response to a request for source data, determining whether the request carries a cache-cleaning flag; if the cache-cleaning flag is carried, recording the cache-cleaning flag in a context used to process the request; and, for each processing function with a cache annotation that is invoked to process the request, deleting the cached data corresponding to the source data and executing the processing function.
Optionally, in the cache data cleaning method according to the present invention, deleting the cached data corresponding to the source data comprises: determining whether the cache-cleaning flag exists in the context used to process the request; and if the cache-cleaning flag exists, deleting the cached data corresponding to the source data.
Optionally, in the cache data cleaning method according to the present invention, for each processing function with a cache annotation that is invoked to process the request, deleting the cached data corresponding to the source data comprises: determining, for each processing function invoked to process the request, whether the processing function carries a cache annotation; if the cache annotation exists, determining whether the cache-cleaning flag exists in the context used to process the request; and if the cache-cleaning flag exists, deleting the cached data corresponding to the source data.
Optionally, in the cache data cleaning method according to the present invention, executing the processing function comprises: determining whether a third-party service interface needs to be called; and if so, adding the cache-cleaning flag to the third-party request parameters and calling the third-party service interface based on those parameters.
Optionally, in the cache data cleaning method according to the present invention, the computing device is communicatively connected to a client and to a management background, respectively; before responding to the request for source data, the method comprises: receiving a request for source data sent by the client; and receiving a request carrying a cache-cleaning flag sent by the management background.
Optionally, in the cache data cleaning method according to the present invention, deleting the cached data corresponding to the source data comprises: determining a cache key from the source data; and deleting the cached data corresponding to the cache key.
Optionally, in the cache data cleaning method according to the present invention, the method further comprises: adding a cache annotation in advance to each processing function used to process the request.
According to one aspect of the present invention, there is provided a cache data cleaning apparatus residing in a computing device, the apparatus comprising: a judging module adapted to, in response to a request for source data, determine whether the request carries a cache-cleaning flag; a recording module adapted to record the cache-cleaning flag, if carried, in a context used to process the request; and an execution module adapted to, for each processing function with a cache annotation that is invoked to process the request, delete the cached data corresponding to the source data and execute the processing function.
According to one aspect of the invention, there is provided a computing device comprising: at least one processor; and a memory storing program instructions, wherein the program instructions are configured to be executed by the at least one processor and comprise instructions for performing the cache data cleaning method described above.
According to one aspect of the present invention, there is provided a readable storage medium storing program instructions that, when read and executed by a computing device, cause the computing device to perform the cache data cleaning method described above.
According to the technical scheme of the invention, when a request carrying a cache-cleaning flag is received, the flag is recorded in the context used to process the request. While the request is being processed, for each processing function with a cache annotation that the request invokes, the cached data corresponding to the source data is deleted and the processing function is then executed, which refreshes the cached data of the source data; this continues until all processing functions with cache annotations have been called and the request has been processed. Thus, during request processing, the cached data of the source data is cleaned and updated automatically for every invoked processing function, the cache-cleaning process becomes simpler and more efficient, and all cached data on the request link is cleaned and updated in a unified way.
The foregoing is only an overview of the technical solution of the present invention. In order that the technical means of the present invention may be understood more clearly and carried out in accordance with the contents of this specification, and in order to make the above and other objects, features, and advantages of the present invention more readily apparent, specific embodiments of the invention are set forth below.
Drawings
To the accomplishment of the foregoing and related ends, certain illustrative aspects are described herein in connection with the following description and the annexed drawings, which set forth the various ways in which the principles disclosed herein may be practiced, and all aspects and equivalents thereof are intended to fall within the scope of the claimed subject matter. The above, as well as additional objects, features, and advantages of the present disclosure will become more apparent from the following detailed description when read in conjunction with the accompanying drawings. Like reference numerals generally refer to like parts or elements throughout the present disclosure.
FIG. 1 shows a schematic diagram of a computing device 100 according to one embodiment of the invention;
FIG. 2 shows a flow diagram of a cache data cleaning method 200 according to one embodiment of the invention;
FIG. 3 shows a schematic flow of cache data cleaning according to one embodiment of the invention;
FIG. 4 shows a schematic diagram of a cache data cleaning apparatus 400 according to an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
FIG. 1 shows a schematic diagram of a computing device 100 according to one embodiment of the invention. As shown in FIG. 1, in a basic configuration, computing device 100 includes at least one processing unit 102 and a system memory 104. According to one aspect, the processing unit 102 may be implemented as a processor, depending on the configuration and type of computing device. The system memory 104 includes, but is not limited to, volatile storage (e.g., random access memory), non-volatile storage (e.g., read only memory), flash memory, or any combination of such memories. According to one aspect, an operating system 105 is included in system memory 104.
According to one aspect, operating system 105 is suitable, for example, for controlling the operation of computing device 100. Further, examples are practiced in connection with a graphics library, other operating systems, or any other application program and are not limited to any particular application or system. This basic configuration is illustrated in fig. 1 by those components within the dashed line. According to one aspect, computing device 100 has additional features or functionality. For example, according to one aspect, computing device 100 includes additional data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape. Such additional storage is illustrated in fig. 1 by removable storage device 109 and non-removable storage device 110.
As set forth above, according to one aspect, the program modules 103 are stored in the system memory 104. According to one aspect, the program modules 103 may include one or more applications; the invention does not limit the type of application. For example, the applications may include: email and contacts applications, word processing applications, spreadsheet applications, database applications, slide show applications, drawing or computer-aided design applications, web browser applications, and the like.
According to one aspect, the program modules 103 may comprise a cache data cleaning apparatus 400, which contains a plurality of program instructions adapted to perform the cache data cleaning method 200 of the present invention, so that cached data is cleaned and updated by performing the method 200.
According to one aspect, the examples may be practiced in a circuit comprising discrete electronic components, a packaged or integrated electronic chip containing logic gates, a circuit utilizing a microprocessor, or on a single chip containing electronic components or a microprocessor. For example, the examples may be practiced via a system on a chip (SOC) in which each or many of the components shown in FIG. 1 may be integrated on a single integrated circuit. According to one aspect, such SOC devices may include one or more processing units, graphics units, communication units, system virtualization units, and various application functions, all of which are integrated (or "burned") onto a chip substrate as a single integrated circuit. When operating via an SOC, the functionality described herein may be operated via dedicated logic integrated with other components of computing device 100 on a single integrated circuit (chip). Embodiments of the invention may also be practiced using other techniques capable of performing logical operations (e.g., AND, OR, and NOT), including but not limited to mechanical, optical, fluidic, and quantum techniques. In addition, embodiments of the invention may be practiced within a general purpose computer or in any other circuit or system.
According to one aspect, the computing device 100 may also have one or more input devices 112, such as a keyboard, mouse, pen, voice input device, touch input device, and the like. Output device(s) 114 such as a display, speakers, printer, etc. may also be included. The foregoing devices are examples and other devices may also be used. Computing device 100 may include one or more communication connections 116 that allow communication with other computing devices 118. Examples of suitable communication connections 116 include, but are not limited to: RF transmitter, receiver and/or transceiver circuitry; universal Serial Bus (USB), parallel and/or serial ports.
The term computer readable media as used herein includes computer storage media. Computer storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information (e.g., computer readable instructions, data structures, or program modules 103). System memory 104, removable storage 109, and non-removable storage 110 are all examples of computer storage media (i.e., memory storage). Computer storage media may include Random Access Memory (RAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory or other memory technology, CD-ROM, Digital Versatile Disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other article of manufacture that can be used to store information and that can be accessed by computing device 100. According to one aspect, any such computer storage media may be part of computing device 100. Computer storage media do not include a carrier wave or other propagated data signal.
According to one aspect, communication media is embodied by computer readable instructions, data structures, program modules 103, or other data in a modulated data signal (e.g., carrier wave or other transport mechanism) and includes any information delivery media. According to one aspect, the term "modulated data signal" describes a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio Frequency (RF), infrared, and other wireless media.
In an embodiment according to the invention, the computing device 100 is configured to perform the cache data cleaning method 200 according to the invention. The computing device 100 includes one or more processors and one or more readable storage media storing program instructions that, when executed by the one or more processors, cause the computing device to perform the cache data cleaning method 200 of the embodiments of the invention.
According to one embodiment of the invention, a cache data cleaning apparatus 400 is deployed in the computing device 100 and is configured to perform the cache data cleaning method 200 according to the invention. The apparatus 400 contains a plurality of program instructions that can instruct the processor to execute the method 200, thereby cleaning and updating the cached data.
FIG. 2 illustrates a flow diagram of a cache data cleaning method 200 according to one embodiment of the invention, and FIG. 3 shows a schematic flow of cache data cleaning according to one embodiment of the invention. The cache data cleaning method 200 is adapted to be executed in a cache data cleaning apparatus 400 of a computing device, such as the aforementioned computing device 100.
As shown in fig. 2 and 3, method 200 begins at step 210.
First, in step 210, in response to a request for source data, it is determined, before the request is processed, whether the request carries a cache-cleaning flag. Here, the cache-cleaning flag is a parameter indicating that the cache should be cleaned.
Specifically, before the request is processed, whether it carries a cache-cleaning flag can be determined by means of aspect-oriented programming (AOP).
In one embodiment, the computing device is communicatively connected to a client and to a management background (the management background of the source data), respectively. It should be noted that when a user needs to access source data, the client may request the source data from the computing device 100. When an operation for cleaning cached data needs to be triggered, a developer may send, from the management background, a request carrying a cache-cleaning flag to the computing device 100.
In this embodiment, before responding to a request for source data, the computing device 100 may receive a request for source data sent by the client, and may receive a request carrying a cache-cleaning flag sent from the management background.
Source data refers to the original data stored in the database. Because the storage structure of the database may not match the structure presented to the client or application, the source data usually has to be processed and transformed when it is read, which makes reading it slow. By caching the source data, access can be sped up by reading the cached data instead. However, because the cached data can become inconsistent with the source data, the cached data needs to be deleted and updated.
If it is determined that the request carries a cache-cleaning flag, step 220 is performed.
In step 220, the cache-cleaning flag is recorded in the context used to process the request. Next, the one or more processing functions required to process the request may be invoked.
If, on the other hand, the request does not carry a cache-cleaning flag, the one or more processing functions required to process the request are invoked directly.
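As an illustration of steps 210 and 220, the following Java sketch shows one possible way to detect the flag before the request is processed and to record it in a per-request context: a servlet filter backed by a ThreadLocal. This is only a sketch under assumptions not stated in the patent: the parameter name "cleanCache", the class and package names, and the use of the Servlet API (the patent merely requires an aspect-oriented check before processing).

package com.example.cache;

import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;

/** Hypothetical per-request context that holds the cache-cleaning flag (step 220). */
public final class CacheCleanContext {
    private static final ThreadLocal<Boolean> CLEAN_FLAG =
            ThreadLocal.withInitial(() -> Boolean.FALSE);

    public static void markClean() { CLEAN_FLAG.set(Boolean.TRUE); }
    public static boolean shouldClean() { return CLEAN_FLAG.get(); }
    public static void clear() { CLEAN_FLAG.remove(); }
}

/** Hypothetical filter that checks the request for the cache-cleaning flag before it is processed (step 210). */
class CacheCleanFilter implements Filter {
    @Override
    public void init(FilterConfig filterConfig) { }

    @Override
    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
            throws IOException, ServletException {
        try {
            // "cleanCache" is an assumed name for the cache-cleaning flag parameter.
            if ("true".equalsIgnoreCase(request.getParameter("cleanCache"))) {
                CacheCleanContext.markClean(); // record the flag in the request-processing context
            }
            chain.doFilter(request, response); // hand the request on to the processing functions
        } finally {
            CacheCleanContext.clear(); // do not leak the flag to the next request handled by this thread
        }
    }

    @Override
    public void destroy() { }
}

Under these assumptions, a management-background request would simply append cleanCache=true to the ordinary request for the source data.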
Finally, in step 230, for each processing function with a cache annotation that is invoked for the request, the cached data corresponding to the source data is deleted before the processing function is executed; this is the cleaning operation that allows the cached data to be updated. After the cached data corresponding to the source data has been deleted, the processing function is executed, completing the call and refreshing the cached data of the source data.
Step 230 is performed for each processing function invoked for the request; once all of them have been executed, the processing of the request is finished and the cached data of the source data has been cleaned and updated.
It should also be noted that, before the cached data corresponding to the source data is deleted, it may first be determined whether the cache-cleaning flag exists in the context used to process the request; if the flag exists, the cached data corresponding to the source data is deleted, and if it does not, the processing function is simply executed.
In one embodiment, a cache key corresponding to the source data may be computed from the source data, and the cached data corresponding to that cache key is then deleted, thereby deleting the cached data corresponding to the source data.
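A minimal sketch of this key-based deletion follows, assuming a Redis cache accessed through the Jedis client; the key scheme (a fixed prefix plus the identifier of the source data) and all names are illustrative, since the patent does not prescribe a particular cache store or key format.

import redis.clients.jedis.Jedis;

/** Illustrative helper that derives a cache key from the source data and evicts the entry. */
public class CacheKeyEvictor {

    private final Jedis jedis = new Jedis("localhost", 6379); // assumed local Redis instance

    /** Assumed key scheme: prefix plus the identifier of the source data, e.g. "carSeries:price:1234". */
    String keyFor(String prefix, long sourceId) {
        return prefix + ":" + sourceId;
    }

    /** Deletes the cached data corresponding to the given piece of source data. */
    public void evictPrice(long seriesId) {
        jedis.del(keyFor("carSeries:price", seriesId));
    }
}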
In one embodiment, in step 230, for each processing function invoked for the request, it is first determined, before the processing function is executed, whether the processing function carries a cache annotation. Here, the cache annotation is, for example, an @Cache or @Cached annotation.
If the processing function carries the cache annotation, it is treated as a processing function with a cache annotation, and it is then further determined whether the context used to process the request contains the cache-cleaning flag. If the context does contain the cache-cleaning flag, the cached data corresponding to the source data is deleted.
It should also be noted that, before the method 200 of the present invention is performed, a cache annotation (for example, an @Cache or @Cached annotation) may be added in advance to each processing function used to process a request (i.e., a request for source data). Here, a processing function is a function that needs to be called in order to obtain source data that is to be cached; a hypothetical annotation and an annotated processing function are sketched below.
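The sketch below shows what such a custom annotation and an annotated processing function might look like; the annotation name, its keyPrefix attribute, and the CarPriceService example are assumptions for illustration rather than the patent's own definitions.

package com.example.cache;

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

/** Hypothetical cache annotation marking processing functions whose results are cached. */
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
public @interface Cached {
    /** Prefix used when deriving the cache key for this function's result. */
    String keyPrefix();
}

/** Example processing function called when the "vehicle series price" source data is requested (illustrative). */
class CarPriceService {
    @Cached(keyPrefix = "carSeries:price")
    public String loadPrice(long seriesId) {
        // In a real system this would read and assemble the source data from the database.
        return "guide price for series " + seriesId;
    }
}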
After the processing function (with the cache annotation) has been executed, the source data is re-cached: executing the processing function produces new cached data corresponding to the source data, which is stored in the cache, thereby updating the cached data of the source data.
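Tying these pieces together, step 230 could be realized as a Spring AOP around-advice like the sketch below. It assumes the @Cached annotation, the CacheCleanContext, and the Jedis-backed cache from the earlier sketches, derives the cache key from the annotation prefix and the first method argument, and assumes String-valued cache entries; none of these specifics are mandated by the patent.

package com.example.cache;

import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.springframework.stereotype.Component;
import redis.clients.jedis.Jedis;

/**
 * Intercepts every processing function carrying the @Cached annotation (step 230):
 * if the request context carries the cache-cleaning flag, the stale entry is deleted
 * first; the processing function is then executed and its fresh result is re-cached.
 */
@Aspect
@Component
public class CacheCleanAspect {

    private final Jedis jedis = new Jedis("localhost", 6379); // assumed cache store

    @Around("@annotation(cached)")
    public Object aroundCachedFunction(ProceedingJoinPoint pjp, Cached cached) throws Throwable {
        // Assumed key scheme: annotation prefix plus the first method argument.
        String key = cached.keyPrefix() + ":" + pjp.getArgs()[0];

        if (CacheCleanContext.shouldClean()) {
            jedis.del(key);              // cleaning requested: delete the old cached data first
        } else {
            String hit = jedis.get(key); // normal path: serve from the cache when possible
            if (hit != null) {
                return hit;
            }
        }

        Object fresh = pjp.proceed();          // execute the processing function (reads the source data)
        jedis.set(key, String.valueOf(fresh)); // store the new result, updating the cached data
        return fresh;
    }
}

With an arrangement of this kind, adding a new cached item only requires annotating its processing function; no dedicated cleanup code has to be written for it.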
In one embodiment, when the processing function is executed, it is also determined whether a third-party service interface needs to be invoked.
If a third-party service interface needs to be called, the cache-cleaning flag is automatically added to the third-party request parameters, and the third-party service interface is called with the parameters that now carry the flag. If no third-party service interface needs to be called, execution of the processing function simply continues.
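One way this propagation could look in code is sketched below, assuming an HTTP-based third-party service and the flag name used in the earlier sketches; the endpoint URL and parameter name are hypothetical.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

/** Illustrative third-party call that forwards the cache-cleaning flag when it is set. */
public class ThirdPartyClient {

    private final HttpClient http = HttpClient.newHttpClient();

    public String fetchPriceCut(long seriesId, boolean cleanCache) throws Exception {
        // Hypothetical downstream endpoint; when the flag is set it is appended to the
        // request parameters so that the downstream service also cleans its own cache.
        String url = "https://discount-service.example.com/price-cut?seriesId=" + seriesId;
        if (cleanCache) {
            url = url + "&cleanCache=true";
        }
        HttpRequest request = HttpRequest.newBuilder(URI.create(url)).GET().build();
        HttpResponse<String> response = http.send(request, HttpResponse.BodyHandlers.ofString());
        return response.body();
    }
}

In practice the cleanCache argument would be taken from the request context sketched earlier (CacheCleanContext.shouldClean()), so the flag flows across services without any per-call bookkeeping.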
In addition, while the processing function continues to execute, it is further determined whether other processing functions need to be called; if so, logic similar to that of step 230 is applied to those processing functions.
It should be noted that, according to the above embodiment of the present invention, automatically adding the cache-cleaning flag to the third-party request parameters when a third-party service interface is invoked allows cached data to be cleaned and updated in cross-application scenarios as well.
It should be noted that, for each processing function with a cache annotation that is invoked for the request, the present invention cleans and updates the cached data by performing step 230 above; once all such processing functions have been called, the processing of the request is complete, and all cached data on the request link has been cleaned and updated in a unified way.
FIG. 4 shows a schematic diagram of a cache data cleaning apparatus 400 according to an embodiment of the present invention. The cache data cleaning apparatus 400 resides in the computing device 100 and may be configured to perform the cache data cleaning method 200 of the present invention.
As shown in FIG. 4, the cache data cleaning apparatus 400 includes a judging module 410, a recording module 420, and an executing module 430, which are sequentially connected in communication.
The judging module 410 may, in response to a request for source data, determine whether the request carries a cache-cleaning flag. If the cache-cleaning flag is carried, the recording module 420 may record it in the context used to process the request. The executing module 430 may, for each processing function with a cache annotation that is invoked to process the request, delete the cached data corresponding to the source data and then execute the processing function.
It should be noted that the judging module 410 is configured to perform step 210 above, the recording module 420 is configured to perform step 220 above, and the executing module 430 is configured to perform step 230 above. For the specific execution logic of these modules, reference may be made to the descriptions of steps 210 to 230 in the method 200 above, which are not repeated here.
According to the cache data cleaning method 200 of the present invention, when a request carrying a cache-cleaning flag is received, the flag is recorded in the context used to process the request. While the request is being processed, for each processing function with a cache annotation that the request invokes, the cached data corresponding to the source data is deleted and the processing function is then executed to refresh the cached data, until all processing functions with cache annotations have been called and the request has been processed. Thus, during request processing, the cached data of the source data is cleaned and updated automatically for every invoked processing function, the cache-cleaning process becomes simpler and more efficient, and all cached data on the request link is cleaned and updated in a unified way.
The various techniques described herein may be implemented in connection with hardware or software or, alternatively, with a combination of both. Thus, the methods and apparatus of the present invention, or certain aspects or portions of the methods and apparatus of the present invention, may take the form of program code (i.e., instructions) embodied in tangible media, such as removable hard drives, U-drives, floppy diskettes, CD-ROMs, or any other machine-readable storage medium, wherein, when the program is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the invention.
In the case of program code execution on programmable computers, the computing device will generally include a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. The memory is configured to store the program code, and the processor is configured to perform the cache data cleaning method of the present invention in accordance with the instructions in the program code stored in the memory.
By way of example, and not limitation, readable media comprise readable storage media and communication media. The readable storage medium stores information such as computer readable instructions, data structures, program modules, or other data. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. Combinations of any of the above are also included within the scope of readable media.
In the description provided herein, algorithms and displays are not inherently related to any particular computer, virtual system, or other apparatus. Various general-purpose systems may also be used with the examples of the invention; the structure required to construct such a system is apparent from the description above. In addition, the present invention is not directed to any particular programming language. It should be appreciated that the teachings of the invention described herein may be implemented in a variety of programming languages, and the above description of specific languages is provided to disclose the enablement and best mode of the present invention.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects.
Those skilled in the art will appreciate that the modules or units or components of the devices in the examples disclosed herein may be arranged in a device as described in this embodiment, or alternatively may be located in one or more devices different from the devices in this example. The modules in the foregoing examples may be combined into one module or may be further divided into a plurality of sub-modules.
Those skilled in the art will appreciate that the modules in the apparatus of the embodiments may be adaptively changed and disposed in one or more apparatuses different from the embodiments. The modules or units or components of the embodiments may be combined into one module or unit or component and, furthermore, they may be divided into a plurality of sub-modules or sub-units or sub-components.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features but not others included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments.
Furthermore, some of the embodiments are described herein as methods or combinations of method elements that may be implemented by a processor of a computer system or by other means of performing the functions. Thus, a processor with the necessary instructions for implementing the described method or method element forms a means for implementing the method or method element. Furthermore, the elements of the apparatus embodiments described herein are examples of the following apparatus: the apparatus is for carrying out the functions performed by the elements for the purpose of carrying out the invention.
As used herein, unless otherwise specified, the use of the ordinal terms "first," "second," "third," etc., to describe a common object merely indicates that different instances of like objects are being referred to, and is not intended to imply that the objects so described must be in a given sequence, whether temporally, spatially, in ranking, or in any other manner.
While the invention has been described with respect to a limited number of embodiments, those skilled in the art, having benefit of the above description, will appreciate that other embodiments are contemplated within the scope of the invention as described herein. Furthermore, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter.

Claims (10)

1. A cache data cleaning method, executed in a computing device, the method comprising:
in response to a request for source data, determining whether the request carries a cache-cleaning flag;
if the cache-cleaning flag is carried, recording the cache-cleaning flag in a context used to process the request; and
for each processing function with a cache annotation that is invoked to process the request, deleting the cached data corresponding to the source data, and executing the processing function.
2. The method of claim 1, wherein deleting the cached data corresponding to the source data comprises:
determining whether the cache-cleaning flag exists in the context used to process the request; and
if the cache-cleaning flag exists, deleting the cached data corresponding to the source data.
3. The method of claim 1 or 2, wherein, for each processing function with a cache annotation that is invoked to process the request, deleting the cached data corresponding to the source data comprises:
determining, for each processing function invoked to process the request, whether the processing function carries a cache annotation;
if the cache annotation exists, determining whether the cache-cleaning flag exists in the context used to process the request; and
if the cache-cleaning flag exists, deleting the cached data corresponding to the source data.
4. The method of any of claims 1-3, wherein executing the processing function comprises:
determining whether a third-party service interface needs to be called; and
if so, adding the cache-cleaning flag to third-party request parameters, and calling the third-party service interface based on the third-party request parameters.
5. The method of any of claims 1-4, wherein the computing device is communicatively connected to a client and to a management background, respectively, and wherein, before responding to the request for source data, the method comprises:
receiving a request for source data sent by the client; and
receiving a request carrying a cache-cleaning flag sent by the management background.
6. The method of any of claims 1-5, wherein deleting the cached data corresponding to the source data comprises:
determining a cache key from the source data; and
deleting the cached data corresponding to the cache key.
7. The method of any of claims 1-6, further comprising:
adding a cache annotation in advance to each processing function used to process the request.
8. A cache data cleaning apparatus residing in a computing device, the apparatus comprising:
a judging module adapted to, in response to a request for source data, determine whether the request carries a cache-cleaning flag;
a recording module adapted to record the cache-cleaning flag, if carried, in a context used to process the request; and
an execution module adapted to, for each processing function with a cache annotation that is invoked to process the request, delete the cached data corresponding to the source data and execute the processing function.
9. A computing device, comprising:
at least one processor; and
a memory storing program instructions, wherein the program instructions are configured to be executed by the at least one processor and comprise instructions for performing the method of any of claims 1-7.
10. A readable storage medium storing program instructions which, when read and executed by a computing device, cause the computing device to perform the method of any of claims 1-7.
CN202310007415.0A 2023-01-04 2023-01-04 Cache data cleaning method and device, computing equipment and storage medium Pending CN116257181A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310007415.0A CN116257181A (en) 2023-01-04 2023-01-04 Cache data cleaning method and device, computing equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310007415.0A CN116257181A (en) 2023-01-04 2023-01-04 Cache data cleaning method and device, computing equipment and storage medium

Publications (1)

Publication Number Publication Date
CN116257181A (en) 2023-06-13

Family

ID=86685494

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310007415.0A Pending CN116257181A (en) 2023-01-04 2023-01-04 Cache data cleaning method and device, computing equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116257181A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination