CN112540811A - Cache data detection method and device, computer equipment and storage medium

Cache data detection method and device, computer equipment and storage medium

Info

Publication number
CN112540811A
Authority
CN
China
Prior art keywords
cache
target
key
cache data
request
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011479824.3A
Other languages
Chinese (zh)
Other versions
CN112540811B (en)
Inventor
陈志城
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Technology Shenzhen Co Ltd
Original Assignee
Ping An Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Technology Shenzhen Co Ltd filed Critical Ping An Technology Shenzhen Co Ltd
Priority to CN202011479824.3A priority Critical patent/CN112540811B/en
Publication of CN112540811A publication Critical patent/CN112540811A/en
Priority to PCT/CN2021/091713 priority patent/WO2022126984A1/en
Application granted granted Critical
Publication of CN112540811B publication Critical patent/CN112540811B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/448Execution paradigms, e.g. implementations of programming paradigms
    • G06F9/4482Procedural

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Memory System Of A Hierarchy Structure (AREA)

Abstract

The application relates to the technical field of data processing, and provides a cache data detection method, apparatus, computer device and storage medium. The method includes: when an input cache detection request is received, acquiring a cache detection script corresponding to the item information; running the cache detection script to obtain the js code of the target item; generating an abstract syntax tree corresponding to the js code; traversing all objects in the abstract syntax tree and acquiring the first keys of all first cache data contained in the js code; acquiring the second keys of the second cache data that has been subjected to cache clearing; taking each first key whose matching against the second keys fails as a third key; and acquiring and displaying the target cache data corresponding to the third keys. With the method and apparatus, target cache data that has not been cleared from the local cache can be quickly detected. The method and apparatus can also be applied to the field of block chains, where data such as the target cache data can be stored on a block chain.

Description

Cache data detection method and device, computer equipment and storage medium
Technical Field
The present application relates to the field of data processing technologies, and in particular, to a method and an apparatus for detecting cache data, a computer device, and a storage medium.
Background
Since HTML5 introduced localStorage (which may also be referred to as local storage), local storage has found widespread use in the front-end domain.
Local storage is mainly operated through js code and is used to store some of a project's frequently used and infrequently changed data, such as user information and static data, directly in the browser, so that it can conveniently be accessed in multiple places within the project. After a developer has used the project cache data in the local storage, if that cache data remains in the local browser permanently and is not cleared in time, problems such as project code errors may occur. When a user needs to detect whether uncleared project cache data exists in the local storage, how to quickly detect the project cache data that has not been cleared becomes an urgent problem to be solved.
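For illustration only, a minimal sketch of the kind of project js code in which this situation arises; the keys userInfo and staticDict are hypothetical:

```javascript
// Hypothetical project js code: cache frequently used, rarely changed data locally.
localStorage.setItem('userInfo', JSON.stringify({ id: 1001, name: 'demo' }));
localStorage.setItem('staticDict', JSON.stringify(['optionA', 'optionB']));

// Read the cached data back wherever the project needs it.
const userInfo = JSON.parse(localStorage.getItem('userInfo'));

// If localStorage.removeItem('userInfo') is never called anywhere in the project,
// the cache item persists in the browser permanently -- exactly the situation
// the detection method described below is meant to find.
```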
Disclosure of Invention
The application mainly aims to provide a cache data detection method, a cache data detection device, computer equipment and a storage medium, and aims to solve the technical problem of how to quickly detect item cache data which is not subjected to clearing processing in local storage in the prior art.
The application provides a detection method of cache data, which comprises the following steps:
judging whether an input cache detection request is received, wherein the cache detection request carries item information corresponding to a target item, and the cache detection request is a request for detecting cache data which is contained in a local storage, corresponds to the target item and has not been subjected to clearing processing;
if an input cache detection request is received, acquiring a pre-stored cache detection script corresponding to the item information based on the cache detection request;
running the cache detection script, and acquiring the js code of the target item based on the running logic of the cache detection script;
generating an abstract syntax tree corresponding to the js code;
traversing all objects in the abstract syntax tree, acquiring, through a preset first acquisition method, the first keys corresponding one-to-one to the first cache data contained in the js code, and storing all the first keys in a preset first array; and
acquiring, through a preset second acquisition method, the second keys corresponding one-to-one to the second cache data subjected to cache clearing processing in the js code, and storing all the second keys in a preset second array;
matching a designated key with each second key contained in the second array, respectively, to detect whether a target key identical to the designated key exists in the second array and obtain a corresponding matching result, wherein the designated key is any one of all the first keys contained in the first array, and the matching result comprises matching success or matching failure;
if the matching result is matching failure, judging that the designated cache data corresponding to the designated key has not been subjected to clearing processing;
and taking each first key in the first array whose matching result against the second keys in the second array is matching failure as a third key, and acquiring and displaying the target cache data corresponding to the third key.
Optionally, before the step of obtaining a pre-stored cache detection script corresponding to the item information based on the cache detection request, the method includes:
judging whether other task requests to be processed except the cache detection request exist or not;
if other task requests to be processed except the cache detection request exist, acquiring the request quantity of the other task requests;
judging whether the request quantity is larger than a preset quantity threshold value or not;
if the number of the requests is larger than the number threshold, screening out a specified number of target task requests from the other task requests according to a preset rule;
judging whether a specific task request with unadjustable processing time exists in the target task requests;
if the specific task request exists in the target task requests, removing the specific task request from the target task requests to obtain the removed target task requests;
and adjusting the removed target task request to a preset idle time period for processing, wherein the idle time period is different from the processing time period of the cache detection request.
Optionally, the step of screening out a specified number of target task requests from the other task requests according to a preset rule includes:
determining a resource consumption amount of each of the other task requests;
sequencing all the other task requests according to the sequence of the resource consumption from large to small to obtain a corresponding first sequencing result;
sequentially acquiring a plurality of first task requests with the same number as the specified number from other task requests ranked at the head in the first ranking result;
all the first task requests are taken as the target task requests.
Optionally, the step of screening out a specified number of target task requests from the other task requests according to a preset rule includes:
acquiring the processing priority of each other task request based on a preset task request priority table;
sequencing all the other task requests according to the sequence of the processing priorities from low to high to obtain a corresponding second sequencing result;
sequentially acquiring a plurality of second task requests with the same number as the specified number from other task requests ranked at the head in the second sequencing result;
and taking the second task request as the target task request.
Optionally, after the step of taking the first key in the first array whose matching result against the second keys in the second array is matching failure as a third key and acquiring and displaying the target cache data corresponding to the third key, the method includes:
acquiring the write-in time of the target cache data; and
acquiring the expiration duration of the target cache data;
calculating an expiration time point corresponding to the target cache data based on the write-in time and the expiration duration;
acquiring current time, and judging whether the current time exceeds the expiration time point;
if the current time exceeds the expiration time point, clearing the target cache data;
if the current time does not exceed the expiration time point, calculating the difference value between the expiration time point and the current time;
and generating corresponding overdue reminding information based on the target cache data and the difference value, and displaying the overdue reminding information.
Optionally, before the step of obtaining a pre-stored cache detection script corresponding to the item information based on the cache detection request, the method includes:
acquiring a pre-stored detection script template based on the cache detection request;
analyzing the cache detection request, and extracting the project information;
filling the detection script template by using the project information to obtain a filled detection script template;
and taking the processed detection script template as the cache detection script.
Optionally, after the step of taking the first key in the first array whose matching result against the second keys in the second array is matching failure as a third key and acquiring and displaying the target cache data corresponding to the third key, the method includes:
generating alarm information corresponding to the target cache data based on the target cache data;
acquiring preset mail login information and acquiring a preset mail address;
logging in to a corresponding mail server based on the mail login information;
and sending the alarm information to the preset mail address through the mail server.
The present application further provides a detection apparatus for cache data, including:
the first judgment module is used for judging whether an input cache detection request is received, wherein the cache detection request carries item information corresponding to a target item, and the cache detection request is a request for detecting cache data which is contained in a local storage, corresponds to the target item and has not been subjected to clearing processing;
the first obtaining module is used for obtaining a pre-stored cache detection script corresponding to the item information based on an input cache detection request if the input cache detection request is received;
the second obtaining module is used for running the cache detection script and obtaining the js code of the target item based on the running logic of the cache detection script;
the first generation module is used for generating an abstract syntax tree corresponding to the js code;
the first storage module is used for traversing all objects in the abstract syntax tree, acquiring, through a preset first acquisition method, the first keys corresponding one-to-one to the first cache data contained in the js code, and storing all the first keys in a preset first array; and
the second storage module is used for acquiring, through a preset second acquisition method, the second keys corresponding one-to-one to the second cache data subjected to cache clearing processing in the js code, and storing all the second keys in a preset second array;
the processing module is configured to perform matching processing on an assigned key and each second key included in the second array respectively to detect whether a target key identical to the assigned key exists in the second array, so as to obtain a corresponding matching result, where the assigned key is any one of all first keys included in the first array, and the matching result includes a matching success or a matching failure;
the judging module is used for judging that the designated cache data corresponding to the designated key is not cleared if the matching result is matching failure;
and the display module is used for taking the first key in the first array whose matching result against the second keys in the second array is matching failure as a third key, and acquiring and displaying the target cache data corresponding to the third key.
The present application further provides a computer device, comprising a memory and a processor, wherein the memory stores a computer program, and the processor implements the steps of the above method when executing the computer program.
The present application also provides a computer-readable storage medium having stored thereon a computer program which, when being executed by a processor, carries out the steps of the above-mentioned method.
The detection method, the detection device, the computer equipment and the storage medium for the cache data have the following beneficial effects:
according to the detection method, the detection device, the computer equipment and the storage medium for the cache data, after the input cache detection request is received, the detection script corresponding to the item information carried in the cache detection request can be obtained and executed. And then based on the execution logic of the detection script, generating a corresponding abstract syntax tree by using js codes of the target item, and further, according to the abstract syntax tree, rapidly acquiring first keys corresponding to first cache data contained in the js codes one by adopting a corresponding acquisition method, and acquiring second keys corresponding to second cache data subjected to cache removal processing in the js codes one by one. And finally, screening a third key with the matching result between the first key and the second key being the matching failure from the first key, wherein the extracted target cache data corresponding to the third key is the item cache data which is not subjected to the clearing processing in the local storage. According to the cache detection method and device, the js codes corresponding to the target items are analyzed through the abstract syntax tree, then the cache detection script is used for achieving automatic processing of cache detection, cache data which are not subjected to clearing processing in the local cache can be detected quickly, and the detection efficiency of the cache data is improved. In addition, by displaying the target cache data, a related user can timely know the target cache data which is not cleared in the local storage, and then timely clearing processing can be subsequently performed on the overdue cache data in the target cache data contained in the local storage according to actual conditions, so that program bug caused by the problem that the cache is not cleared can be reduced, and the quality of the project code can be improved.
Drawings
Fig. 1 is a schematic flowchart of a method for detecting cache data according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of a device for detecting cache data according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of a computer device according to an embodiment of the present application.
The implementation, functional features and advantages of the objectives of the present application will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood by those skilled in the art that, unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the prior art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
Referring to fig. 1, a method for detecting cache data according to an embodiment of the present application includes:
S1: judging whether an input cache detection request is received, wherein the cache detection request carries item information corresponding to a target item, and the cache detection request is a request for detecting cache data which is contained in a local storage, corresponds to the target item and has not been subjected to clearing processing;
S2: if an input cache detection request is received, acquiring a pre-stored cache detection script corresponding to the item information based on the cache detection request;
S3: running the cache detection script, and acquiring the js code of the target item based on the running logic of the cache detection script;
S4: generating an abstract syntax tree corresponding to the js code;
S5: traversing all objects in the abstract syntax tree, acquiring, through a preset first acquisition method, the first keys corresponding one-to-one to the first cache data contained in the js code, and storing all the first keys in a preset first array; and
S6: acquiring, through a preset second acquisition method, the second keys corresponding one-to-one to the second cache data subjected to cache clearing processing in the js code, and storing all the second keys in a preset second array;
S7: matching a designated key with each second key contained in the second array, respectively, to detect whether a target key identical to the designated key exists in the second array and obtain a corresponding matching result, wherein the designated key is any one of all the first keys contained in the first array, and the matching result comprises matching success or matching failure;
S8: if the matching result is matching failure, judging that the designated cache data corresponding to the designated key has not been subjected to clearing processing;
S9: and taking each first key in the first array whose matching result against the second keys in the second array is matching failure as a third key, and acquiring and displaying the target cache data corresponding to the third key.
When the browser uses local storage to cache data related to a project, the data is stored in key/value pairs. A key/value pair may be referred to as a piece of cache data, or as a cache item; for example, "key1-value1" is one cache item and "key2-value2" is another cache item. For a key/value cache item, the key may be called the index key and can be used to retrieve the value; the value may be referred to as the data object, that is, the data actually stored to the local storage by the client. For example, if the stored data is 1001 and a client needs to acquire the data 1001, the key corresponding to that data can be used for retrieval.
As described in the above steps S1-S9, the execution body of this method embodiment is a cache data detection apparatus. In practical applications, the cache data detection apparatus may be implemented by a virtual device, such as software code, or by a physical device in which the relevant execution code is written or integrated, and it may perform human-computer interaction with a user through a keyboard, a mouse, a remote controller, a touch panel or a voice control device. The cache data detection apparatus may be a browser. The cache data detection apparatus in this embodiment can quickly detect cache data that has not been cleared from the local cache. Specifically, it is first judged whether an input cache detection request is received, where the cache detection request carries item information corresponding to a target item, the target item may be any business project, and the item information may specifically include the item name of the target item. The cache detection request is a request for detecting item cache data, corresponding to the target item, that is contained in the browser's local storage and has not been subjected to clearing processing. If an input cache detection request is received, a pre-stored cache detection script corresponding to the item information is acquired based on the cache detection request. The item information in the cache detection request can be parsed out and then filled into a preset detection script template to generate the cache detection script. The cache detection script is then run, and the js code of the target item is acquired based on the running logic of the cache detection script. The js code associated with the target item may be retrieved by querying an associated code repository, such as GitHub. After the js code is obtained, an abstract syntax tree corresponding to the js code is generated. An abstract syntax tree (AST), or syntax tree for short, is an abstract representation of the syntactic structure of source code: it represents the syntactic structure of a programming language in the form of a tree, and each node on the tree represents a structure in the source code. The abstract syntax tree corresponding to the js code may be generated with a related generation tool, for example the esprima tool. All objects in the abstract syntax tree are then traversed, the first keys corresponding one-to-one to the first cache data contained in the js code are acquired through a preset first acquisition method, and all the first keys are stored in a preset first array. The abstract syntax tree is an array formed by a number of objects, with each JavaScript statement broken down into such objects; analyzing the js code with the abstract syntax tree allows the cache data contained in the whole js code to be retrieved comprehensively, so that all cache data used in the js code can be quickly identified without omissions. The first acquisition method is the setItem() method: by locating setItem() calls, the first keys corresponding one-to-one to the first cache data contained in the js code can be obtained, so that the cache data in the js code that has not been cleared can later be detected quickly from these first keys. In addition, the preset first array initially contains no data.
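As a minimal sketch of the above steps, assuming the detection script runs under Node.js and builds the abstract syntax tree with the esprima tool mentioned above; the helper names walk and collectSetItemKeys are illustrative and not taken from the application:

```javascript
const esprima = require('esprima');

// Recursively visit every object (node) in the abstract syntax tree.
function walk(node, visit) {
  if (!node || typeof node !== 'object') return;
  visit(node);
  for (const key of Object.keys(node)) {
    const child = node[key];
    if (Array.isArray(child)) {
      child.forEach((c) => walk(c, visit));
    } else if (child && typeof child.type === 'string') {
      walk(child, visit);
    }
  }
}

// "First acquisition method": collect the first key of every localStorage.setItem(...)
// call found in the js code and store all of them in the (initially empty) first array.
function collectSetItemKeys(jsCode) {
  const ast = esprima.parseScript(jsCode);
  const firstArray = [];
  walk(ast, (node) => {
    if (node.type === 'CallExpression' &&
        node.callee.type === 'MemberExpression' &&
        node.callee.object.name === 'localStorage' &&
        node.callee.property.name === 'setItem' &&
        node.arguments[0] && node.arguments[0].type === 'Literal') {
      firstArray.push(node.arguments[0].value);
    }
  });
  return firstArray;
}
```

The same traversal can be reused for the second acquisition method described next.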
The second keys corresponding one-to-one to the second cache data subjected to cache clearing processing in the js code are acquired through a preset second acquisition method, and all the second keys are stored in a preset second array. The second acquisition method is the removeItem() method: by analyzing the js code of the target item on the basis of the abstract syntax tree, whether each piece of cache data used in the js code has been subjected to clearing processing can be quickly identified, and the second keys corresponding to the cache data that has been cleared in the js code can be extracted, so that the cache data that has not been cleared can later be detected quickly by comparison against these second keys. A designated key is then matched against each second key contained in the second array, respectively, to detect whether a target key identical to the designated key exists in the second array and obtain a corresponding matching result, where the designated key is any one of all the first keys contained in the first array, the matching result comprises matching success or matching failure, and the preset second array initially contains no data. In addition, whether the designated key is identical to any one of the second keys contained in the second array may be compared on the basis of a parallel data comparison instruction. The parallel data comparison instruction may be a single instruction stream, multiple data stream (SIMD) instruction; using the parallel computing capability of such an instruction to match the designated key against the second keys contained in the second array at the same time helps to further increase the processing rate of the data matching and speed up the generation of the matching result. If the matching result is matching failure, it is judged that the designated cache data corresponding to the designated key has not been subjected to clearing processing. That is, if the designated key exists only in the first array and not in the second array, the designated cache data corresponding to the designated key is judged not to have been cleared; only when the designated key exists in both the first array and the second array can the designated cache data corresponding to the designated key be judged to be data that has been cleared. Finally, the first keys in the first array whose matching result against the second keys in the second array is matching failure are taken as third keys, and the target cache data corresponding to the third keys is acquired and displayed. A matching result of matching failure means that, for a third key contained in the first array, no key identical to the third key exists among all the second keys contained in the second array. In this embodiment, the js code corresponding to the target item is analyzed with the abstract syntax tree and the cache detection script is used to automate the cache detection, so that the cache data which has not been cleared from the local cache can be detected quickly and the detection efficiency of the cache data is improved.
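Continuing the sketch above and reusing the hypothetical walk helper, the second acquisition method mirrors the first for localStorage.removeItem(...) calls, and the matching step keeps the first keys for which no identical second key exists:

```javascript
// "Second acquisition method": collect the key of every localStorage.removeItem(...)
// call and store all of them in the (initially empty) second array.
function collectRemoveItemKeys(jsCode) {
  const ast = esprima.parseScript(jsCode);
  const secondArray = [];
  walk(ast, (node) => {
    if (node.type === 'CallExpression' &&
        node.callee.type === 'MemberExpression' &&
        node.callee.object.name === 'localStorage' &&
        node.callee.property.name === 'removeItem' &&
        node.arguments[0] && node.arguments[0].type === 'Literal') {
      secondArray.push(node.arguments[0].value);
    }
  });
  return secondArray;
}

// Match each first key against the second array; a matching failure means the
// corresponding cache data was written but never cleared (the "third keys").
function findUnclearedKeys(firstArray, secondArray) {
  const cleared = new Set(secondArray);
  return firstArray.filter((key) => !cleared.has(key));
}
```

A Set lookup is used here for brevity; as noted above, the per-key comparisons could instead be carried out with a parallel data comparison (SIMD) instruction to speed up the matching.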
In addition, by displaying the target cache data, a related user can learn in time which target cache data has not been cleared from the local storage, and can subsequently clear the expired cache data among the target cache data in the local storage in time according to the actual situation, thereby reducing program bugs caused by uncleared caches and improving the quality of the project code.
Further, in an embodiment of the present application, before the step S2, the method includes:
S200: judging whether other task requests to be processed except the cache detection request exist or not;
S201: if other task requests to be processed except the cache detection request exist, acquiring the request quantity of the other task requests;
S202: judging whether the request quantity is larger than a preset quantity threshold value or not;
S203: if the request quantity is larger than the quantity threshold value, screening out a specified number of target task requests from the other task requests according to a preset rule;
S204: judging whether a specific task request with unadjustable processing time exists in the target task requests;
S205: if the specific task request exists in the target task requests, removing the specific task request from the target task requests to obtain the removed target task requests;
S206: and adjusting the removed target task requests to a preset idle time period for processing, wherein the idle time period is different from the processing time period of the cache detection request.
As described in the foregoing steps S200 to S206, while the cache detection request is being processed, other task requests that are currently awaiting processing are intelligently allocated to a time period different from the current one, so that these other task requests do not affect the normal processing of the current cache detection request and the normal processing of the cache detection request is effectively ensured. Specifically, before the step of acquiring the pre-stored cache detection script corresponding to the item information based on the cache detection request, the following may be performed. First, it is judged whether other task requests to be processed except the cache detection request exist. If other task requests to be processed except the cache detection request exist, the request quantity of the other task requests is acquired. It is then judged whether the request quantity is larger than a preset quantity threshold. The quantity threshold is not specifically limited and may be set according to actual requirements; for example, it may be generated from an empirical value obtained by statistically analyzing the relevant historical data. If the request quantity is larger than the quantity threshold, a specified number of target task requests are screened out from the other task requests according to a preset rule. The specified number is not specifically limited, may be determined according to actual requirements, and is not larger than the number of the other task requests. The preset rule is likewise not specifically limited; for example, it may be to select, based on the resource consumption of each other task request, the specified number of task requests with the largest processing resource consumption as the target task requests, or to screen out, based on the processing priority of each other task request, the specified number of task requests with the lowest processing priority as the target task requests. It is then judged whether a specific task request with unadjustable processing time exists in the target task requests. Whether a task request belongs to such a specific task request can be judged based on the task attributes of the task request: a corresponding rule can be set in advance, and if the task attributes of the task request meet a preset condition, the task request is determined to be a specific task request whose processing time cannot be adjusted. The preset condition may be set according to actual requirements; for example, it may be that the service to which the task request belongs is a preset specified service, or that the task request belongs to a preset specified service type. If such a specific task request exists in the target task requests, the specific task request is removed from the target task requests to obtain the removed target task requests. Finally, the removed target task requests are adjusted to a preset idle time period for processing, where the idle time period is different from the processing time period of the cache detection request.
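A minimal sketch of this scheduling step, assuming each pending task request is represented as an object with an adjustable flag, and that screenTargets implements one of the preset rules detailed below; the threshold, the specified number and all field names are illustrative assumptions:

```javascript
const QUANTITY_THRESHOLD = 5; // preset quantity threshold (assumed value)
const SPECIFIED_NUMBER = 3;   // specified number of target task requests (assumed value)

function rescheduleOtherTasks(otherTaskRequests, screenTargets, idlePeriod) {
  // Nothing to reschedule if the request quantity does not exceed the threshold.
  if (otherTaskRequests.length <= QUANTITY_THRESHOLD) return [];

  // Screen out the specified number of target task requests by the preset rule.
  const targetRequests = screenTargets(otherTaskRequests, SPECIFIED_NUMBER);

  // Remove specific task requests whose processing time cannot be adjusted.
  const removedTargets = targetRequests.filter((req) => req.adjustable !== false);

  // Adjust the remaining target task requests to the preset idle time period.
  removedTargets.forEach((req) => { req.scheduledPeriod = idlePeriod; });
  return removedTargets;
}
```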
Further, in an embodiment of the present application, the step S203 includes:
S2030: determining a resource consumption amount of each of the other task requests;
S2031: sequencing all the other task requests according to the sequence of the resource consumption from large to small to obtain a corresponding first sequencing result;
S2032: sequentially acquiring a plurality of first task requests with the same number as the specified number from other task requests ranked at the head in the first sequencing result;
S2033: all the first task requests are taken as the target task requests.
As described in the foregoing steps S2030 to S2033, the step of screening the specified number of target task requests from the other task requests according to the preset rule may specifically include: the resource consumption of each of the other task requests is first determined. The resource consumption is memory space, CPU, or traffic that is required to be consumed when a task request is executed. Specifically, the historical resource consumption amount of each of the other task requests specified in the preset time period may be queried based on the resource consumption statistical data, and an average of all the specified historical resource consumption amounts is calculated as the resource consumption amount corresponding to the specified other task request, where the specified other task request is any one of all the other task requests. And then sequencing all the other task requests according to the sequence of the resource consumption from large to small to obtain a corresponding first sequencing result. And then, a plurality of first task requests with the same number as the specified number are sequentially acquired from other task requests arranged at the head in the first sequencing result. And finally, taking all the first task requests as the target task requests. In the embodiment, the first task request with a large resource consumption is extracted from all other task requests based on the resource consumption to serve as the target task request, so that the task request with the large resource consumption is subsequently and intelligently adjusted to the idle time different from the current time period for processing, the influence of the part of task requests on the normal processing of the current cache detection request can be avoided, the normal processing of the cache detection request is effectively ensured, and the intelligence of the processing of the cache detection request is improved.
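A minimal sketch of the resource-consumption rule, assuming each task request carries a resourceConsumption value computed from its historical average; the field name is an illustrative assumption:

```javascript
// Sort the other task requests by resource consumption, largest first, and take
// the first `specifiedNumber` of them as the target task requests.
function screenByResourceConsumption(otherTaskRequests, specifiedNumber) {
  return [...otherTaskRequests]
    .sort((a, b) => b.resourceConsumption - a.resourceConsumption)
    .slice(0, specifiedNumber);
}
```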
Further, in an embodiment of the present application, the step S203 includes:
S2034: acquiring the processing priority of each other task request based on a preset task request priority table;
S2035: sequencing all the other task requests according to the sequence of the processing priorities from low to high to obtain a corresponding second sequencing result;
S2036: sequentially acquiring a plurality of second task requests with the same number as the specified number from other task requests ranked at the head in the second sequencing result;
S2037: and taking the second task request as the target task request.
As described in steps S2034 to S2037, the step of screening out a specified number of target task requests from the other task requests according to a preset rule includes: and acquiring the processing priority of each other task request based on a preset task request priority table. The task request priority table records request identifiers of different task requests and processing priority sequence numbers corresponding to the request identifiers one to one. The smaller the processing priority sequence number corresponding to the request identifier is, the higher the processing priority of the task request corresponding to the request identifier is. Sequencing all the other task requests according to the sequence of the processing priorities from low to high to obtain a corresponding second sequencing result; sequentially acquiring a plurality of second task requests with the same number as the specified number from other task requests ranked at the head in the second sequencing result; and taking the second task request as the target task request. In the embodiment, the second task request with the lower processing priority is extracted from all other task requests based on the processing priority to serve as the target task request, so that the task request with the lower processing priority is intelligently adjusted to the idle time different from the current time period for processing in the subsequent process, the influence of the part of task requests on the normal processing of the current cache detection request can be avoided, the normal processing of the cache detection request is effectively ensured, and the intelligence of the cache detection request processing is improved.
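A minimal sketch of the priority rule, assuming the preset task request priority table is an object that maps each request identifier to its processing priority sequence number, a smaller number meaning a higher priority; the shape of the table is an illustrative assumption:

```javascript
// Sort the other task requests from lowest to highest processing priority
// (i.e. largest sequence number first) and take the first `specifiedNumber`.
function screenByPriority(otherTaskRequests, priorityTable, specifiedNumber) {
  return [...otherTaskRequests]
    .sort((a, b) => priorityTable[b.id] - priorityTable[a.id])
    .slice(0, specifiedNumber);
}
```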
Further, in an embodiment of the present application, after the step S9, the method includes:
S900: acquiring the write-in time of the target cache data; and
S901: acquiring the expiration duration of the target cache data;
S902: calculating an expiration time point corresponding to the target cache data based on the write-in time and the expiration duration;
S903: acquiring the current time, and judging whether the current time exceeds the expiration time point;
S904: if the current time exceeds the expiration time point, clearing the target cache data;
S905: if the current time does not exceed the expiration time point, calculating the difference value between the expiration time point and the current time;
S906: and generating corresponding overdue reminding information based on the target cache data and the difference value, and displaying the overdue reminding information.
As described in the above steps S900 to S906, if the time for which cache data has been stored in the local storage exceeds its specified lifetime, the cache data can no longer be accessed, i.e. the cache data is in an expired state. The clearing operation performed on cache data in the expired state may be referred to as life-cycle management of the cache data. After the step of taking the first key in the first array whose matching result against the second keys in the second array is matching failure as a third key and acquiring and displaying the target cache data corresponding to the third key, corresponding life-cycle management can be performed reasonably on the target cache data. Specifically, the write-in time of the target cache data and the expiration duration of the target cache data are first acquired. The write-in time is the time at which the target cache data was first stored in the local storage. The expiration duration is the length of the lifetime of the target cache data; if the target cache data has been stored in the local storage for longer than the expiration duration, the target cache data is in an expired state, i.e. it no longer has any actual use value. An expiration time point corresponding to the target cache data is then calculated based on the write-in time and the expiration duration: the sum of the write-in time and the expiration duration is calculated, and the time corresponding to this sum is the expiration time point. The current time is then acquired, and it is judged whether the current time exceeds the expiration time point. If the current time exceeds the expiration time point, the target cache data is cleared. If the current time does not exceed the expiration time point, the difference between the expiration time point and the current time is calculated, and corresponding overdue reminding information is generated based on the target cache data and the difference and then displayed. The overdue reminding information at least comprises the target cache data and the difference value. In this way, after the uncleared target cache data is detected, the target cache data can be cleared intelligently once it is judged to be in the expired state, which effectively saves precious local storage resources and prevents data problems or even project code errors in subsequent links caused by expired cache data not being cleared in time. In addition, if the target cache data is currently in the unexpired state, its remaining effective lifetime is calculated and corresponding overdue reminding information is generated, so that the related user can clearly learn the current lifetime information of the target cache data from the overdue reminding information and perform the subsequent clearing of the target cache data in time.
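A minimal sketch of this life-cycle check, assuming it runs in the browser where localStorage is available, the write-in time is a millisecond timestamp and the expiration duration is a length in milliseconds stored alongside the cache item; the function and field names are illustrative:

```javascript
function checkExpiration(targetKey, writeTime, expirationDuration) {
  // Expiration time point = write-in time + expiration duration.
  const expirationPoint = writeTime + expirationDuration;
  const now = Date.now();

  if (now > expirationPoint) {
    // The target cache data has expired, so clear it from the local storage.
    localStorage.removeItem(targetKey);
    return { expired: true };
  }

  // Otherwise compute the remaining lifetime and build the overdue reminder.
  const difference = expirationPoint - now;
  return {
    expired: false,
    reminder: `Cache item "${targetKey}" expires in ${Math.round(difference / 1000)} s`,
  };
}
```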
Further, in an embodiment of the present application, before the step S2, the method includes:
S210: acquiring a pre-stored detection script template based on the cache detection request;
S211: analyzing the cache detection request, and extracting the project information;
S212: filling the detection script template by using the project information to obtain a filled detection script template;
S213: and taking the processed detection script template as the cache detection script.
As described in steps S210 to S213, before the step of acquiring the pre-stored cache detection script corresponding to the item information based on the cache detection request, a process of generating the cache detection script may be included. Specifically, a pre-stored detection script template is first acquired based on the cache detection request. The detection script template corresponding to cache detection is preset for handling cache detection requests; it can be written by developers according to the actual cache detection requirements, and the variable parameters corresponding to the item fields in the detection script template are left in a to-be-filled state. The cache detection request is then analyzed to extract the item information. The detection script template is then filled with the item information to obtain the filled detection script template. The detection script template contains the detection code used to perform item cache data detection; the information filling positions corresponding to the item fields in the detection script template are determined first, and the item information is then filled into those positions to generate the cache detection script corresponding to the target item. In this way, the item variable parameters corresponding to different items can be filled into the detection script template to generate cache detection scripts corresponding to the different items. Finally, the filled detection script template is taken as the cache detection script. In this embodiment, automating the cache detection with the detection script effectively saves developers' manual detection time and improves the processing efficiency of cache data detection.
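A minimal sketch of the template-filling step, assuming the detection script template marks the variable parameter of the item field with a {{projectName}} placeholder; the placeholder syntax and names are illustrative assumptions:

```javascript
// Fill the item information into the information filling position of the template.
function buildCacheDetectionScript(template, itemInfo) {
  return template.replace(/\{\{projectName\}\}/g, itemInfo.projectName);
}

// Usage: the filled template is then taken as the cache detection script.
const cacheDetectionScript = buildCacheDetectionScript(
  'const targetItem = "{{projectName}}"; /* detection code ... */',
  { projectName: 'demo-project' }
);
```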
Further, in an embodiment of the present application, after the step S9, the method includes:
S910: generating alarm information corresponding to the target cache data based on the target cache data;
S911: acquiring preset mail login information and acquiring a preset mail address;
S912: logging in to a corresponding mail server based on the mail login information;
S913: and sending the alarm information to the preset mail address through the mail server.
As described in the foregoing steps S910 to S913, after the step of taking the first key in the first array whose matching result against the second keys in the second array is matching failure as a third key and acquiring and displaying the target cache data corresponding to the third key, a process of generating alarm information corresponding to the target cache data may be included. Specifically, alarm information corresponding to the target cache data is first generated based on the target cache data. The alarm information at least contains the target cache data, and the target cache data can be filled into a pre-created alarm information template to generate the alarm information; the alarm information template is written by developers according to actual use requirements. The preset mail login information and the preset mail address are then acquired. The mail login information is the information used to log in to the mail server, and the preset mail address is the mail address of the related user who is to receive the alarm information. The corresponding mail server is then logged in to based on the mail login information. Finally, the alarm information is sent to the preset mail address through the mail server. In this embodiment, the alarm information corresponding to the target cache data is sent to the preset mail address through the mail server, so that the related user can learn in time which target cache data of the target item has not been cleared, and can subsequently clear the expired cache data among the target cache data in the local storage in time according to the actual situation, thereby effectively saving local storage resources and preventing data problems or even project code errors in subsequent links caused by expired cache data not being cleared in time.
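A minimal sketch of the mail alert, assuming the widely used nodemailer package handles the SMTP login and delivery (the application does not name a specific mail library); the host, credentials and addresses are placeholders:

```javascript
const nodemailer = require('nodemailer');

async function sendCacheAlarm(targetCacheData, mailLogin, presetAddress) {
  // Log in to the corresponding mail server with the preset mail login information.
  const transporter = nodemailer.createTransport({
    host: mailLogin.host,           // e.g. 'smtp.example.com' (placeholder)
    port: mailLogin.port,           // e.g. 465
    secure: true,
    auth: { user: mailLogin.user, pass: mailLogin.pass },
  });

  // Send the alarm information, which at least contains the target cache data,
  // to the preset mail address.
  await transporter.sendMail({
    from: mailLogin.user,
    to: presetAddress,
    subject: 'Uncleared cache data detected',
    text: `Cache items not cleared: ${JSON.stringify(targetCacheData)}`,
  });
}
```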
The cache data detection method in the embodiments of the present application may also be applied to the field of block chains, for example by storing data such as the target cache data on a block chain. Using a block chain to store and manage the target cache data can effectively ensure the security and tamper resistance of the target cache data.
The block chain is a novel application mode of computer technologies such as distributed data storage, point-to-point transmission, a consensus mechanism and an encryption algorithm. A block chain (Blockchain), which is essentially a decentralized database, is a series of data blocks associated by using a cryptographic method, and each data block contains information of a batch of network transactions, so as to verify the validity (anti-counterfeiting) of the information and generate a next block. The blockchain may include a blockchain underlying platform, a platform product service layer, an application service layer, and the like.
The block chain underlying platform can comprise processing modules such as user management, basic service, intelligent contract and operation monitoring. The user management module is responsible for identity information management of all blockchain participants, and comprises public and private key generation maintenance (account management), key management, user real identity and blockchain address corresponding relation maintenance (authority management) and the like, and under the authorization condition, the user management module supervises and audits the transaction condition of certain real identities and provides rule configuration (wind control audit) of risk control; the basic service module is deployed on all block chain node equipment and used for verifying the validity of the service request, recording the service request to storage after consensus on the valid request is completed, for a new service request, the basic service firstly performs interface adaptation analysis and authentication processing (interface adaptation), then encrypts service information (consensus management) through a consensus algorithm, transmits the service information to a shared account (network communication) completely and consistently after encryption, and performs recording and storage; the intelligent contract module is responsible for registering and issuing contracts, triggering the contracts and executing the contracts, developers can define contract logics through a certain programming language, issue the contract logics to a block chain (contract registration), call keys or other event triggering and executing according to the logics of contract clauses, complete the contract logics and simultaneously provide the function of upgrading and canceling the contracts; the operation monitoring module is mainly responsible for deployment, configuration modification, contract setting, cloud adaptation in the product release process and visual output of real-time states in product operation, such as: alarm, monitoring network conditions, monitoring node equipment health status, and the like.
Referring to fig. 2, an embodiment of the present application further provides a device for detecting cache data, including:
a first judging module 1, configured to judge whether an input cache detection request is received, where the cache detection request carries item information corresponding to a target item, and the cache detection request is a request for detecting cache data, which is not subjected to a clearing process and is included in a local storage, and corresponds to the target item;
a first obtaining module 2, configured to, if an input cache detection request is received, obtain, based on the cache detection request, a pre-stored cache detection script corresponding to the item information;
the second obtaining module 3 is configured to run the cache detection script, and obtain a js code of the target item based on a running logic of the cache detection script;
a first generation module 4, configured to generate an abstract syntax tree corresponding to the js code;
the first storage module 5 is configured to traverse all objects in the abstract syntax tree, acquire, through a preset first acquisition method, the first keys corresponding one-to-one to the first cache data contained in the js code, and store all the first keys in a preset first array; and
the second storage module 6 is configured to acquire, through a preset second acquisition method, the second keys corresponding one-to-one to the second cache data subjected to cache clearing processing in the js code, and store all the second keys in a preset second array;
the processing module 7 is configured to perform matching processing on an assigned key and each second key included in the second array respectively to detect whether a target key identical to the assigned key exists in the second array, so as to obtain a corresponding matching result, where the assigned key is any one of all first keys included in the first array, and the matching result includes a matching success or a matching failure;
the judging module 8 is configured to judge that the designated cache data corresponding to the designated key is not cleared if the matching result is a matching failure;
and the display module 9 is configured to take the first key in the first array whose matching result against the second keys in the second array is matching failure as a third key, and to acquire and display the target cache data corresponding to the third key.
In this embodiment, the implementation processes of the functions and roles of the first judgment module, the first obtaining module, the second obtaining module, the first generation module, the first storage module, the second storage module, the processing module, the judging module and the display module in the detection apparatus for cached data are specifically detailed in the implementation processes corresponding to steps S1 to S9 in the detection method for cached data, and are not described herein again.
Further, in an embodiment of the present application, the apparatus for detecting cache data includes:
the second judgment module is used for judging whether other task requests to be processed except the cache detection request exist or not;
a third obtaining module, configured to obtain, if there are other task requests to be processed other than the cache detection request, a request number of the other task requests;
the third judging module is used for judging whether the request quantity is greater than a preset quantity threshold value;
the screening module is used for screening a specified number of target task requests from the other task requests according to a preset rule if the number of the requests is greater than the number threshold;
the fourth judging module is used for judging whether a specific task request with unadjustable processing time exists in the target task;
the removing module is used for removing the specific task from the target task request to obtain a removed target task request if the specific task request exists in the target task;
and the adjusting module is used for adjusting the removed target task request to a preset idle time period for processing, wherein the idle time period is different from the processing time period of the cache detection request.
In this embodiment, the implementation processes of the functions and roles of the second judgment module, the third obtaining module, the third judging module, the screening module, the fourth judging module, the removing module and the adjusting module in the detection apparatus for cache data are specifically detailed in the implementation processes corresponding to steps S200 to S206 in the detection method for cache data, and are not described herein again.
Further, in an embodiment of the present application, the screening module includes:
a first determining unit configured to determine a resource consumption amount requested by each of the other tasks;
the first sequencing unit is used for sequencing all the other task requests according to the sequence of the resource consumption from large to small to obtain a corresponding first sequencing result;
a first obtaining unit, configured to sequentially obtain a plurality of first task requests of which the number is the same as the specified number, starting with other task requests that are ranked first in the first ranking result;
a second determining unit, configured to use all the first task requests as the target task requests.
In this embodiment, the implementation processes of the functions and roles of the first determining unit, the first sequencing unit, the first obtaining unit and the second determining unit in the detection apparatus for cache data are specifically detailed in the implementation processes corresponding to steps S2030 to S2033 in the detection method for cache data, and are not described herein again.
Further, in an embodiment of the present application, the screening module includes:
the second acquiring unit is used for acquiring the processing priority of each other task request based on a preset task request priority table;
the second sequencing unit is used for sequencing all the other task requests according to the sequence of the processing priorities from low to high to obtain a corresponding second sequencing result;
a third obtaining unit, configured to sequentially obtain, starting from other task requests ranked first in the second ranking result, a plurality of second task requests that are the same as the specified number;
a third determining unit, configured to use the second task request as the target task request.
In this embodiment, the implementation processes of the functions and roles of the second obtaining unit, the second sequencing unit, the third obtaining unit and the third determining unit in the detection apparatus for cache data are specifically detailed in the implementation processes corresponding to steps S2034 to S2037 in the detection method for cache data, and are not described herein again.
Further, in an embodiment of the present application, the apparatus for detecting cache data includes:
the fourth obtaining module is used for obtaining the writing time of the target cache data; and
a fifth obtaining module, configured to obtain an expiration duration of the target cache data;
a first calculating module, configured to calculate an expiration time point corresponding to the target cache data based on the writing time and the expiration duration;
a fifth judging module, configured to obtain a current time, and judge whether the current time exceeds the expiration time point;
a clearing module, configured to clear the target cache data if the current time exceeds the expiration time point;
the second calculation module is used for calculating the difference value between the expiration time point and the current time if the current time does not exceed the expiration time point;
and the second generation module is used for generating corresponding overdue reminding information based on the target cache data and the difference value and displaying the overdue reminding information.
In this embodiment, the implementation processes of the functions and effects of the fourth obtaining module, the fifth obtaining module, the first calculating module, the fifth judging module, the clearing module, the second calculating module and the second generating module in the detection apparatus for cache data are specifically detailed in the implementation processes corresponding to steps S900 to S906 in the detection method for cache data, and are not described herein again.
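A minimal sketch of the expiration handling described by these modules is given below, assuming the writing time is an epoch timestamp in milliseconds and the expiration duration is a lifetime in milliseconds; both units are assumptions of the example.

```typescript
// Assumed units: writeTime is an epoch timestamp in milliseconds and
// expirationDuration is a lifetime in milliseconds.
interface TargetCacheEntry {
  key: string;
  writeTime: number;
  expirationDuration: number;
}

type ExpiryDecision =
  | { action: "clear" }                        // already expired: clear the entry
  | { action: "remind"; remainingMs: number }; // not yet expired: remind with the difference

// Compute the expiration time point from the writing time and the expiration
// duration, then either clear the entry or report the remaining time.
function checkExpiry(entry: TargetCacheEntry, now: number = Date.now()): ExpiryDecision {
  const expirationPoint = entry.writeTime + entry.expirationDuration;
  if (now > expirationPoint) {
    return { action: "clear" };
  }
  return { action: "remind", remainingMs: expirationPoint - now };
}
```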
Further, in an embodiment of the present application, the apparatus for detecting cache data includes:
a sixth obtaining module, configured to obtain a pre-stored detection script template based on the cache detection request;
the extraction module is used for analyzing the cache detection request and extracting the project information;
the filling module is used for filling the detection script template by using the project information to obtain the filled detection script template;
and the determining module is used for taking the filled detection script template as the cache detection script.
In this embodiment, the implementation processes of the functions and effects of the sixth obtaining module, the extraction module, the filling module and the determining module in the detection apparatus for cache data are specifically detailed in the implementation processes corresponding to steps S210 to S213 in the detection method for cache data, and are not described herein again.
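For illustration only, the template filling could look like the following sketch, which assumes a "{{name}}" placeholder syntax in the stored detection script template and a hypothetical ProjectInfo shape; neither is prescribed by the embodiment.

```typescript
// Hypothetical project information fields extracted from the cache detection request.
interface ProjectInfo {
  projectName: string;
  entryPath: string;
}

// Fill a stored detection script template whose placeholders use an assumed
// "{{name}}" syntax with the extracted project information.
function fillDetectionScriptTemplate(template: string, info: ProjectInfo): string {
  return template
    .replace(/\{\{projectName\}\}/g, info.projectName)
    .replace(/\{\{entryPath\}\}/g, info.entryPath);
}

// Illustrative usage:
// const cacheDetectionScript = fillDetectionScriptTemplate(
//   "scan {{entryPath}} of project {{projectName}}",
//   { projectName: "demo-project", entryPath: "./dist" }
// );
```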
Further, in an embodiment of the present application, the apparatus for detecting cache data includes:
the third generation module is used for generating alarm information corresponding to the target cache data based on the target cache data;
a seventh obtaining module, configured to obtain preset mail login information and obtain a preset mail address;
the login module is used for logging in a corresponding mail server based on the mail login information;
and the sending module is used for sending the alarm information to the preset mail address through the mail server.
In this embodiment, the implementation processes of the functions and effects of the third generation module, the seventh obtaining module, the login module and the sending module in the detection apparatus for cache data are specifically detailed in the implementation processes corresponding to steps S910 to S913 in the detection method for cache data, and are not described herein again.
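A possible realization of the alarm mailing flow is sketched below using the nodemailer package as one example mail client; the host, credentials and recipient address shown are placeholder assumptions standing in for the preset mail login information and the preset mail address.

```typescript
import nodemailer from "nodemailer";

// Placeholder values standing in for the preset mail login information and the
// preset mail address; they are assumptions of this example only.
const mailLogin = { host: "smtp.example.com", port: 465, user: "alert@example.com", pass: "secret" };
const presetMailAddress = "ops-team@example.com";

// Log in to the corresponding mail server with the preset credentials and send
// the alarm information generated for the uncleared target cache data.
async function sendCacheAlarm(alarmInformation: string): Promise<void> {
  const transporter = nodemailer.createTransport({
    host: mailLogin.host,
    port: mailLogin.port,
    secure: true,
    auth: { user: mailLogin.user, pass: mailLogin.pass },
  });
  await transporter.sendMail({
    from: mailLogin.user,
    to: presetMailAddress,
    subject: "Uncleared cache data detected",
    text: alarmInformation,
  });
}
```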
Referring to fig. 3, an embodiment of the present application further provides a computer device, which may be a server and whose internal structure may be as shown in fig. 3. The computer device comprises a processor, a memory, a network interface, a display screen, an input device and a database which are connected through a system bus. The processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a storage medium and an internal memory. The storage medium stores an operating system, a computer program and a database. The internal memory provides an environment for running the operating system and the computer program in the storage medium. The database of the computer device is used for storing data such as the cache detection script, the abstract syntax tree, the first keys, the second keys, the first array, the second array and the target cache data. The network interface of the computer device is used for communicating with an external terminal through a network connection. The display screen of the computer device is a graphic and text output device that converts digital signals into optical signals so that characters and figures are displayed on the screen. The input device of the computer device is the main means by which the user or other equipment exchanges information with the computer, and is used for inputting data, instructions and mark information. The computer program is executed by the processor to implement the method for detecting cache data.
When executing the computer program, the processor implements the following steps of the method for detecting cache data:
judging whether an input cache detection request is received, wherein the cache detection request carries item information corresponding to a target item, and the cache detection request is a request for detecting cache data which is contained in a local storage and corresponds to the target item and is not subjected to clearing processing;
if an input cache detection request is received, acquiring a pre-stored cache detection script corresponding to the item information based on the cache detection request;
running the cache detection script, and obtaining the js code of the target item based on the running logic of the cache detection script;
generating an abstract syntax tree corresponding to the js code;
traversing all objects in the abstract syntax tree, acquiring, through a preset first acquisition method, the first keys corresponding one by one to each piece of first cache data contained in the js code, and storing all the first keys in a preset first array; and
acquiring second keys corresponding to second cache data subjected to cache clearing processing in the js codes one by one through a preset second acquisition method, and storing all the second keys in a preset second array;
respectively matching the designated key with each second key contained in the second array to detect whether a target key identical to the designated key exists in the second array or not to obtain a corresponding matching result, wherein the designated key is any one of all the first keys contained in the first array, and the matching result comprises matching success or matching failure;
if the matching result is matching failure, judging that the designated cache data corresponding to the designated key is not cleared;
and taking, based on the matching results, each first key in the first array that fails to match any second key in the second array as a third key, and acquiring and displaying the target cache data corresponding to the third key.
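To make the core detection steps above concrete, the following sketch parses the js code into an abstract syntax tree and collects the first and second keys, assuming the cache is written through localStorage.setItem and cleared through localStorage.removeItem with string-literal keys; the acorn and acorn-walk packages are used only as one possible parsing toolchain, since the embodiment does not prescribe a specific parser.

```typescript
import * as acorn from "acorn";
import * as walk from "acorn-walk";

// Collect first keys (all cached data) and second keys (cleared data) from the
// js code, then return the third keys: first keys with no matching second key.
export function findUnclearedCacheKeys(jsCode: string): string[] {
  const firstKeys: string[] = [];
  const secondKeys: string[] = [];

  const ast = acorn.parse(jsCode, { ecmaVersion: 2020 });
  walk.simple(ast, {
    CallExpression(node: any) {
      const callee = node.callee;
      const firstArg = node.arguments?.[0];
      if (
        callee?.type === "MemberExpression" &&
        callee.object?.name === "localStorage" &&
        typeof firstArg?.value === "string"
      ) {
        if (callee.property?.name === "setItem") firstKeys.push(firstArg.value);
        if (callee.property?.name === "removeItem") secondKeys.push(firstArg.value);
      }
    },
  });

  const clearedKeys = new Set(secondKeys);
  return firstKeys.filter((key) => !clearedKeys.has(key)); // the third keys
}
```

Each key returned by this sketch corresponds to target cache data that was written but never cleared, which would then be acquired and displayed.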
Those skilled in the art will appreciate that the structure shown in fig. 3 is only a block diagram of a part of the structure related to the present application, and does not constitute a limitation to the apparatus and the computer device to which the present application is applied.
An embodiment of the present application further provides a computer-readable storage medium, on which a computer program is stored, where when the computer program is executed by a processor, the method for detecting cache data is implemented, and specifically:
judging whether an input cache detection request is received, wherein the cache detection request carries item information corresponding to a target item, and the cache detection request is a request for detecting cache data which is contained in a local storage and corresponds to the target item and is not subjected to clearing processing;
if an input cache detection request is received, acquiring a pre-stored cache detection script corresponding to the item information based on the cache detection request;
running the cache detection script, and obtaining the js code of the target item based on the running logic of the cache detection script;
generating an abstract syntax tree corresponding to the js code;
traversing all objects in the abstract syntax tree, acquiring, through a preset first acquisition method, the first keys corresponding one by one to each piece of first cache data contained in the js code, and storing all the first keys in a preset first array; and
acquiring second keys corresponding to second cache data subjected to cache clearing processing in the js codes one by one through a preset second acquisition method, and storing all the second keys in a preset second array;
respectively matching the designated key with each second key contained in the second array to detect whether a target key identical to the designated key exists in the second array or not to obtain a corresponding matching result, wherein the designated key is any one of all the first keys contained in the first array, and the matching result comprises matching success or matching failure;
if the matching result is matching failure, judging that the designated cache data corresponding to the designated key is not cleared;
and taking, based on the matching results, each first key in the first array that fails to match any second key in the second array as a third key, and acquiring and displaying the target cache data corresponding to the third key.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium; when the computer program is executed, the processes of the embodiments of the methods described above can be included. Any reference to memory, storage, database or other medium provided herein and used in the embodiments may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), synchronous link (Synchlink) DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, apparatus, article, or method that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, apparatus, article, or method. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, apparatus, article, or method that includes the element.
The above description is only a preferred embodiment of the present application, and not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application, or which are directly or indirectly applied to other related technical fields, are also included in the scope of the present application.

Claims (10)

1. A method for detecting cache data, comprising:
judging whether an input cache detection request is received, wherein the cache detection request carries item information corresponding to a target item, and the cache detection request is a request for detecting cache data which is contained in a local storage and corresponds to the target item and is not subjected to clearing processing;
if an input cache detection request is received, acquiring a pre-stored cache detection script corresponding to the item information based on the cache detection request;
running the cache detection script, and obtaining the js code of the target item based on the running logic of the cache detection script;
generating an abstract syntax tree corresponding to the js code;
traversing all objects in the abstract syntax tree, acquiring, through a preset first acquisition method, the first keys corresponding one by one to each piece of first cache data contained in the js code, and storing all the first keys in a preset first array; and
acquiring second keys corresponding to second cache data subjected to cache clearing processing in the js codes one by one through a preset second acquisition method, and storing all the second keys in a preset second array;
respectively matching the designated key with each second key contained in the second array to detect whether a target key identical to the designated key exists in the second array or not to obtain a corresponding matching result, wherein the designated key is any one of all the first keys contained in the first array, and the matching result comprises matching success or matching failure;
if the matching result is matching failure, judging that the designated cache data corresponding to the designated key is not cleared;
and taking, based on the matching results, each first key in the first array that fails to match any second key in the second array as a third key, and acquiring and displaying the target cache data corresponding to the third key.
2. The method for detecting cache data according to claim 1, wherein the step of obtaining a pre-stored cache detection script corresponding to the item information based on the cache detection request is preceded by:
judging whether other task requests to be processed except the cache detection request exist or not;
if other task requests to be processed except the cache detection request exist, acquiring the request quantity of the other task requests;
judging whether the request quantity is larger than a preset quantity threshold value or not;
if the number of the requests is larger than the number threshold, screening out a specified number of target task requests from the other task requests according to a preset rule;
judging whether a specific task request whose processing time cannot be adjusted exists in the target task requests;
if the specific task request exists in the target task requests, removing the specific task request from the target task requests to obtain the remaining target task requests;
and adjusting the remaining target task requests to a preset idle time period for processing, wherein the idle time period is different from the processing time period of the cache detection request.
3. The method for detecting the cache data according to claim 2, wherein the step of screening out a specified number of target task requests from the other task requests according to a preset rule comprises:
determining a resource consumption amount of each of the other task requests;
sequencing all the other task requests according to the sequence of the resource consumption from large to small to obtain a corresponding first sequencing result;
sequentially acquiring a plurality of first task requests with the same number as the specified number from other task requests ranked at the head in the first ranking result;
all the first task requests are taken as the target task requests.
4. The method for detecting the cache data according to claim 2, wherein the step of screening out a specified number of target task requests from the other task requests according to a preset rule comprises:
acquiring the processing priority of each other task request based on a preset task request priority table;
sequencing all the other task requests according to the sequence of the processing priorities from low to high to obtain a corresponding second sequencing result;
sequentially acquiring a plurality of second task requests with the same number as the specified number from other task requests ranked at the head in the second sequencing result;
and taking the second task request as the target task request.
5. The method according to claim 1, wherein after the step of taking, based on the matching results, each first key in the first array that fails to match any second key in the second array as a third key, and acquiring and displaying the target cache data corresponding to the third key, the method comprises:
acquiring the writing time of the target cache data; and
acquiring the expiration duration of the target cache data;
calculating an expiration time point corresponding to the target cache data based on the writing time and the expiration duration;
acquiring current time, and judging whether the current time exceeds the expiration time point;
if the current time exceeds the expiration time point, clearing the target cache data;
if the current time does not exceed the expiration time point, calculating the difference value between the expiration time point and the current time;
and generating corresponding overdue reminding information based on the target cache data and the difference value, and displaying the overdue reminding information.
6. The method for detecting cache data according to claim 1, wherein the step of obtaining a pre-stored cache detection script corresponding to the item information based on the cache detection request is preceded by:
acquiring a pre-stored detection script template based on the cache detection request;
analyzing the cache detection request, and extracting the project information;
filling the detection script template by using the project information to obtain a filled detection script template;
and taking the filled detection script template as the cache detection script.
7. The method according to claim 1, wherein after the step of taking, based on the matching results, each first key in the first array that fails to match any second key in the second array as a third key, and acquiring and displaying the target cache data corresponding to the third key, the method comprises:
generating alarm information corresponding to the target cache data based on the target cache data;
acquiring preset mail login information and acquiring a preset mail address;
logging in to a corresponding mail server based on the mail login information;
and sending the alarm information to the preset mail address through the mail server.
8. A device for detecting cache data, comprising:
the first judging module is used for judging whether an input cache detection request is received, wherein the cache detection request carries item information corresponding to a target item, and the cache detection request is a request for detecting cache data which is contained in the local storage, corresponds to the target item and has not been subjected to clearing processing;
the first obtaining module is used for obtaining a pre-stored cache detection script corresponding to the item information based on the cache detection request if an input cache detection request is received;
the second obtaining module is used for running the cache detection script and obtaining the js code of the target item based on the running logic of the cache detection script;
the first generation module is used for generating an abstract syntax tree corresponding to the js code;
the first storage module is used for traversing all objects in the abstract syntax tree, acquiring, through a preset first acquisition method, the first keys corresponding one by one to each piece of first cache data contained in the js code, and storing all the first keys in a preset first array; and
the second storage module is used for acquiring, through a preset second acquisition method, the second keys corresponding one by one to the second cache data subjected to cache clearing processing in the js code, and storing all the second keys in a preset second array;
the processing module is configured to perform matching processing on an assigned key and each second key included in the second array respectively to detect whether a target key identical to the assigned key exists in the second array, so as to obtain a corresponding matching result, where the assigned key is any one of all first keys included in the first array, and the matching result includes a matching success or a matching failure;
the judging module is used for judging that the designated cache data corresponding to the designated key is not cleared if the matching result is matching failure;
and the display module is used for taking, based on the matching results, each first key in the first array that fails to match any second key in the second array as a third key, and acquiring and displaying the target cache data corresponding to the third key.
9. A computer device comprising a memory and a processor, the memory having stored therein a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method according to any one of claims 1 to 7.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 7.
CN202011479824.3A 2020-12-15 2020-12-15 Cache data detection method and device, computer equipment and storage medium Active CN112540811B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011479824.3A CN112540811B (en) 2020-12-15 2020-12-15 Cache data detection method and device, computer equipment and storage medium
PCT/CN2021/091713 WO2022126984A1 (en) 2020-12-15 2021-04-30 Cache data detection method and apparatus, computer device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011479824.3A CN112540811B (en) 2020-12-15 2020-12-15 Cache data detection method and device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112540811A true CN112540811A (en) 2021-03-23
CN112540811B CN112540811B (en) 2022-03-18

Family

ID=75018808

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011479824.3A Active CN112540811B (en) 2020-12-15 2020-12-15 Cache data detection method and device, computer equipment and storage medium

Country Status (2)

Country Link
CN (1) CN112540811B (en)
WO (1) WO2022126984A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113626483A (en) * 2021-08-18 2021-11-09 重庆允成互联网科技有限公司 Front-end caching method, system, equipment and storage medium for filling forms
CN113923002A (en) * 2021-09-29 2022-01-11 山石网科通信技术股份有限公司 Computer network intrusion prevention method and device, storage medium and processor
WO2022126984A1 (en) * 2020-12-15 2022-06-23 平安科技(深圳)有限公司 Cache data detection method and apparatus, computer device and storage medium

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115599711B (en) * 2022-11-30 2023-03-10 苏州浪潮智能科技有限公司 Cache data processing method, system, device, equipment and computer storage medium
CN116016261B (en) * 2022-12-26 2024-05-14 广东保伦电子股份有限公司 System operation and maintenance method, device and equipment
CN116112561B (en) * 2023-02-14 2024-02-09 江西数字网联信息安全技术有限公司 Visual management method and system for 3d Internet of vehicles based on web browser cache
CN117390072B (en) * 2023-12-07 2024-03-26 深圳市云希谷科技有限公司 Method for improving network request speed in embedded system

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140201838A1 (en) * 2012-01-31 2014-07-17 Db Networks, Inc. Systems and methods for detecting and mitigating threats to a structured data storage system
CN107168872A (en) * 2017-05-11 2017-09-15 网易(杭州)网络有限公司 Method, device, storage medium and the processor of code check
CN108647156A (en) * 2018-04-10 2018-10-12 平安科技(深圳)有限公司 Cache cleaner method, apparatus, computer installation and storage medium
CN111176754A (en) * 2019-12-25 2020-05-19 搜游网络科技(北京)有限公司 HTML5 application running method, device, runner and computer readable storage medium
CN111563216A (en) * 2020-07-16 2020-08-21 平安国际智慧城市科技股份有限公司 Local data caching method and device and related equipment
CN111752975A (en) * 2020-05-28 2020-10-09 中国平安财产保险股份有限公司 Data loading method and device based on Redis, computer equipment and storage medium
CN112035496A (en) * 2020-08-28 2020-12-04 平安科技(深圳)有限公司 Data processing method, related equipment and computer readable storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9531829B1 (en) * 2013-11-01 2016-12-27 Instart Logic, Inc. Smart hierarchical cache using HTML5 storage APIs
CN104615596B (en) * 2013-11-04 2019-08-13 腾讯科技(深圳)有限公司 The sweep-out method and browser of history information
CN106383748A (en) * 2016-09-05 2017-02-08 Tcl集团股份有限公司 Cloud service-based storage space clearing method and system
CN110688307B (en) * 2019-09-09 2023-11-17 国信金宏信息咨询有限责任公司 JavaScript code detection method, device, equipment and storage medium
CN111897813B (en) * 2020-07-08 2022-09-23 苏宁金融科技(南京)有限公司 Flow control method and device for database resources
CN112540811B (en) * 2020-12-15 2022-03-18 平安科技(深圳)有限公司 Cache data detection method and device, computer equipment and storage medium


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022126984A1 (en) * 2020-12-15 2022-06-23 平安科技(深圳)有限公司 Cache data detection method and apparatus, computer device and storage medium
CN113626483A (en) * 2021-08-18 2021-11-09 重庆允成互联网科技有限公司 Front-end caching method, system, equipment and storage medium for filling forms
CN113923002A (en) * 2021-09-29 2022-01-11 山石网科通信技术股份有限公司 Computer network intrusion prevention method and device, storage medium and processor
CN113923002B (en) * 2021-09-29 2024-04-19 山石网科通信技术股份有限公司 Computer network intrusion prevention method, device, storage medium and processor

Also Published As

Publication number Publication date
WO2022126984A1 (en) 2022-06-23
CN112540811B (en) 2022-03-18

Similar Documents

Publication Publication Date Title
CN112540811B (en) Cache data detection method and device, computer equipment and storage medium
US10467316B2 (en) Systems and methods for web analytics testing and web development
CN110752969B (en) Performance detection method, device, equipment and medium
CN111475370A (en) Operation and maintenance monitoring method, device and equipment based on data center and storage medium
CN108959337A (en) Big data acquisition methods, device, equipment and storage medium
CN113326081A (en) Static resource processing method and device, computer equipment and storage medium
CN112035437B (en) Transmission method and device for medical records data, computer equipment and storage medium
CN112613067A (en) User behavior data acquisition method and device, computer equipment and storage medium
CN112668041A (en) Document file generation method and device, computer equipment and storage medium
CN113918526A (en) Log processing method and device, computer equipment and storage medium
CN112597158A (en) Data matching method and device, computer equipment and storage medium
CN113742776A (en) Data verification method and device based on biological recognition technology and computer equipment
CN114595127A (en) Log exception handling method, device, equipment and storage medium
CN116112194A (en) User behavior analysis method and device, electronic equipment and computer storage medium
CN111880921A (en) Job processing method and device based on rule engine and computer equipment
CN112965981B (en) Data checking method, device, computer equipment and storage medium
CN114237886A (en) Task processing method and device, computer equipment and storage medium
CN113626285A (en) Model-based job monitoring method and device, computer equipment and storage medium
CN113986581A (en) Data aggregation processing method and device, computer equipment and storage medium
CN113672654A (en) Data query method and device, computer equipment and storage medium
CN114398441B (en) Data export method, device, computer equipment and storage medium
CN112650659B (en) Buried point setting method and device, computer equipment and storage medium
CN113535260B (en) Simulator-based data processing method, device, equipment and storage medium
CN113191146B (en) Appeal data distribution method and device, computer equipment and storage medium
CN115168509A (en) Processing method and device of wind control data, storage medium and computer equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant