CN117056629A - Cache configuration method, device, computer equipment and storage medium - Google Patents

Cache configuration method, device, computer equipment and storage medium

Info

Publication number
CN117056629A
CN117056629A
Authority
CN
China
Prior art keywords
page
target
information
image
application
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311038604.0A
Other languages
Chinese (zh)
Inventor
韦金记
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Property and Casualty Insurance Company of China Ltd
Original Assignee
Ping An Property and Casualty Insurance Company of China Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Property and Casualty Insurance Company of China Ltd
Priority to CN202311038604.0A
Publication of CN117056629A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/90 Details of database functions independent of the retrieved data types
    • G06F 16/95 Retrieval from the web
    • G06F 16/957 Browsing optimisation, e.g. caching or content distillation
    • G06F 16/9574 Browsing optimisation, e.g. caching or content distillation of access to content, e.g. by caching
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/445 Program loading or initiating
    • G06F 9/44521 Dynamic linking or loading; Link editing at or after load time, e.g. Java class loading
    • G06F 9/44526 Plug-ins; Add-ons
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The application belongs to the field of artificial intelligence and financial technology, and relates to a cache configuration method comprising the following steps: if a cache configuration request for a target application is received, installing a third party library in the React project of the target application and displaying an information configuration page; receiving attribute information, target page information and target jump page information entered by a user on the information configuration page; adding a target attribute to the routing configuration of the target application based on the attribute information; introducing a first target component and a second target component from the third party library; wrapping the target application based on the first target component and wrapping the target page information based on the second target component; and generating and configuring a page clearing rule based on the target jump page information. The application also provides a cache configuration device, computer equipment and a storage medium. In addition, the attribute information of the present application may be stored in a blockchain. The application can be applied to cache configuration scenarios in the financial field and improves the processing efficiency of page cache configuration.

Description

Cache configuration method, device, computer equipment and storage medium
Technical Field
The present application relates to the field of artificial intelligence development technology and financial technology, and in particular, to a cache configuration method, a cache configuration device, a computer device, and a storage medium.
Background
With the rapid development of science and technology, application programs are becoming more and more widespread. Financial technology companies, such as insurance companies or banks, usually develop applications serving their own business, and the React framework is often used in these applications. Because the React framework itself does not support a keep-alive cache function, a scenario like the following is typically encountered in a React project: a user does not find the desired data on the first page of list A, turns to the second page, finds the data there, clicks to view its details, clicks to return to list A, and then finds that the page number has reset to the first page, so the user has to turn to the second page again to find the data; in other words, when jumping from one page to another, list A is not cached. This makes the user's interaction with the application cumbersome and less user friendly, which is the existing problem that pages are not cached. In the existing processing mode, the data of a page is saved in storage within the application project; when the page is loaded, it is judged whether the data exists in storage, and if so, the data in storage is loaded, otherwise the data is obtained directly from the interface and re-rendered. However, this processing mode requires handling each page individually, that is, corresponding storage code has to be written for each page, so the workload of cache configuration for the application is high and the efficiency of the cache configuration is low.
Disclosure of Invention
The embodiments of the present application aim to provide a cache configuration method, a cache configuration device, computer equipment and a storage medium, so as to solve the technical problem that the existing processing mode of application cache configuration has to handle each page individually, that is, corresponding storage code has to be written for each page, so that the workload of cache configuration of the application is large and the efficiency of the cache configuration of the application is low.
In order to solve the above technical problems, an embodiment of the present application provides a cache configuration method, which adopts the following technical scheme:
judging whether a cache configuration request for a target application triggered by a user is received or not;
if yes, a third party library is installed in the React project of the target application, and a preset information configuration page is displayed;
receiving attribute information input by the user on the information configuration page, target page information and target skip page information corresponding to the target page information;
adding a target attribute corresponding to the attribute information in the routing configuration of the target application based on the attribute information;
introducing a first target component and a second target component from the third party library;
performing wrapping processing on the target application based on the first target component, and performing wrapping processing on the target page information based on the second target component;
and generating a page clearing rule based on the target jump page information, and configuring the clearing rule in the target application.
Further, after the step of generating a page clearing rule based on the target jump page information and configuring the clearing rule in the target application, the method further includes:
when a target user accesses a first page of the target application, receiving a page jump request triggered by the target user in the first page;
recording a page path of a second page to which the first page is going;
loading a second page corresponding to the page path;
acquiring a specified attribute corresponding to the first page based on the clearing rule;
acquiring appointed skip page information corresponding to the second target component from the appointed attribute;
judging whether the page path is contained in the designated jump page information;
and if the page path is not included, calling a preset clearing method to perform clearing cache processing corresponding to the second page.
Further, the step of loading the second page corresponding to the page path specifically includes:
performing text extraction on the second page to obtain text data, and detecting whether a sensitive word exists in the text data in the second page;
if no sensitive word exists, image extraction is carried out on the second page to obtain image data, and whether illegal images exist in the image data in the second page is detected;
and if the illegal image does not exist, loading a second page corresponding to the page path.
Further, the step of performing text extraction on the second page to obtain text data and detecting whether a sensitive word exists in the text data in the second page specifically includes:
performing text extraction on the second page to obtain text data in the second page;
word segmentation processing is carried out on the text data to obtain a plurality of corresponding words;
acquiring a preset sensitive word set;
respectively carrying out matching processing on each word and the sensitive keywords in the sensitive word set, and judging whether specified words successfully matched with the sensitive keywords exist in all the words;
If the appointed word successfully matched exists, judging that the sensitive word exists in the text data in the second page, otherwise, judging that the sensitive word does not exist in the text data in the second page.
Further, after the step of performing text extraction on the second page to obtain text data and detecting whether the text data in the second page has sensitive words, the method further includes:
if the text data in the second page contains sensitive words, loading the second page is limited;
acquiring preset page error information, and displaying the page error information in the first page;
generating processing alarm information corresponding to the sensitive word;
acquiring communication information of operation and maintenance personnel;
and pushing the processing alarm information to the operation and maintenance personnel based on the communication information.
Further, the step of extracting the image of the second page to obtain image data and detecting whether the image data in the second page has a violation image specifically includes:
extracting an image of the second page to obtain image data in the second page;
calling a preset image detection model;
inputting the image data into the image detection model, and performing violation detection on the image data through the image detection model to generate a violation detection result corresponding to the image data;
and identifying whether a violation image exists in the image data in the second page based on the violation detection result.
Further, after the step of extracting the image of the second page to obtain image data and detecting whether the image data in the second page has a violation image, the method further includes:
if the image data in the second page has illegal images, acquiring a preset image template;
acquiring the position information of the violation image in the second page;
based on the position information, performing coverage processing on the illegal images in the second page by using the image template to obtain a target image;
and loading the second page.
In order to solve the above technical problems, the embodiment of the present application further provides a cache configuration device, which adopts the following technical scheme:
the first judging module is used for judging whether a cache configuration request for a target application triggered by a user is received or not;
the installation module is used for installing a third party library in the React project of the target application if yes, and displaying a preset information configuration page;
The first receiving module is used for receiving attribute information, target page information and target skip page information corresponding to the target page information which are input by the user on the information configuration page;
an adding module, configured to add, based on the attribute information, a target attribute corresponding to the attribute information in a routing configuration of the target application;
the first processing module is used for introducing a first target component and a second target component from the third party library;
the second processing module is used for performing wrapping processing on the target application based on the first target component and performing wrapping processing on the target page information based on the second target component;
and the configuration module is used for generating a page clearing rule based on the target skip page information and configuring the clearing rule in the target application.
In order to solve the above technical problems, the embodiment of the present application further provides a computer device, which adopts the following technical schemes:
judging whether a cache configuration request for a target application triggered by a user is received or not;
if yes, a third party library is installed in the React project of the target application, and a preset information configuration page is displayed;
Receiving attribute information input by the user on the information configuration page, target page information and target skip page information corresponding to the target page information;
adding a target attribute corresponding to the attribute information in the routing configuration of the target application based on the attribute information;
introducing a first target component and a second target component from the third party library;
performing wrapping processing on the target application based on the first target component, and performing wrapping processing on the target page information based on the second target component;
and generating a page clearing rule based on the target jump page information, and configuring the clearing rule in the target application.
In order to solve the above technical problems, an embodiment of the present application further provides a computer readable storage medium, which adopts the following technical schemes:
judging whether a cache configuration request for a target application triggered by a user is received or not;
if yes, a third party library is installed in the React project of the target application, and a preset information configuration page is displayed;
receiving attribute information input by the user on the information configuration page, target page information and target skip page information corresponding to the target page information;
Adding a target attribute corresponding to the attribute information in the routing configuration of the target application based on the attribute information;
introducing a first target component and a second target component from the third party library;
performing wrapping processing on the target application based on the first target component, and performing wrapping processing on the target page information based on the second target component;
and generating a page clearing rule based on the target jump page information, and configuring the clearing rule in the target application.
Compared with the prior art, the embodiment of the application has the following main beneficial effects:
the embodiments of the present application first judge whether a cache configuration request for a target application triggered by a user is received; if yes, a third party library is installed in the React project of the target application, and a preset information configuration page is displayed; then attribute information, target page information and target jump page information corresponding to the target page information input by the user on the information configuration page are received; then, based on the attribute information, a target attribute corresponding to the attribute information is added in the routing configuration of the target application; subsequently, a first target component and a second target component are introduced from the third party library, the target application is wrapped based on the first target component, and the target page information is wrapped based on the second target component; finally, a page clearing rule is generated based on the target jump page information, and the clearing rule is configured in the target application. According to the cache configuration method provided by the embodiments of the present application, the cache function of the pages of the target application can be realized simply by adding the attribute to the routing configuration of the target application according to the attribute information, the target page information and the target jump page information input by the user on the information configuration page, introducing the components from the third party library, and wrapping the target application and the pages that need to be cached based on the introduced components. Therefore, corresponding storage code does not need to be written for each page that needs to be cached in the target application, which effectively reduces the workload of page cache configuration and improves the processing efficiency of page cache configuration. In addition, a page clearing rule is generated based on the target jump page information and configured in the target application, so that page cache optimization of the target application is realized based on the personalized requirements of the user, the occupied space of the page cache is reduced, and the processing intelligence of the page cache is improved.
Drawings
In order to more clearly illustrate the solution of the present application, a brief description will be given below of the drawings required for the description of the embodiments of the present application, it being apparent that the drawings in the following description are some embodiments of the present application, and that other drawings may be obtained from these drawings without the exercise of inventive effort for a person of ordinary skill in the art.
FIG. 1 is an exemplary system architecture diagram in which the present application may be applied;
FIG. 2 is a flow chart of one embodiment of a cache configuration method according to the present application;
FIG. 3 is a schematic diagram illustrating the structure of one embodiment of a cache configuration apparatus according to the present application;
FIG. 4 is a schematic structural diagram of one embodiment of a computer device in accordance with the present application.
Detailed Description
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs; the terminology used in the description of the applications herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application; the terms "comprising" and "having" and any variations thereof in the description of the application and the claims and the description of the drawings above are intended to cover a non-exclusive inclusion. The terms first, second and the like in the description and in the claims or in the above-described figures, are used for distinguishing between different objects and not necessarily for describing a sequential or chronological order.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments.
In order to make the person skilled in the art better understand the solution of the present application, the technical solution of the embodiment of the present application will be clearly and completely described below with reference to the accompanying drawings.
As shown in fig. 1, a system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 is used as a medium to provide communication links between the terminal devices 101, 102, 103 and the server 105. The network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others.
The user may interact with the server 105 via the network 104 using the terminal devices 101, 102, 103 to receive or send messages or the like. Various communication client applications, such as a web browser application, a shopping class application, a search class application, an instant messaging tool, a mailbox client, social platform software, etc., may be installed on the terminal devices 101, 102, 103.
The terminal devices 101, 102, 103 may be various electronic devices having a display screen and supporting web browsing, including but not limited to smartphones, tablet computers, electronic book readers, MP3 (Moving Picture Experts Group Audio Layer III) players, MP4 (Moving Picture Experts Group Audio Layer IV) players, laptop computers and desktop computers, and the like.
The server 105 may be a server providing various services, such as a background server providing support for pages displayed on the terminal devices 101, 102, 103.
It should be noted that, the cache configuration method provided by the embodiment of the present application is generally executed by a server/terminal device, and accordingly, the cache configuration device is generally disposed in the server/terminal device.
The embodiment of the application can acquire and process the related data based on artificial intelligence technology. Artificial intelligence (AI) is the theory, method, technique and application system that uses a digital computer or a digital computer-controlled machine to simulate, extend and expand human intelligence, sense the environment, acquire knowledge and use knowledge to obtain optimal results.
Artificial intelligence infrastructure technologies generally include technologies such as sensors, dedicated artificial intelligence chips, cloud computing, distributed storage, big data processing technologies, operation/interaction systems, mechatronics, and the like. The artificial intelligence software technology mainly comprises a computer vision technology, a robot technology, a biological recognition technology, a voice processing technology, a natural language processing technology, machine learning/deep learning and other directions.
It should be understood that the number of terminal devices, networks and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
With continued reference to FIG. 2, a flow chart of one embodiment of a cache configuration method in accordance with the present application is shown. The order of the steps in the flowchart may be changed and some steps may be omitted according to various needs. The cache configuration method provided by the embodiment of the application can be applied to any scene needing page cache configuration, and can be applied to products of the scenes, such as page cache configuration of financial insurance application in the field of financial insurance. The cache configuration method comprises the following steps:
Step S201, determining whether a user-triggered cache configuration request for a target application is received.
In this embodiment, the electronic device (e.g., the server/terminal device shown in fig. 1) on which the cache configuration method operates may obtain the cache configuration request through a wired connection manner or a wireless connection manner. It should be noted that the wireless connection may include, but is not limited to, 3G/4G/5G connection, wiFi connection, bluetooth connection, wiMAX connection, zigbee connection, UWB (ultra wideband) connection, and other now known or later developed wireless connection. In the application scenario in the field of financial science and technology, the target application may specifically be an insurance application, a banking application, a transaction application, a payment application, a medical application, and so on. The user may trigger a cache configuration request for the target application by clicking on the cache configuration button for that target application.
Step S202, if yes, a third party library is installed in the React project of the target application, and a preset information configuration page is displayed.
In this embodiment, the third party library is specifically the react-activation library. The React project itself does not support keep-alive caching.
Step S203, receiving attribute information, target page information and target skip page information corresponding to the target page information, which are input by the user on the information configuration page.
In this embodiment, the attribute information refers to the meta attribute, and may include, for example: const routes = [{ name: 'mall management', path: '/foods-management', components: '/supplier/goodsManage', meta: { keepAlive: { toPath: '/foods-management/discovery' } } }]. The target page information refers to the page to be cached in the target application. The target jump page information refers to the page information of the specified page to which the page to be cached can jump while remaining cached.
Step S204, adding a target attribute corresponding to the attribute information in the routing configuration of the target application based on the attribute information.
In this embodiment, the route configuration of the target application may be obtained first, and then the target attribute corresponding to the attribute information may be added to the route configuration of the target application, that is, the meta attribute may be added to the route configuration of the target application.
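As a minimal sketch of this step (the route names, paths and component references below are illustrative assumptions, not values taken from the application), adding the target attribute amounts to extending the relevant route entry with a meta field:

    // Hypothetical routing configuration of the target application.
    const routes = [
      // a page that does not need caching keeps its original configuration
      { name: 'order detail', path: '/order-detail', component: '/order/orderDetail' },
      // the page to be cached gains the target attribute (meta) built from the
      // attribute information entered on the information configuration page
      {
        name: 'order list',
        path: '/order-list',
        component: '/order/orderList',
        meta: { keepAlive: { toPath: '/order-detail' } },
      },
    ];

Here toPath records the only jump target for which the cached state of the list page should be preserved.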
Step S205, introducing a first target component and a second target component from the third party library.
In this embodiment, the first target component is specifically AliveScope, and the second target component is specifically KeepAlive. The first target component and the second target component may be introduced from the third party library in the root component of the target application.
Step S206, performing wrapping processing on the target application based on the first target component, and performing wrapping processing on the target page information based on the second target component.
In this embodiment, the whole target application may be wrapped by <AliveScope/>, and the page to be cached in the target application may be wrapped by <KeepAlive/>, so that keep-alive caching of the pages of the target application may be realized.
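The two wrapping steps can be sketched as follows; apart from the AliveScope and KeepAlive components exported by react-activation, the component and file names are assumptions of this illustration:

    // npm install react-activation   (installs the third party library into the React project)
    import React from 'react';
    import KeepAlive, { AliveScope } from 'react-activation';
    import GoodsList from './pages/GoodsList'; // hypothetical page that needs to be cached

    // First target component: the whole target application is wrapped by <AliveScope/>.
    function App() {
      return (
        <AliveScope>
          {/* routing / layout of the target application */}
          <CachedGoodsList />
        </AliveScope>
      );
    }

    // Second target component: the page to be cached is wrapped by <KeepAlive/>.
    function CachedGoodsList() {
      return (
        <KeepAlive name="goods-list">
          <GoodsList />
        </KeepAlive>
      );
    }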
Step S207, generating a page clearing rule based on the target skip page information, and configuring the clearing rule in the target application.
In this embodiment, although the above steps implement keep-alive caching of the pages of the target application, any page in the target application is cached on every page jump, which does not meet the personalized requirement; the personalized requirement means that a page should be cached only when it jumps to a specified page. A page clearing rule is therefore generated based on the target jump page information for further cache optimization of the target application. The content of the clearing rule includes: if the path of the page jump is exactly the toPath of keepAlive under the meta attribute in the page's own routing configuration, no operation is performed (because this page is already wrapped by <KeepAlive/> and is already in the cached state); if the jump path is not the toPath of keepAlive under the meta attribute in the page's own routing configuration, the clearing cache processing of the page is performed using the clear method under useAliveController provided in the third party library react-activation.
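The clearing rule can be illustrated with the following sketch; useAliveController and its clear method come from react-activation as described above, while the hook name, the route object shape and the navigation callback are assumptions of this illustration:

    import { useAliveController } from 'react-activation';

    // Clearing rule generated from the target jump page information.
    function useClearRule(currentRoute) {
      const { clear } = useAliveController(); // clear() empties the page cache

      // hypothetical callback invoked by the routing layer when a page jump happens
      return function onJump(nextPath) {
        const toPath = currentRoute.meta && currentRoute.meta.keepAlive
          ? currentRoute.meta.keepAlive.toPath
          : undefined;
        if (nextPath === toPath) {
          // the jump goes exactly to the configured toPath: do nothing, the page
          // is already wrapped by <KeepAlive/> and stays in the cached state
          return;
        }
        // the jump goes elsewhere: perform the clearing cache processing
        clear();
      };
    }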
According to the cache configuration method provided by the present application, the cache function of the pages of the target application can be realized simply by adding the attribute to the routing configuration of the target application according to the attribute information, the target page information and the target jump page information input by the user on the information configuration page, introducing the components from the third party library, and wrapping the target application and the pages that need to be cached based on the introduced components, so that corresponding storage code does not need to be written for each page that needs to be cached in the target application, which effectively reduces the workload of page cache configuration and improves the processing efficiency of page cache configuration. In addition, a page clearing rule is generated based on the target jump page information and configured in the target application, so that page cache optimization of the target application is realized based on the personalized requirements of the user, the occupied space of the page cache is reduced, and the processing intelligence of the page cache is improved.
In some alternative implementations, after step S207, the electronic device may further perform the following steps:
and when the target user accesses a first page of the target application, receiving a page jump request triggered by the target user in the first page.
In this embodiment, the first page may be any page of the target application, for example, may refer to a first page, a setting page, and the like of the target application.
And recording the page path of the second page to which the first page is going.
In this embodiment, when the target user triggers a page jump request in the first page, the processing of recording the page path of the second page to which the first page is going may be triggered at the same time. In an insurance application scenario, the target application is an insurance application, and the second page may be an insurance product introduction page, an insurance consultation page, an insurance payment page, or the like in the insurance application.
And loading a second page corresponding to the page path.
In this embodiment, the foregoing specific implementation process of loading the second page corresponding to the page path will be described in further detail in the following specific embodiments, which will not be described herein.
And acquiring the designated attribute corresponding to the first page based on the clearing rule.
In this embodiment, the above specified attribute refers to the meta attribute of the first page.
And acquiring the designated jump page information corresponding to the second target component from the designated attribute.
In this embodiment, the designated jump page information may be the page information, input according to the actual service usage requirement, of the designated page for which the first page remains cached when the first page jumps to it.
And judging whether the page path is contained in the designated jump page information.
In this embodiment, the information matching may be performed between the page path and the designated hop page information, so as to determine whether the page path is included in the designated hop page information.
And if the page path is not included, calling a preset clearing method to perform clearing cache processing corresponding to the second page.
In this embodiment, the above-mentioned clearing method refers to the clear method for clearing the cache under useAliveController provided in the third party library react-activation.
When a target user accesses a first page of the target application, a page jump request triggered by the target user in the first page is received; then a page path of a second page to which the first page is going is recorded, and the second page corresponding to the page path is loaded; then, based on the clearing rule, a designated attribute corresponding to the first page is acquired; subsequently, designated jump page information corresponding to the second target component is acquired from the designated attribute; finally, whether the page path is contained in the designated jump page information is judged, and if the page path is not contained, a preset clearing method is called to perform clearing cache processing corresponding to the second page. When the page jump request triggered by the target user in the first page is detected, whether the page path of the second page to which the first page is going is contained in the designated jump page information of the second target component stored in the designated attribute of the first page can be judged intelligently based on the clearing rule; when it is determined that the page path is not contained in the designated jump page information, the preset clearing method is called intelligently to perform the clearing cache processing corresponding to the second page. That is, the page cache is kept only when the page to which the first page jumps is the matched designated page, and when the page to which the first page jumps is not the matched designated page, the clearing method is used to perform the clearing cache processing of the page. In this way, the optimization of the page cache of the first page of the target application can be realized effectively, the workload and occupied space of the page cache are reduced, and the processing intelligence of the page cache is improved.
In some optional implementations of this embodiment, the loading a second page corresponding to the page path includes the steps of:
and extracting the characters of the second page to obtain text data, and detecting whether sensitive words exist in the text data in the second page.
In this embodiment, the specific implementation process of performing text extraction on the second page to obtain text data and detecting whether a sensitive word exists in the text data in the second page will be described in further detail in the following specific embodiments, and is not described herein.
And if the sensitive word does not exist, carrying out image extraction on the second page to obtain image data, and detecting whether the image data in the second page has illegal images or not.
In this embodiment, the specific implementation process of extracting the image from the second page to obtain the image data and detecting whether the image data in the second page has the offending image is described in further detail in the following specific embodiment, which will not be described herein.
And if the illegal image does not exist, loading a second page corresponding to the page path.
Text data is obtained by carrying out text extraction on the second page, and whether sensitive words exist in the text data in the second page is detected; if no sensitive word exists, image extraction is carried out on the second page to obtain image data, and whether illegal images exist in the image data in the second page is detected; and if the illegal image does not exist, loading a second page corresponding to the page path. Before loading the second page corresponding to the page path, the method intelligently detects the sensitive words and the illegal images of the second page, and only when the second page is detected to have no sensitive words and illegal images, the second page is loaded and displayed later, so that the data normalization and accuracy of the page to be displayed in the target application are effectively ensured, and the page consulting experience of the user is guaranteed.
In some optional implementations, the step of performing text extraction on the second page to obtain text data and detecting whether a sensitive word exists in the text data in the second page includes the following steps:
and extracting the text of the second page to obtain text data in the second page.
In this embodiment, an existing text extraction algorithm may be used to perform text extraction on the second page, so as to obtain text data in the second page.
And performing word segmentation processing on the text data to obtain a plurality of corresponding words.
In this embodiment, a word segmenter, for example, the jieba word segmenter, may be used to perform word segmentation on the text data, so as to obtain a plurality of corresponding words.
And acquiring a preset sensitive word set.
In this embodiment, the set of sensitive words is a set of a plurality of sensitive words that are screened and determined in advance according to actual service usage requirements and are common in the application service. For different service scenes, sensitive word sets matched with the different service scenes respectively are preset. Illustratively, a set of sensitive words corresponding to the vehicle insurance business scenario is pre-constructed, a set of sensitive words corresponding to the health insurance business scenario is constructed, a set of sensitive words corresponding to the medical business scenario is constructed, and so on.
And respectively carrying out matching processing on each word and the sensitive keywords in the sensitive word set, and judging whether specified words successfully matched with the sensitive keywords exist in all the words.
In this embodiment, each word in the text data may be matched with each sensitive keyword in the sensitive word set by using a data parallel matching instruction, so as to improve the processing efficiency of word matching.
If the appointed word successfully matched exists, judging that the sensitive word exists in the text data in the second page, otherwise, judging that the sensitive word does not exist in the text data in the second page.
In this embodiment, if there is no specified term successfully matched with the sensitive keywords among all the terms, it is determined that there is no sensitive word in the text data in the second page.
Text data in the second page is obtained by extracting the text of the second page; then word segmentation processing is carried out on the text data to obtain a plurality of corresponding words; then acquiring a preset sensitive word set; subsequently, each word is matched with a sensitive keyword in the sensitive word set respectively, and whether specified words successfully matched with the sensitive keywords exist in all the words is judged; if the appointed word successfully matched exists, judging that the sensitive word exists in the text data in the second page, otherwise, judging that the sensitive word does not exist in the text data in the second page. Before loading the second page corresponding to the page path, the method intelligently uses the sensitive word set to detect the sensitive words of the second page, and only when the fact that the sensitive words do not exist in the second page is detected, the second page is loaded and displayed later, so that the data normalization and accuracy of the page to be displayed in the target application are effectively ensured, and the page consulting experience of a user is guaranteed.
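A minimal sketch of the matching step follows; the segment() tokenizer and the example word entries are assumptions of this illustration (the embodiment above mentions a jieba-style word segmenter), and the data-parallel matching instruction mentioned above is omitted here for brevity:

    // preset sensitive word set screened in advance for the business scenario (illustrative entries)
    const sensitiveWordSet = new Set(['exampleSensitiveWordA', 'exampleSensitiveWordB']);

    // segment() stands in for the word segmenter that splits the extracted text into words
    function hasSensitiveWord(textData, segment) {
      const words = segment(textData);                          // word segmentation processing
      return words.some((word) => sensitiveWordSet.has(word));  // matching against the set
    }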
In some optional implementations, after the step of performing text extraction on the second page to obtain text data and detecting whether there is a sensitive word in the text data in the second page, the electronic device may further perform the following steps:
and if the sensitive words exist in the text data in the second page, limiting loading the second page.
In this embodiment, limiting the loading of the second page refers to not exposing the second page on the target application.
Acquiring preset page error information, and displaying the page error information in the first page.
In this embodiment, the page error information is created in advance according to the actual error display requirement. The content of the page error information is not limited, and may be, for example, "the current jump page failed and cannot be viewed temporarily".
And generating processing alarm information corresponding to the sensitive word.
In this embodiment, the alert template may be obtained, and then the sensitive words in the text data may be filled into the alert template to generate the corresponding processed alert information. The alarm template is a template file which is created in advance according to the actual alarm processing requirement.
And acquiring communication information of operation and maintenance personnel.
In this embodiment, the above-mentioned operation and maintenance personnel may refer to operation and maintenance personnel of the target application. The communication information may include mail addresses, telephone numbers, etc.
And pushing the processing alarm information to the communication personnel based on the communication information.
After detecting that sensitive words exist in the text data in the second page, the present application limits loading of the second page; then preset page error information is acquired and displayed in the first page; then processing alarm information corresponding to the sensitive words is generated; subsequently, the communication information of the operation and maintenance personnel is acquired; finally, the processing alarm information is pushed to the operation and maintenance personnel based on the communication information. By limiting the loading of the second page when sensitive words exist in its text data, the present application can effectively avoid the poor user experience caused by displaying a page containing objectionable sensitive words. Moreover, the processing alarm information corresponding to the sensitive words can be generated intelligently and pushed to the operation and maintenance personnel based on their communication information, so that the operation and maintenance personnel can be reminded quickly and conveniently to correct the sensitive words existing in the second page based on the processing alarm information, which improves the processing efficiency of page operation and maintenance and the normal display rate of the second page thereafter.
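A small sketch of this flow; the template placeholders and the notify() transport are assumptions of this illustration rather than part of the described method:

    // alarm template created in advance, e.g. 'Sensitive words {{words}} were found on page {{path}}'.
    function buildProcessingAlarm(alarmTemplate, sensitiveWords, pagePath) {
      return alarmTemplate
        .replace('{{words}}', sensitiveWords.join(', '))
        .replace('{{path}}', pagePath);
    }

    // notify() stands in for the actual mail or SMS channel reached through the
    // communication information of the operation and maintenance personnel.
    async function pushProcessingAlarm(notify, communicationInfo, alarmMessage) {
      await notify(communicationInfo.mailAddress || communicationInfo.phoneNumber, alarmMessage);
    }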
In some optional implementations of this embodiment, the image extracting the second page to obtain image data, and detecting whether there is a violation image in the image data in the second page includes the following steps:
and extracting the image of the second page to obtain the image data in the second page.
In this embodiment, the image detection may be performed on the second page by using an existing image detection algorithm to determine an image in the second page, and then the image in the second page is extracted to obtain the image data in the second page.
And calling a preset image detection model.
In this embodiment, the training generation process of the image detection model includes: acquiring a certain number of sample images collected in advance, and dividing the sample images into a training set and a verification set according to a preset proportion, wherein a sample image refers to an image marked with violating content; training a pre-constructed learning model with the training set to obtain a trained learning model; subsequently verifying the trained learning model with the verification set to obtain a verification result of the trained learning model; and if the verification result meets a preset expected condition, taking the trained learning model as the image detection model. The verification result includes a verification success rate, and the preset expected condition means that the verification success rate is greater than a preset threshold; the value of the preset threshold is not particularly limited and can be set according to actual requirements. In addition, if the verification result does not meet the preset expected condition, that is, if the verification success rate is not greater than the preset threshold, the model parameters of the pre-constructed learning model are further adjusted according to the verification result to optimize the learning model, and the model training step is then re-executed until the verification result meets the preset expected condition. The selection of the learning model is not particularly limited and may be, for example, a learning model based on an object detection algorithm.
And inputting the image into the image detection model, and detecting the violation of the image data through the image detection model to generate a violation detection result corresponding to the image data.
In this embodiment, after an image is input to the image detection model, the image is subjected to violation detection by the image detection model, so as to generate a confidence level for representing that the image belongs to a violation image, that is, the violation detection result. Confidence can be understood as the accuracy with which the image output by the image detection model belongs to the offending image.
And identifying whether a violation image exists in the image data in the second page based on the violation detection result.
In this embodiment, if the violation detection result, that is, the confidence coefficient, is greater than a preset numerical threshold, the image is characterized as belonging to a violation image, and it is determined that a violation image exists in the image data in the second page; if the confidence coefficient is not greater than the preset numerical threshold, the image is characterized as not belonging to a violation image, and it is determined that no violation image exists in the image data in the second page. The value of the numerical threshold is not particularly limited and may be set according to actual use requirements.
The image data in the second page are obtained by carrying out image extraction on the second page; then calling a preset image detection model; inputting the image into the image detection model, and detecting the violations of the image data through the image detection model to generate a violating detection result corresponding to the image data; and then identifying whether a violation image exists in the image data in the second page based on the violation detection result. Before loading the second page corresponding to the page path, the method and the device intelligently use the image detection model to detect the illegal image of the second page, and load and display the second page only when detecting that the illegal image does not exist in the second page, thereby effectively ensuring the data normalization and accuracy of the page to be displayed in the target application and being beneficial to ensuring the page consulting experience of the user.
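The threshold decision on the violation detection result can be sketched as follows; detectImages() stands in for the trained image detection model and the threshold value is illustrative:

    const CONFIDENCE_THRESHOLD = 0.8; // illustrative value, set according to actual use requirements

    // detectImages(imageData) is assumed to return [{ confidence: number }, ...]
    // produced by the image detection model for each extracted image.
    async function hasViolationImage(imageData, detectImages) {
      const results = await detectImages(imageData);
      // an image is treated as a violation image when its confidence exceeds the threshold
      return results.some((result) => result.confidence > CONFIDENCE_THRESHOLD);
    }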
In some optional implementations of this embodiment, after the step of extracting the image of the second page to obtain image data, and detecting whether there is a violation image in the image data of the second page, the electronic device may further execute the following steps:
And if the image data in the second page has the illegal image, acquiring a preset image template.
In this embodiment, the image template may be a blank image or a gray image.
And acquiring the position information of the violation image in the second page.
In this embodiment, the above-mentioned position information may be a coordinate position of the violation image in the second page.
And based on the position information, performing coverage processing on the illegal image in the second page by using the image template to obtain a target image.
In this embodiment, the above-described covering process may refer to covering the offending image with the image template.
And loading the second page.
In this embodiment, the second page to be subsequently loaded is a page subjected to overlay processing on the offending image in the second page using the image template.
After detecting that the illegal image exists in the image data in the second page, acquiring a preset image template; then, acquiring the position information of the violation image in the second page; then, based on the position information, performing coverage processing on the illegal image in the second page by using the image template to obtain a target image; and subsequently loading the second page. After detecting that the illegal image exists in the image data in the second page, the method intelligently uses the preset image template to carry out covering processing on the illegal image in the second page, and then loads the second page subjected to covering processing, so that the problem of poor user experience caused by displaying the page with the bad illegal image can be effectively avoided, the data normalization and accuracy of the page needing to be displayed in the target application are effectively ensured, and the page consulting experience of the user is guaranteed.
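A browser-side sketch of the covering step, assuming the second page's image area is accessible through a canvas and that the position information is a rectangle of the form {x, y, width, height} (both are assumptions of this illustration):

    // Cover the violation image area of the second page with the preset image template.
    function coverViolationImage(canvasContext, templateImage, position) {
      const { x, y, width, height } = position; // position of the violation image in the page
      canvasContext.drawImage(templateImage, x, y, width, height);
    }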
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
It should be emphasized that, to further ensure the privacy and security of the attribute information, the attribute information may also be stored in a node of a blockchain.
The blockchain is a novel application mode of computer technologies such as distributed data storage, point-to-point transmission, consensus mechanism, encryption algorithm and the like. The Blockchain (Blockchain), which is essentially a decentralised database, is a string of data blocks that are generated by cryptographic means in association, each data block containing a batch of information of network transactions for verifying the validity of the information (anti-counterfeiting) and generating the next block. The blockchain may include a blockchain underlying platform, a platform product services layer, an application services layer, and the like.
Those skilled in the art will appreciate that implementing all or part of the above described methods may be accomplished by computer readable instructions stored in a computer readable storage medium that, when executed, may comprise the steps of the embodiments of the methods described above. The storage medium may be a non-volatile storage medium such as a magnetic disk, an optical disk or a Read-Only Memory (ROM), or a Random Access Memory (RAM).
It should be understood that, although the steps in the flowcharts of the figures are shown in order as indicated by the arrows, these steps are not necessarily performed in order as indicated by the arrows. The steps are not strictly limited in order and may be performed in other orders, unless explicitly stated herein. Moreover, at least some of the steps in the flowcharts of the figures may include a plurality of sub-steps or stages that are not necessarily performed at the same time, but may be performed at different times, the order of their execution not necessarily being sequential, but may be performed in turn or alternately with other steps or at least a portion of the other steps or stages.
With further reference to fig. 3, as an implementation of the method shown in fig. 2, the present application provides an embodiment of a cache configuration apparatus, where an embodiment of the apparatus corresponds to the embodiment of the method shown in fig. 2, and the apparatus may be specifically applied to various electronic devices.
As shown in fig. 3, the cache configuration apparatus 300 according to this embodiment includes: a first judging module 301, an installing module 302, a first receiving module 303, an adding module 304, a first processing module 305, a second processing module 306 and a configuring module 307. Wherein:
A first determining module 301, configured to determine whether a cache configuration request for a target application triggered by a user is received;
the installation module 302 is configured to install a third party library in the React project of the target application if yes, and display a preset information configuration page;
a first receiving module 303, configured to receive attribute information, target page information and target skip page information corresponding to the target page information, which are input by the user on the information configuration page;
an adding module 304, configured to add, based on the attribute information, a target attribute corresponding to the attribute information in a routing configuration of the target application;
a first processing module 305, configured to introduce a first target component and a second target component from the third party library;
a second processing module 306, configured to perform wrapping processing on the target application based on the first target component, and perform wrapping processing on the target page information based on the second target component;
a configuration module 307, configured to generate a page clearing rule based on the target skip page information, and configure the clearing rule in the target application.
In this embodiment, the operations performed by the modules or units respectively correspond to the steps of the cache configuration method in the foregoing embodiment one by one, which is not described herein again.
In some optional implementations of this embodiment, the cache configuration apparatus further includes:
the second receiving module is used for receiving a page jump request triggered by a target user in a first page of the target application when the target user accesses the first page;
the recording module is used for recording the page path of the second page which is about to go to by the first page;
the loading module is used for loading a second page corresponding to the page path;
the first acquisition module is used for acquiring the appointed attribute corresponding to the first page based on the clearing rule;
the second acquisition module is used for acquiring the designated jump page information corresponding to the second target component from the designated attribute;
the second judging module is used for judging whether the page path is contained in the appointed jump page information;
and the third processing module is used for calling a preset clearing method to perform clearing cache processing corresponding to the second page if the page path is not included.
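A minimal TypeScript sketch of this jump-time check is given below. The attribute name clearOnLeaveTo, the handler onRouteChange and the preset clearing method dropCache are assumptions made for illustration only; the embodiment does not fix these identifiers.

// Illustrative only; all identifiers are placeholders.
interface RouteMeta {
  clearOnLeaveTo?: string[]; // specified jump page information held in the specified attribute
}

declare function dropCache(pagePath: string): void; // assumed preset clearing method

export function onRouteChange(firstPageMeta: RouteMeta, secondPagePath: string): void {
  const keepFor = firstPageMeta.clearOnLeaveTo ?? [];
  // If the recorded page path is not contained in the specified jump page
  // information, the preset clearing method is called to clear the corresponding cache.
  if (!keepFor.includes(secondPagePath)) {
    dropCache(secondPagePath);
  }
}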
In this embodiment, the operations performed by the modules or units respectively correspond to the steps of the cache configuration method in the foregoing embodiment one by one, which is not described herein again.
In some optional implementations of the present embodiment, the loading module includes:
the first detection sub-module is used for carrying out text extraction on the second page to obtain text data and detecting whether sensitive words exist in the text data in the second page;
the second detection sub-module is used for performing image extraction on the second page to obtain image data if no sensitive word exists, and detecting whether a violation image exists in the image data of the second page;
and the first loading sub-module is used for loading the second page corresponding to the page path if no violation image exists.
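The gate formed by these three sub-modules can be summarised by the following TypeScript sketch; extractText, extractImages, hasSensitiveWords, hasViolationImages and renderPage are hypothetical helpers standing in for the detection and loading steps detailed in the sub-modules below.

// Illustrative only: the second page is loaded only when both checks pass.
declare function extractText(pagePath: string): Promise<string>;
declare function extractImages(pagePath: string): Promise<Uint8Array[]>;
declare function hasSensitiveWords(text: string): boolean;
declare function hasViolationImages(images: Uint8Array[]): Promise<boolean>;
declare function renderPage(pagePath: string): void;

export async function loadSecondPage(pagePath: string): Promise<void> {
  const text = await extractText(pagePath);
  if (hasSensitiveWords(text)) return; // handled by the restriction and alarm flow
  const images = await extractImages(pagePath);
  if (await hasViolationImages(images)) return; // handled by the covering flow
  renderPage(pagePath); // load the second page corresponding to the page path
}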
In this embodiment, the operations performed by the modules or units respectively correspond to the steps of the cache configuration method in the foregoing embodiment one by one, which is not described herein again.
In some optional implementations of this embodiment, the first detection submodule includes:
the first extraction unit is used for extracting the characters of the second page to obtain text data in the second page;
the word segmentation unit is used for carrying out word segmentation processing on the text data to obtain a plurality of corresponding words;
the acquisition unit is used for acquiring a preset sensitive word set;
the judging unit is used for respectively performing matching processing on each word and the sensitive keywords in the sensitive word set, and judging whether a specified word successfully matched with a sensitive keyword exists among all the words;
and the determining unit is used for determining that a sensitive word exists in the text data of the second page if a successfully matched specified word exists, and determining that no sensitive word exists in the text data of the second page if no successfully matched specified word exists.
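By way of illustration, the matching step may be sketched in TypeScript as follows; segment() is an assumed word-segmentation helper (a real project would typically rely on a dedicated segmentation library), and the sensitive word set shown is a placeholder.

// Illustrative sensitive-word check; segment() is an assumed helper.
declare function segment(text: string): string[];

const sensitiveWordSet = new Set<string>(['placeholderWordA', 'placeholderWordB']);

export function hasSensitiveWords(text: string): boolean {
  const words = segment(text); // word segmentation of the extracted text data
  // The page is judged to contain a sensitive word as soon as any segmented
  // word matches a sensitive keyword in the set.
  return words.some((word) => sensitiveWordSet.has(word));
}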
In this embodiment, the operations performed by the modules or units respectively correspond to the steps of the cache configuration method in the foregoing embodiment one by one, which is not described herein again.
In some optional implementations of this embodiment, the loading module further includes:
the first processing sub-module is used for limiting loading the second page if sensitive words exist in the text data in the second page;
the first acquisition sub-module is used for acquiring preset page error information and displaying the page error information in the first page;
a generation sub-module for generating processing alarm information corresponding to the sensitive word;
the second acquisition sub-module is used for acquiring communication information of operation and maintenance personnel;
and the pushing sub-module is used for pushing the processing alarm information to the operation and maintenance personnel based on the communication information.
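Purely as an illustration of this alarm flow, the TypeScript sketch below assumes a hypothetical sendMessage channel and a simple operations contact record; neither the channel nor the message format is prescribed by this embodiment.

// Illustrative alarm push; sendMessage() and OpsContact are placeholders.
interface OpsContact {
  name: string;
  channel: 'email' | 'sms';
  address: string; // communication information of the operation and maintenance personnel
}

declare function sendMessage(channel: string, address: string, body: string): Promise<void>;

export async function pushSensitiveWordAlarm(word: string, contact: OpsContact): Promise<void> {
  const alarm = `Sensitive word "${word}" detected; loading of the second page was restricted.`;
  await sendMessage(contact.channel, contact.address, alarm); // push the processing alarm information
}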
In this embodiment, the operations performed by the modules or units respectively correspond to the steps of the cache configuration method in the foregoing embodiment one by one, which is not described herein again.
In some optional implementations of this embodiment, the second detection sub-module includes:
the second extraction unit is used for extracting the image of the second page to obtain image data in the second page;
the calling unit is used for calling a preset image detection model;
the detection unit is used for inputting the image data into the image detection model, performing violation detection on the image data through the image detection model, and generating a violation detection result corresponding to the image data;
and the identification unit is used for identifying whether the image data in the second page has a violation image or not based on the violation detection result.
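A hedged TypeScript sketch of the detection call follows. The loadDetectionModel function and the result shape are assumptions, since the embodiment does not fix a concrete image detection model or inference interface.

// Illustrative violation-image detection; the model interface is an assumption.
interface ViolationResult {
  isViolation: boolean;
  confidence: number;
}

interface ImageDetectionModel {
  detect(image: Uint8Array): Promise<ViolationResult>;
}

declare function loadDetectionModel(): Promise<ImageDetectionModel>; // assumed preset model loader

export async function hasViolationImages(images: Uint8Array[]): Promise<boolean> {
  const model = await loadDetectionModel();
  for (const image of images) {
    const result = await model.detect(image); // violation detection result for this image
    if (result.isViolation) return true;
  }
  return false;
}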
In this embodiment, the operations performed by the modules or units respectively correspond to the steps of the cache configuration method in the foregoing embodiment one by one, which is not described herein again.
In some optional implementations of this embodiment, the loading module further includes:
A third obtaining sub-module, configured to obtain a preset image template if the image data in the second page has a violation image;
a fourth obtaining sub-module, configured to obtain location information of the violation image in the second page;
the second processing sub-module is used for covering the violation image in the second page with the image template based on the position information to obtain a target image;
and the second loading sub-module is used for loading the second page.
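The covering step can be illustrated with the browser canvas API as follows; the template URL and the position record are placeholders, and a real implementation might equally perform the covering on the server side.

// Illustrative cover-up of a violation image region with a preset image template.
interface Position {
  x: number; // left offset of the violation image within the rendered page
  y: number; // top offset
  width: number;
  height: number;
}

export async function coverViolation(
  pageCanvas: HTMLCanvasElement,
  templateUrl: string, // placeholder path of the preset image template
  pos: Position,
): Promise<void> {
  const template = new Image();
  template.src = templateUrl;
  await template.decode(); // wait until the template image is ready to draw
  const ctx = pageCanvas.getContext('2d');
  if (!ctx) return;
  // Draw the template over the violation region so that the page shows the target image.
  ctx.drawImage(template, pos.x, pos.y, pos.width, pos.height);
}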
In this embodiment, the operations performed by the modules or units respectively correspond to the steps of the cache configuration method in the foregoing embodiment one by one, which is not described herein again.
In order to solve the technical problems, the embodiment of the application also provides computer equipment. Referring specifically to fig. 4, fig. 4 is a basic structural block diagram of a computer device according to the present embodiment.
The computer device 4 comprises a memory 41, a processor 42, and a network interface 43 communicatively connected to each other via a system bus. It should be noted that only the computer device 4 having the components 41-43 is shown in the figure, but it should be understood that not all of the illustrated components are required to be implemented, and that more or fewer components may be implemented instead. It will be appreciated by those skilled in the art that the computer device here is a device capable of automatically performing numerical calculation and/or information processing in accordance with preset or stored instructions, and its hardware includes, but is not limited to, microprocessors, application-specific integrated circuits (Application Specific Integrated Circuit, ASIC), field-programmable gate arrays (Field-Programmable Gate Array, FPGA), digital signal processors (Digital Signal Processor, DSP), embedded devices, and the like.
The computer equipment can be a desktop computer, a notebook computer, a palm computer, a cloud server and other computing equipment. The computer equipment can perform man-machine interaction with a user through a keyboard, a mouse, a remote controller, a touch pad or voice control equipment and the like.
The memory 41 includes at least one type of readable storage medium, including flash memory, hard disk, multimedia card, card-type memory (e.g., SD or DX memory), Random Access Memory (RAM), Static Random Access Memory (SRAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Programmable Read-Only Memory (PROM), magnetic memory, magnetic disk, optical disk, and the like. In some embodiments, the memory 41 may be an internal storage unit of the computer device 4, such as a hard disk or an internal memory of the computer device 4. In other embodiments, the memory 41 may also be an external storage device of the computer device 4, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card provided on the computer device 4. Of course, the memory 41 may also comprise both an internal storage unit of the computer device 4 and an external storage device. In this embodiment, the memory 41 is generally used to store the operating system and various application software installed on the computer device 4, such as the computer-readable instructions of the cache configuration method. In addition, the memory 41 may be used to temporarily store various types of data that have been output or are to be output.
The processor 42 may be a central processing unit (Central Processing Unit, CPU), controller, microcontroller, microprocessor, or other data processing chip in some embodiments. The processor 42 is typically used to control the overall operation of the computer device 4. In this embodiment, the processor 42 is configured to execute computer readable instructions stored in the memory 41 or process data, for example, computer readable instructions for executing the cache configuration method.
The network interface 43 may comprise a wireless network interface or a wired network interface, which network interface 43 is typically used for establishing a communication connection between the computer device 4 and other electronic devices.
Compared with the prior art, the embodiment of the application has the following main beneficial effects:
In the cache configuration method provided by the embodiment of the application, a target attribute is added to the routing configuration of the target application according to the attribute information, the target page information and the target jump page information input by the user on the information configuration page, and the target components are introduced from the third-party library; by wrapping the target application and the pages that need to be cached with the introduced components, the page caching function of the target application can be realized without writing corresponding storage code for each page of the target application that needs to be cached, which effectively reduces the workload of page cache configuration and improves the processing efficiency of page cache configuration. In addition, a page clearing rule is generated based on the target jump page information and the clearing rule is configured in the target application, so that the page cache of the target application is optimized according to the personalized requirements of the user, the space occupied by the page cache is reduced, and the intelligence of page cache processing is improved.
The present application also provides another embodiment, namely a computer-readable storage medium storing computer-readable instructions executable by at least one processor to cause the at least one processor to perform the steps of the cache configuration method described above.
From the above description of the embodiments, it will be clear to those skilled in the art that the method of the above embodiments may be implemented by means of software plus a necessary general-purpose hardware platform, or of course by hardware, although in many cases the former is the preferred implementation. Based on such an understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (such as ROM/RAM, magnetic disk, or optical disk) comprising instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the method according to the embodiments of the present application.
It is apparent that the embodiments described above are only some, but not all, of the embodiments of the present application; the preferred embodiments are shown in the drawings, which do not limit the scope of the claims. This application may be embodied in many different forms; these embodiments are provided so that the disclosure of the present application will be thorough and complete. Although the application has been described in detail with reference to the foregoing embodiments, those skilled in the art may still modify the technical solutions described in the foregoing embodiments or substitute equivalents for some of their features. Any equivalent structure made using the content of the specification and the drawings of the present application, whether applied directly or indirectly in other related technical fields, likewise falls within the scope of protection of the application.

Claims (10)

1. A cache configuration method, characterized by comprising the following steps:
judging whether a cache configuration request for a target application triggered by a user is received or not;
if yes, installing a third-party library in a React project of the target application, and displaying a preset information configuration page;
receiving attribute information, target page information and target jump page information corresponding to the target page information that are input by the user on the information configuration page;
adding a target attribute corresponding to the attribute information in the routing configuration of the target application based on the attribute information;
introducing a first target component and a second target component into the third party library;
performing wrapping processing on the target application based on the first target component, and performing wrapping processing on the target page information based on the second target component;
and generating a page clearing rule based on the target jump page information, and configuring the clearing rule in the target application.
2. The cache configuration method according to claim 1, further comprising, after the step of generating a page clearing rule based on the target jump page information and configuring the clearing rule in the target application:
When a target user accesses a first page of the target application, receiving a page jump request triggered by the target user in the first page;
recording a page path of a second page to which the first page is going;
loading a second page corresponding to the page path;
acquiring a specified attribute corresponding to the first page based on the clearing rule;
acquiring specified jump page information corresponding to the second target component from the specified attribute;
judging whether the page path is contained in the specified jump page information;
and if the page path is not included, calling a preset clearing method to perform clearing cache processing corresponding to the second page.
3. The cache configuration method according to claim 2, wherein the step of loading the second page corresponding to the page path specifically includes:
performing text extraction on the second page to obtain text data, and detecting whether a sensitive word exists in the text data of the second page;
if no sensitive word exists, performing image extraction on the second page to obtain image data, and detecting whether a violation image exists in the image data of the second page;
and if no violation image exists, loading the second page corresponding to the page path.
4. The cache configuration method according to claim 3, wherein the step of performing text extraction on the second page to obtain text data and detecting whether there is a sensitive word in the text data in the second page specifically includes:
performing text extraction on the second page to obtain text data in the second page;
performing word segmentation processing on the text data to obtain a plurality of corresponding words;
acquiring a preset sensitive word set;
respectively carrying out matching processing on each word and the sensitive keywords in the sensitive word set, and judging whether specified words successfully matched with the sensitive keywords exist in all the words;
if a successfully matched specified word exists, judging that a sensitive word exists in the text data of the second page; otherwise, judging that no sensitive word exists in the text data of the second page.
5. The cache configuration method according to claim 3, further comprising, after the step of performing text extraction on the second page to obtain text data and detecting whether there is a sensitive word in the text data in the second page:
if a sensitive word exists in the text data of the second page, restricting loading of the second page;
acquiring preset page error information, and displaying the page error information in the first page;
generating processing alarm information corresponding to the sensitive word;
acquiring communication information of operation and maintenance personnel;
and pushing the processing alarm information to the operation and maintenance personnel based on the communication information.
6. The cache configuration method according to claim 3, wherein the step of extracting the image of the second page to obtain image data and detecting whether there is a violation image in the image data of the second page specifically includes:
extracting an image of the second page to obtain image data in the second page;
calling a preset image detection model;
inputting the image data into the image detection model, and performing violation detection on the image data through the image detection model to generate a violation detection result corresponding to the image data;
and identifying whether a violation image exists in the image data in the second page based on the violation detection result.
7. The cache configuration method according to claim 3, further comprising, after the step of extracting the image of the second page to obtain image data and detecting whether there is a violation image in the image data of the second page:
if a violation image exists in the image data of the second page, acquiring a preset image template;
acquiring the position information of the violation image in the second page;
based on the position information, covering the violation image in the second page with the image template to obtain a target image;
and loading the second page.
8. A cache configuration apparatus, comprising:
the first judging module is used for judging whether a cache configuration request for a target application triggered by a user is received or not;
the installation module is used for installing a third-party library in a React project of the target application if yes, and displaying a preset information configuration page;
the first receiving module is used for receiving attribute information, target page information and target jump page information corresponding to the target page information that are input by the user on the information configuration page;
an adding module, configured to add, based on the attribute information, a target attribute corresponding to the attribute information in a routing configuration of the target application;
the first processing module is used for introducing a first target component and a second target component into the third party library;
the second processing module is used for performing wrapping processing on the target application based on the first target component and performing wrapping processing on the target page information based on the second target component;
and the configuration module is used for generating a page clearing rule based on the target jump page information and configuring the clearing rule in the target application.
9. A computer device comprising a memory having stored therein computer-readable instructions which, when executed by a processor, implement the steps of the cache configuration method of any one of claims 1 to 7.
10. A computer-readable storage medium having stored thereon computer-readable instructions which, when executed by a processor, implement the steps of the cache configuration method according to any one of claims 1 to 7.
CN202311038604.0A 2023-08-16 2023-08-16 Cache configuration method, device, computer equipment and storage medium Pending CN117056629A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311038604.0A CN117056629A (en) 2023-08-16 2023-08-16 Cache configuration method, device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311038604.0A CN117056629A (en) 2023-08-16 2023-08-16 Cache configuration method, device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117056629A true CN117056629A (en) 2023-11-14

Family

ID=88664049

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311038604.0A Pending CN117056629A (en) 2023-08-16 2023-08-16 Cache configuration method, device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117056629A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination