CN116775186A - Page data processing method and device, computer equipment and storage medium - Google Patents

Info

Publication number
CN116775186A
Authority
CN
China
Prior art keywords
data
application program
cache
page
application
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310695643.1A
Other languages
Chinese (zh)
Inventor
庄志辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Property and Casualty Insurance Company of China Ltd
Original Assignee
Ping An Property and Casualty Insurance Company of China Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Property and Casualty Insurance Company of China Ltd filed Critical Ping An Property and Casualty Insurance Company of China Ltd
Priority to CN202310695643.1A priority Critical patent/CN116775186A/en
Publication of CN116775186A publication Critical patent/CN116775186A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/451 - Execution arrangements for user interfaces
    • G06F 16/215 - Improving data quality; data cleansing, e.g. de-duplication, removing invalid entries or correcting typographical errors
    • G06F 16/24552 - Database cache management
    • G06F 16/252 - Integrating or interfacing systems involving database management systems, between a database management system and a front-end application
    • G06F 16/9574 - Browsing optimisation of access to content, e.g. by caching
    • G06F 3/04842 - Selection of displayed objects or displayed text elements

Abstract

The embodiments of the present application belong to the field of big data and the field of financial technology, and relate to a page data processing method comprising the following steps: if a click operation triggered by a user on target data in a first-level page of an application program is received, acquiring a detail link corresponding to the target data and adding a cache parameter to the detail link; calling a back-end interface service when the detail page corresponding to the detail link is loaded; querying, through the back-end interface service, the front-end route contained in the cache parameter to obtain cache list data corresponding to the front-end route; and processing the cache list data accordingly based on the operation information contained in the cache parameter. The application also provides a page data processing device, a computer device, and a storage medium. In addition, the application relates to blockchain technology, and the cache list data can be stored in a blockchain. The application can be applied to financial-application data display scenarios in the financial field and effectively improves the accuracy of the data displayed on the pages of an application program.

Description

Page data processing method and device, computer equipment and storage medium
Technical Field
The present application relates to the field of big data technologies and financial technologies, and in particular, to a method and apparatus for processing page data, a computer device, and a storage medium.
Background
Applications developed by finance and technology companies usually need to display business data of a store, such as the store's monthly production-value data, including premium, push-to-repair, and order figures; such data is typically calculated by scheduled big-data jobs and then exposed to the business system through an interface. This data is usually shown up front on the first-level page of the application program, i.e. on its home page, and clicking a figure penetrates to the corresponding data detail page. Because the home-page figures are aggregated from the cached data of different modules, a cache-synchronization problem arises: the data shown by the different home-page modules may differ from the data shown on the data detail page. As a result, the accuracy of the data displayed on the application's pages cannot be guaranteed, users of the application come to doubt and be confused by the data accuracy, and the user experience is poor.
Disclosure of Invention
The embodiments of the present application aim to provide a page data processing method, apparatus, computer device, and storage medium, so as to solve the technical problems that the page display of existing application programs suffers from cache-synchronization issues, that the accuracy of the data displayed on application pages cannot be guaranteed, and that users consequently doubt and are confused by the data accuracy and have a poor user experience.
In order to solve the above technical problems, an embodiment of the present application provides a page data processing method, which adopts the following technical solution:
judging whether a click operation, triggered by a user, on target data in a primary page of an application program is received or not;
if yes, acquiring a detail link corresponding to the target data, and adding a preset cache parameter into the detail link;
calling a preset back-end interface service when loading a detail page corresponding to the detail link;
querying the front-end route contained in the cache parameter based on the back-end interface service to obtain cache list data corresponding to the front-end route;
and correspondingly processing the cache list data based on the operation information contained in the cache parameters.
Further, the step of querying the front-end route included in the cache parameter based on the back-end interface service to obtain cache list data corresponding to the front-end route specifically includes:
acquiring the front-end route from the cache parameter;
acquiring a preset configuration table;
and inquiring the configuration table according to the front-end route through the back-end interface service, and inquiring the cache list data corresponding to the front-end route from the configuration table.
Further, the page data processing method further includes:
acquiring the document data size of the application program;
judging whether the document data size is larger than a preset threshold value;
if the document data size is larger than the preset threshold value, acquiring the use information of the user corresponding to the application program;
judging whether the application program is a common application of the user based on the use information;
if the application is the common application of the user, judging whether a cache cleaning processing operation triggered by the user and corresponding to the application program exists or not based on the use information;
if the cache cleaning processing operation exists, acquiring the times of the cache cleaning processing operation, and judging whether the times are larger than a preset times threshold;
if the number of times is larger than the times threshold, determining data to be cleaned from the document data of the application program;
and cleaning the data to be cleaned.
Further, the step of obtaining the document data size of the application program specifically includes:
acquiring the application size of the application program;
acquiring an occupied storage space corresponding to the application program;
and generating the document data size of the application program based on the occupied storage space and the application size.
Further, the step of determining the data to be cleaned from the document data of the application program specifically includes:
acquiring a data type of cleaning data corresponding to the cache cleaning process;
acquiring specified data corresponding to the data type from the document data of the application program;
and taking the specified data as the data to be cleaned.
Further, after the step of determining whether the application program is a commonly used application of the user based on the usage information, the method further includes:
if the application program is not the common application of the user, acquiring the ratio of occupied storage space corresponding to the application program;
judging whether the ratio of the occupied storage space is larger than a preset ratio threshold value;
if yes, clearing the document data of the application program.
Further, after the step of determining whether there is the cache cleaning processing operation triggered by the user and corresponding to the application program based on the usage information, the method further includes:
if the cache cleaning processing operation triggered by the user and corresponding to the application program does not exist, acquiring the application name of the application program;
generating capacity early warning information corresponding to the application program based on the application name and the document data size;
and displaying the capacity early warning information.
In order to solve the above technical problems, the embodiment of the present application further provides a page data processing device, which adopts the following technical solution:
the first judging module is used for judging whether clicking operation of target data in a primary page of the application program triggered by a user is received or not;
the first acquisition module is used for acquiring detail links corresponding to the target data if yes, and adding preset cache parameters into the detail links;
the calling module is used for calling a preset back-end interface service when loading the detail page corresponding to the detail link;
the query module is used for carrying out query processing on the front-end route contained in the cache parameter based on the back-end interface service, so as to obtain cache list data corresponding to the front-end route;
and the processing module is used for correspondingly processing the cache list data based on the operation information contained in the cache parameters.
In order to solve the above technical problems, the embodiment of the present application further provides a computer device, which adopts the following technical solution:
judging whether a click operation, triggered by a user, on target data in a primary page of an application program is received or not;
if yes, acquiring a detail link corresponding to the target data, and adding a preset cache parameter into the detail link;
calling a preset back-end interface service when loading a detail page corresponding to the detail link;
querying the front-end route contained in the cache parameter based on the back-end interface service to obtain cache list data corresponding to the front-end route;
and correspondingly processing the cache list data based on the operation information contained in the cache parameters.
In order to solve the above technical problems, an embodiment of the present application further provides a computer readable storage medium, which adopts the following technical solution:
judging whether a click operation, triggered by a user, on target data in a primary page of an application program is received or not;
if yes, acquiring a detail link corresponding to the target data, and adding a preset cache parameter into the detail link;
calling a preset back-end interface service when loading a detail page corresponding to the detail link;
querying the front-end route contained in the cache parameter based on the back-end interface service to obtain cache list data corresponding to the front-end route;
and correspondingly processing the cache list data based on the operation information contained in the cache parameters.
Compared with the prior art, the embodiment of the application has the following main beneficial effects:
in the embodiment of the application, whether a click operation triggered by a user on target data in a primary page of an application program is received is first judged; if so, a detail link corresponding to the target data is acquired and a preset cache parameter is added to the detail link; a preset back-end interface service is then called when the detail page corresponding to the detail link is loaded; the front-end route contained in the cache parameter is subsequently queried through the back-end interface service to obtain cache list data corresponding to the front-end route; and finally the cache list data is processed accordingly based on the operation information contained in the cache parameter. In the embodiment of the application, when a user clicks target data in the first-level page of an application program to perform data penetration, a preset cache parameter is added to the detail link of the target data, and the back-end interface service is then used, according to the front-end route of the home-page data, to query the home-page cache that needs to be deleted and to process it accordingly, so that the home-page data cache is reloaded when the user returns to the home page. This keeps the data displayed on the home page consistent with the data displayed on the detail page, effectively improves the accuracy of the data displayed on the application's pages, solves the problem that the detail page reached through data penetration and the home page display inconsistent data, and helps improve the user experience.
Drawings
In order to more clearly illustrate the solution of the present application, the drawings required for describing the embodiments of the present application are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present application, and that a person of ordinary skill in the art can obtain other drawings from them without inventive effort.
FIG. 1 is an exemplary system architecture diagram in which the present application may be applied;
FIG. 2 is a flow chart of one embodiment of a page data processing method according to the present application;
FIG. 3 is a schematic diagram of one embodiment of a page data processing apparatus in accordance with the present application;
FIG. 4 is a schematic structural diagram of one embodiment of a computer device in accordance with the present application.
Detailed Description
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs; the terminology used in the description of the applications herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application; the terms "comprising" and "having" and any variations thereof in the description of the application and the claims and the description of the drawings above are intended to cover a non-exclusive inclusion. The terms first, second and the like in the description and in the claims or in the above-described figures, are used for distinguishing between different objects and not necessarily for describing a sequential or chronological order.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments.
In order to make the person skilled in the art better understand the solution of the present application, the technical solution of the embodiment of the present application will be clearly and completely described below with reference to the accompanying drawings.
As shown in fig. 1, a system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 is used as a medium to provide communication links between the terminal devices 101, 102, 103 and the server 105. The network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others.
The user may interact with the server 105 via the network 104 using the terminal devices 101, 102, 103 to receive or send messages or the like. Various communication client applications, such as a web browser application, a shopping class application, a search class application, an instant messaging tool, a mailbox client, social platform software, etc., may be installed on the terminal devices 101, 102, 103.
The terminal devices 101, 102, 103 may be various electronic devices having a display screen and supporting web browsing, including but not limited to smartphones, tablet computers, electronic book readers, MP3 (Moving Picture Experts Group Audio Layer III) players, MP4 (Moving Picture Experts Group Audio Layer IV) players, laptop computers, desktop computers, and the like.
The server 105 may be a server providing various services, such as a background server providing support for pages displayed on the terminal devices 101, 102, 103.
It should be noted that, the page data processing method provided by the embodiment of the present application is generally executed by a server/terminal device, and accordingly, the page data processing apparatus is generally disposed in the server/terminal device.
It should be understood that the number of terminal devices, networks and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
With continued reference to FIG. 2, a flow chart of one embodiment of a page data processing method in accordance with the present application is shown. The order of the steps in the flowchart may be changed and some steps may be omitted according to various needs. The page data processing method provided by the embodiment of the application can be applied to any scene needing page data display, and can be applied to products of the scenes, for example, application page data display of insurance application in the field of financial insurance. The page data processing method comprises the following steps:
Step S201, determining whether a click operation for target data in a primary page of an application program triggered by a user is received.
In this embodiment, the electronic device (for example, the server/terminal device shown in fig. 1) on which the page data processing method runs may receive, through a wired or wireless connection, the click operation triggered by the user on the target data in the primary page of the application program. It should be noted that the wireless connection may include, but is not limited to, a 3G/4G/5G connection, a WiFi connection, a Bluetooth connection, a WiMAX connection, a Zigbee connection, a UWB (ultra-wideband) connection, and other wireless connections now known or developed in the future. The first-level page of the application program refers to a front page of the application program, such as a workbench or "My" data presentation page. When a user accesses the first-level page of the application program, the data read is cache data aggregated from different modules, for example the store's month-to-date (up to the previous day) push-to-repair production value, order quantity and premium shown on the home page, each with its period-over-period ratio; reading the cache data aggregated from the different modules keeps home-page access smooth. In addition, the target data is the data whose corresponding details the user needs to click to view.
Step S202, if yes, acquiring a detail link corresponding to the target data, and adding a preset cache parameter in the detail link.
In this embodiment, after the user triggers the click operation on the target data in the primary page of the application program, the data penetrates to the corresponding data detail page, and the link of that data detail page is the detail link. The cache parameter includes a front-end route and operation information. In addition, since the cache of the first-level page is cache data obtained by aggregating the data of each module, and the detail page may also have its own per-module cache, data differences are possible; that is, the data displayed on the home page and the data on the detail page may not be the same.
Step S203, when loading the detail page corresponding to the detail link, calling a preset back-end interface service.
In this embodiment, when the user clicks the target data to perform data penetration, a cache parameter (for example, &redis=delete) containing the front-end route and the operation information is added to the detail link accessed by the front end, so that the detail link carries the redis identifier. That is, when the front-end H5 page is loaded, the back-end interface service is asynchronously invoked and the front-end route and the corresponding redis operation are passed in. The call to the back-end interface service is an asynchronous response; the interface is not required to return a result.
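The following TypeScript sketch shows one possible front-end shape of this step under stated assumptions: the parameter names route and redis, the endpoint /cache/refresh, and the function names are hypothetical, and the sketch only illustrates adding the cache parameter to the detail link and firing an asynchronous, non-blocking call to the back-end interface service when the detail page loads.

```typescript
// Hypothetical front-end sketch (browser context): add the cache parameter to the
// detail link, then asynchronously notify the back-end interface service on page load.

function buildDetailLink(detailUrl: string, frontEndRoute: string): string {
  const url = new URL(detailUrl, window.location.origin);
  url.searchParams.set("route", frontEndRoute); // front-end route of the home-page data
  url.searchParams.set("redis", "delete");      // redis operation carried by the link
  return url.toString();
}

// Fire-and-forget call made while the detail (H5) page is loading; the result is not
// awaited, matching the asynchronous-response behaviour described above.
function notifyCacheService(frontEndRoute: string, operation: string): void {
  fetch(`/cache/refresh?route=${encodeURIComponent(frontEndRoute)}&op=${operation}`)
    .catch(() => { /* failures must not block rendering of the detail page */ });
}
```

A caller might, for example, invoke notifyCacheService("/workbench/summary", "delete") once the detail page has mounted; the route string here is purely illustrative.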
Step S204, performing query processing on the front-end route included in the cache parameter based on the back-end interface service, to obtain cache list data corresponding to the front-end route.
In this embodiment, the specific implementation process of querying the front-end route included in the cache parameter based on the back-end interface service to obtain the cache list data corresponding to the front-end route will be described in further detail in the following specific embodiments and is not repeated here.
Step S205, performing corresponding processing on the cache list data based on the operation information included in the cache parameter.
In this embodiment, the operation information is specifically a delete operation; that is, when the user accesses the detail page of the target data, the back-end interface service is used, according to the front-end route of the home-page data, to query the configuration table for the home-page cache to be deleted and to process it accordingly, so that the home-page data cache is reloaded when the user returns to the home page, thereby ensuring consistency between the data displayed on the home page and the data displayed on the detail page.
In the above process, whether a click operation triggered by a user on target data in a primary page of an application program is received is first judged; if so, a detail link corresponding to the target data is acquired and a preset cache parameter is added to the detail link; a preset back-end interface service is then called when the detail page corresponding to the detail link is loaded; the front-end route contained in the cache parameter is subsequently queried through the back-end interface service to obtain cache list data corresponding to the front-end route; and finally the cache list data is processed accordingly based on the operation information contained in the cache parameter. According to the application, when a user clicks target data in the first-level page of an application program to perform data penetration, a preset cache parameter is added to the detail link of the target data, and the back-end interface service is then used, according to the front-end route of the home-page data, to query the home-page cache to be deleted and to process it accordingly, so that the home-page data cache is reloaded when the user returns to the home page. This keeps the data displayed on the home page consistent with the data displayed on the detail page, effectively improves the accuracy of the data displayed on the application's pages, solves the problem that the detail page reached through data penetration and the home page display inconsistent data, and helps improve the user experience.
In some alternative implementations, step S204 includes the steps of:
and acquiring the front-end route from the cache parameter.
In this embodiment, the cache parameter includes a front-end route and operation information, and the front-end route may be extracted from the cache parameter.
And acquiring a preset configuration table.
In this embodiment, the configuration table is a data table created in advance and storing a plurality of routes and a cache key list corresponding to the plurality of routes.
And inquiring the configuration table according to the front-end route through the back-end interface service, and inquiring the cache list data corresponding to the front-end route from the configuration table.
In this embodiment, the back-end interface service queries the configuration table according to the front-end route contained in the cache parameter, i.e. follows the route -> cache key list mapping, and then performs the corresponding operation (for example, deletion, renewal, or creation) on the cache key list corresponding to the first-level page according to the redis operation contained in the cache parameter. Specifically, when the user clicks target data in the first-level page to perform data penetration, the front-end detail page carries the identifier of the asynchronous back-end redis cache operation; as a result, the user only reads the cache when accessing the first-level page of the application program, and after the detail page is clicked, the corresponding data cache is found according to the front-end route and then deleted, thereby achieving the cache refreshing, renewal, or deletion operation.
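A minimal back-end sketch of this lookup is shown below, assuming a Node-style service using the node-redis client; the configuration-table contents, route strings, and cache key names are purely illustrative and are not taken from the patent.

```typescript
import { createClient } from "redis";

// Hypothetical configuration table: front-end route -> cache key list of the first-level page.
const configTable: Record<string, string[]> = {
  "/workbench/summary": [
    "store:premium:monthly",
    "store:orders:monthly",
    "store:push-repair:monthly",
  ],
};

const redis = createClient();

// Look up the cache key list for the given front-end route and apply the requested operation.
async function processCacheList(frontEndRoute: string, operation: string): Promise<void> {
  if (!redis.isOpen) await redis.connect();
  const cacheKeys = configTable[frontEndRoute] ?? [];
  if (cacheKeys.length === 0) return;
  if (operation === "delete") {
    // Deleting the keys forces the first-level page to rebuild its cache on the next visit.
    await redis.del(cacheKeys);
  }
  // Renewal or creation operations would be handled analogously.
}
```

Because the front end does not wait for a result, a handler like this can run after the HTTP response has already been sent.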
In the above process, the front-end route is first obtained from the cache parameter; a preset configuration table is then acquired; and the configuration table is subsequently queried according to the front-end route through the back-end interface service to find the cache list data corresponding to the front-end route. After the front-end route is obtained from the cache parameter, calling the back-end interface service allows the cache list data corresponding to the front-end route to be queried quickly and accurately from the configuration table associated with the application program's cache.
In some optional implementations of this embodiment, the electronic device may further perform the following steps:
and acquiring the manuscript data size of the application program.
In this embodiment, the above specific implementation process of obtaining the document data size of the application program will be described in further detail in the following specific embodiments, which will not be described herein.
And judging whether the document data size is larger than a preset threshold value.
In this embodiment, the value of the preset threshold is not specifically limited and may be set according to actual use requirements. If the document data size of the application program is too large, for example greater than the preset threshold, this indicates that the application program occupies a large amount of space in the electronic device, which may affect the normal operation of the application program, for example causing it to lag.
And if the document data size is larger than the preset threshold value, acquiring the use information of the user corresponding to the application program.
In this embodiment, event-tracking (buried-point) instrumentation may be applied to the application program in advance so as to collect the user's usage information for the application program. The usage information at least includes the frequency with which the user uses the application program and the cleaning information of the application program, such as the cleaned content and the cleaning frequency.
And judging whether the application program is a common application of the user based on the use information.
In this embodiment, if the number of times the user uses the application program in the preset period of time is greater than the preset number of times threshold, the application program is determined to be a commonly used application of the user. The value of the preset frequency threshold is not particularly limited, and can be set according to actual use requirements.
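A small sketch of this frequency check is given below; it assumes the usage information records the timestamp of each launch and that the period and count threshold are supplied by the caller, none of which is prescribed by the patent.

```typescript
interface UsageInfo {
  launchTimestamps: number[]; // epoch milliseconds of each launch (assumed field)
}

// An application counts as "commonly used" when it was launched more than
// `countThreshold` times within the preceding `periodMs` milliseconds.
function isCommonlyUsed(usage: UsageInfo, periodMs: number, countThreshold: number): boolean {
  const since = Date.now() - periodMs;
  const launches = usage.launchTimestamps.filter((t) => t >= since).length;
  return launches > countThreshold;
}

// Example: more than 10 launches in the last 7 days.
// isCommonlyUsed(usage, 7 * 24 * 60 * 60 * 1000, 10)
```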
And if the application program is a common application of the user, judging, based on the use information, whether there is a cache cleaning processing operation triggered by the user and corresponding to the application program.
If the cache cleaning processing operation exists, acquiring the times of the cache cleaning processing operation, and judging whether the times are larger than a preset times threshold.
In this embodiment, the value of the frequency threshold is not particularly limited, and may be set according to actual use requirements.
And if the number of times is larger than the times threshold, determining the data to be cleaned from the document data of the application program.
In this embodiment, the specific implementation process of determining the data to be cleaned from the document data of the application program is described in further detail in the following specific embodiments, which will not be described herein.
And cleaning the data to be cleaned.
In the present embodiment, the processing time of the cleaning processing of the data to be cleaned is not limited, and for example, an immediate cleaning manner, a manner of processing in accordance with a prescribed cleaning time, or the like may be employed.
In the application, the application program is cleaned intelligently according to the document data size of the application program, the user's usage information, and the cache cleaning processing operations triggered by the user for the application program. This keeps the application program lean, ensures the normal operation of the electronic device and the application program, spares the user from having to manually delete the cached data in the application program, and improves the user experience.
In some optional implementations, the step of acquiring the document data size of the application program includes the following steps:
and acquiring the application size of the application program.
In this embodiment, the application size of the application program may be obtained from the storage space information by querying the storage space information of the electronic device.
And acquiring occupied storage space corresponding to the application program.
In this embodiment, the occupied storage space refers to the storage space of the electronic device that is occupied by the application program.
And generating the document data size of the application program based on the occupied storage space and the application size.
In this embodiment, the difference between the occupied storage space and the application size may be calculated, and the difference may be used as the document data size of the application program.
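As a one-line computation, this step might look like the following sketch; the byte units and the clamping to zero are assumptions.

```typescript
// Document data size = storage space occupied by the application - installed application size.
function documentDataSize(occupiedBytes: number, applicationBytes: number): number {
  return Math.max(0, occupiedBytes - applicationBytes);
}

// Example: documentDataSize(1_200_000_000, 300_000_000) === 900_000_000 (900 MB of document data).
```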
In the above process, the application size of the application program is first obtained; the occupied storage space corresponding to the application program is then acquired; and the document data size of the application program is generated based on the occupied storage space and the application size. In this way, the document data size of the application program can be generated quickly and accurately through a simple calculation based on the application size and the occupied storage space.
In some optional implementations, the step of determining the data to be cleaned from the document data of the application program includes the following steps:
and acquiring the data type of the cleaning data corresponding to the cache cleaning process.
In this embodiment, the types of cache data that can be cleaned up generally may include data such as pictures, video, passwords, etc. The data type of the cleaning data corresponding to the cache cleaning process may refer to the type of the cache data deleted by the user according to personal preference.
And acquiring the specified data corresponding to the data type from the document data of the application program.
In this embodiment, after the data type of the cleaning data corresponding to the cache cleaning process is determined, the specified data may be obtained by filtering out the data matching that data type from the document data of the application program.
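The selection of the data to be cleaned can be sketched as a simple filter over the application's document data; the item shape and type labels below are hypothetical.

```typescript
interface DocumentItem {
  type: string;      // e.g. "picture", "video", "password" (assumed labels)
  sizeBytes: number;
  path: string;
}

// Keep only the items whose type matches the type the user previously chose to clean,
// as inferred from the cache-cleaning history in the usage information.
function selectDataToClean(documentData: DocumentItem[], cleanedType: string): DocumentItem[] {
  return documentData.filter((item) => item.type === cleanedType);
}
```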
And taking the specified data as the data to be cleaned.
In the above process, the application first obtains the data type of the cleaning data corresponding to the cache cleaning processing, then acquires the specified data corresponding to that data type from the document data of the application program, and finally takes the specified data as the data to be cleaned. By intelligently learning which type of cached data the user deletes according to personal preference, only the specified data of that type is deleted automatically when the application program later needs to be cleaned, so the deleted cache data matches the user's personal usage habits; this effectively improves the intelligence of the cleaning and the user experience.
In some optional implementations of this embodiment, after the step of determining whether the application program is a commonly used application of the user based on the usage information, the electronic device may further perform the following steps:
and if the application program is not the common application of the user, acquiring the ratio of the occupied storage space corresponding to the application program.
In this embodiment, the ratio of the occupied storage space of the application program to the total storage capacity of the electronic device may be calculated to obtain the ratio of the occupied storage space.
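Expressed as code, the occupancy-ratio check is a single division followed by a threshold comparison; the 10% threshold in the example is an arbitrary illustration.

```typescript
// Ratio of the storage space occupied by the application to the device's total capacity.
function occupancyRatio(occupiedBytes: number, totalCapacityBytes: number): number {
  return occupiedBytes / totalCapacityBytes;
}

// Example: 8 GB occupied on a 64 GB device gives 0.125, which exceeds a 0.1 threshold.
const shouldClear = occupancyRatio(8e9, 64e9) > 0.1; // true
```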
And judging whether the ratio of the occupied storage space is larger than a preset ratio threshold value or not.
In this embodiment, the value of the ratio threshold is not specifically limited and may be set according to actual use requirements. If the ratio of the occupied storage space of the application program is too large, for example greater than the ratio threshold, this indicates that the application program occupies a large amount of space in the electronic device, which may affect the normal operation of the electronic device and the application program, for example causing the application program to lag.
If yes, the document data of the application program is cleared.
In the present embodiment, the processing time of the cleaning processing of the data to be cleaned is not limited, and for example, an immediate cleaning manner, a manner of processing in accordance with a prescribed cleaning time, or the like may be employed.
In the above process, if it is detected that the application program is not a common application of the user, the ratio of the occupied storage space corresponding to the application program is acquired; it is then judged whether this ratio is larger than the preset ratio threshold; and if so, the document data of the application program is cleared. By comparing the occupied-storage-space ratio of the application program with the preset ratio threshold, the application program is cleaned intelligently, which keeps the application program lean, ensures the normal operation of the electronic device and the application program, spares the user from having to manually delete the cached data in the application program, and improves the user experience.
In some optional implementations of this embodiment, after the step of determining, based on the usage information, whether there is a cache cleaning processing operation triggered by the user and corresponding to the application program, the electronic device may further execute the following steps:
and if the cache cleaning processing operation triggered by the user and corresponding to the application program does not exist, acquiring the application name of the application program.
In this embodiment, if no cache cleaning processing operation triggered by the user and corresponding to the application program is detected, this indicates that the application program is either an extremely important application or possibly an unimportant one. In that case the cache cleaning processing is not performed on the application program directly and automatically; instead, corresponding early warning information is generated intelligently to remind the user, who can then decide whether or not to delete the data according to his or her own wishes.
And generating capacity early warning information corresponding to the application program based on the application name and the document data size.
In this embodiment, an early warning information template is obtained in advance, and the application name and the document data size are then filled into the early warning information template to generate the capacity early warning information corresponding to the application program. The early warning information template can be written in advance according to actual service requirements.
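A sketch of such template filling is shown below; the template wording and placeholder syntax are invented for illustration.

```typescript
// Hypothetical early-warning template; {name} and {size} are placeholders.
const warningTemplate =
  "Application {name} currently holds {size} of document data; consider cleaning it up.";

function buildCapacityWarning(appName: string, documentDataBytes: number): string {
  const sizeText = `${(documentDataBytes / (1024 ** 2)).toFixed(1)} MB`;
  return warningTemplate.replace("{name}", appName).replace("{size}", sizeText);
}

// Example: buildCapacityWarning("InsuranceApp", 900_000_000)
//   -> "Application InsuranceApp currently holds 858.3 MB of document data; consider cleaning it up."
```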
And displaying the capacity early warning information.
In this embodiment, the capacity early warning information may be displayed in a web page display or information display manner.
In the above process, if it is detected that there is no cache cleaning processing operation triggered by the user for the application program, the application name of the application program is acquired; capacity early warning information corresponding to the application program is then generated based on the application name and the document data size; and the capacity early warning information is subsequently displayed. When no user-triggered cache cleaning processing operation for the application program is detected, the cache of the application program is not cleaned directly; instead, corresponding early warning information is generated intelligently from the application name and the document data size and displayed to the user, reminding the user to decide whether or not to delete the data according to his or her own wishes, which effectively improves the user experience.
It should be emphasized that, to further ensure the privacy and security of the cache list data, the cache list data may also be stored in a blockchain node.
The blockchain is a novel application mode of computer technologies such as distributed data storage, point-to-point transmission, consensus mechanism, encryption algorithm and the like. The Blockchain (Blockchain), which is essentially a decentralised database, is a string of data blocks that are generated by cryptographic means in association, each data block containing a batch of information of network transactions for verifying the validity of the information (anti-counterfeiting) and generating the next block. The blockchain may include a blockchain underlying platform, a platform product services layer, an application services layer, and the like.
The embodiment of the application can acquire and process the related data based on the artificial intelligence technology. Among these, artificial intelligence (Artificial Intelligence, AI) is the theory, method, technique and application system that uses a digital computer or a digital computer-controlled machine to simulate, extend and extend human intelligence, sense the environment, acquire knowledge and use knowledge to obtain optimal results.
Artificial intelligence infrastructure technologies generally include technologies such as sensors, dedicated artificial intelligence chips, cloud computing, distributed storage, big data processing technologies, operation/interaction systems, mechatronics, and the like. The artificial intelligence software technology mainly comprises a computer vision technology, a robot technology, a biological recognition technology, a voice processing technology, a natural language processing technology, machine learning/deep learning and other directions.
Those skilled in the art will appreciate that implementing all or part of the above described methods may be accomplished by computer readable instructions stored in a computer readable storage medium that, when executed, may comprise the steps of the embodiments of the methods described above. The storage medium may be a nonvolatile storage medium such as a magnetic disk, an optical disk, a Read-Only Memory (ROM), or a random access Memory (Random Access Memory, RAM).
It should be understood that, although the steps in the flowcharts of the figures are shown in order as indicated by the arrows, these steps are not necessarily performed in order as indicated by the arrows. The steps are not strictly limited in order and may be performed in other orders, unless explicitly stated herein. Moreover, at least some of the steps in the flowcharts of the figures may include a plurality of sub-steps or stages that are not necessarily performed at the same time, but may be performed at different times, the order of their execution not necessarily being sequential, but may be performed in turn or alternately with other steps or at least a portion of the other steps or stages.
With further reference to fig. 3, as an implementation of the method shown in fig. 2, the present application provides an embodiment of a page data processing apparatus, where the embodiment of the apparatus corresponds to the embodiment of the method shown in fig. 2, and the apparatus is specifically applicable to various electronic devices.
As shown in fig. 3, the page data processing apparatus 300 according to the present embodiment includes: the system comprises a first judging module 301, a first acquiring module 302, a calling module 303, a querying module 304 and a processing module 305. Wherein:
a first determining module 301, configured to determine whether a click operation triggered by a user on target data in a primary page of an application program is received;
the first obtaining module 302 is configured to obtain a detail link corresponding to the target data if yes, and add a preset cache parameter in the detail link;
a calling module 303, configured to call a preset back-end interface service when loading a detail page corresponding to the detail link;
the query module 304 is configured to perform query processing on the front-end route included in the cache parameter based on the back-end interface service, to obtain cache list data corresponding to the front-end route;
and the processing module 305 is configured to perform corresponding processing on the cache list data based on the operation information included in the cache parameter.
In this embodiment, the operations performed by the modules or units respectively correspond to the steps of the page data processing method in the foregoing embodiment one by one, which is not described herein again.
In some alternative implementations of the present embodiment, the query module 304 includes:
the first acquisition submodule is used for acquiring the front-end route from the cache parameter;
the second acquisition sub-module is used for acquiring a preset configuration table;
and the query sub-module is used for carrying out query processing on the configuration table according to the front-end route through the back-end interface service, and querying the cache list data corresponding to the front-end route from the configuration table.
In this embodiment, the operations performed by the modules or units respectively correspond to the steps of the page data processing method in the foregoing embodiment one by one, which is not described herein again.
In some optional implementations of this embodiment, the page data processing apparatus further includes:
the second acquisition module is used for acquiring the document data size of the application program;
the second judging module is used for judging whether the document data size is larger than a preset threshold value;
a third obtaining module, configured to obtain usage information of the user corresponding to the application program if the document data size is greater than the preset threshold;
The third judging module is used for judging whether the application program is a common application of the user or not based on the use information;
a fourth judging module, configured to, if the application program is a common application of the user, judge based on the usage information whether there is a cache cleaning processing operation triggered by the user and corresponding to the application program;
a fifth judging module, configured to obtain the number of times of the cache cleaning processing operation if the cache cleaning processing operation exists, and judge whether the number of times is greater than a preset number of times threshold;
the determining module is used for determining the data to be cleaned from the document data of the application program if the number of times is larger than the times threshold;
and the first clearing module is used for clearing the data to be cleared.
In this embodiment, the operations performed by the modules or units respectively correspond to the steps of the page data processing method in the foregoing embodiment one by one, which is not described herein again.
In some optional implementations of this embodiment, the second obtaining module includes:
a third obtaining sub-module, configured to obtain an application size of the application program;
a fourth obtaining sub-module, configured to obtain an occupied storage space corresponding to the application program;
and the generation sub-module is used for generating the document data size of the application program based on the occupied storage space and the application size.
In this embodiment, the operations performed by the modules or units respectively correspond to the steps of the page data processing method in the foregoing embodiment one by one, which is not described herein again.
In some optional implementations of this embodiment, the determining module includes:
a fifth obtaining sub-module, configured to obtain a data type of the cleaning data corresponding to the cache cleaning process;
a sixth obtaining sub-module, configured to obtain specified data corresponding to the data type from document data of the application program;
and the determining submodule is used for taking the specified data as the data to be cleaned.
In this embodiment, the operations performed by the modules or units respectively correspond to the steps of the page data processing method in the foregoing embodiment one by one, which is not described herein again.
In some optional implementations of this embodiment, the page data processing apparatus further includes:
a fourth obtaining module, configured to obtain a ratio of occupied storage space corresponding to the application program if the application program is not a common application of the user;
A sixth judging module, configured to judge whether the ratio of the occupied storage space is greater than a preset ratio threshold;
and the second clearing module is used for clearing the document data of the application program if yes.
In this embodiment, the operations performed by the modules or units respectively correspond to the steps of the page data processing method in the foregoing embodiment one by one, which is not described herein again.
In some optional implementations of this embodiment, the page data processing apparatus further includes:
a fifth obtaining module, configured to obtain an application name of the application program if there is no cache cleaning processing operation triggered by the user and corresponding to the application program;
the generation module is used for generating capacity early warning information corresponding to the application program based on the application name and the document data size;
and the display module is used for displaying the capacity early warning information.
In this embodiment, the operations performed by the modules or units respectively correspond to the steps of the page data processing method in the foregoing embodiment one by one, which is not described herein again.
In order to solve the technical problems, the embodiment of the application also provides computer equipment. Referring specifically to fig. 4, fig. 4 is a basic structural block diagram of a computer device according to the present embodiment.
The computer device 4 comprises a memory 41, a processor 42, and a network interface 43 that are communicatively connected to each other via a system bus. It should be noted that only a computer device 4 having components 41-43 is shown in the figure, but it should be understood that not all of the illustrated components need to be implemented and that more or fewer components may be implemented instead. It will be appreciated by those skilled in the art that the computer device here is a device capable of automatically performing numerical calculation and/or information processing in accordance with preset or stored instructions, and its hardware includes, but is not limited to, microprocessors, application specific integrated circuits (Application Specific Integrated Circuit, ASIC), field-programmable gate arrays (Field-Programmable Gate Array, FPGA), digital signal processors (Digital Signal Processor, DSP), embedded devices, and the like.
The computer equipment can be a desktop computer, a notebook computer, a palm computer, a cloud server and other computing equipment. The computer equipment can perform man-machine interaction with a user through a keyboard, a mouse, a remote controller, a touch pad or voice control equipment and the like.
The memory 41 includes at least one type of readable storage medium, including flash memory, hard disk, multimedia card, card-type memory (e.g., SD or DX memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, magnetic disk, optical disk, and the like. In some embodiments, the memory 41 may be an internal storage unit of the computer device 4, such as a hard disk or a memory of the computer device 4. In other embodiments, the memory 41 may also be an external storage device of the computer device 4, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card provided on the computer device 4. Of course, the memory 41 may also comprise both an internal storage unit of the computer device 4 and an external storage device. In this embodiment, the memory 41 is typically used to store the operating system and the various kinds of application software installed on the computer device 4, such as the computer readable instructions of the page data processing method. Furthermore, the memory 41 may be used to temporarily store various types of data that have been output or are to be output.
The processor 42 may be a central processing unit (Central Processing Unit, CPU), controller, microcontroller, microprocessor, or other data processing chip in some embodiments. The processor 42 is typically used to control the overall operation of the computer device 4. In this embodiment, the processor 42 is configured to execute computer readable instructions stored in the memory 41 or process data, such as computer readable instructions for executing the page data processing method.
The network interface 43 may comprise a wireless network interface or a wired network interface, which network interface 43 is typically used for establishing a communication connection between the computer device 4 and other electronic devices.
Compared with the prior art, the embodiment of the application has the following main beneficial effects:
in the embodiment of the application, whether a click operation triggered by a user on target data in a primary page of an application program is received is first judged; if so, a detail link corresponding to the target data is acquired and a preset cache parameter is added to the detail link; a preset back-end interface service is then called when the detail page corresponding to the detail link is loaded; the front-end route contained in the cache parameter is subsequently queried through the back-end interface service to obtain cache list data corresponding to the front-end route; and finally the cache list data is processed accordingly based on the operation information contained in the cache parameter. In the embodiment of the application, when a user clicks target data in the first-level page of an application program to perform data penetration, a preset cache parameter is added to the detail link of the target data, and the back-end interface service is then used, according to the front-end route of the home-page data, to query the home-page cache that needs to be deleted and to process it accordingly, so that the home-page data cache is reloaded when the user returns to the home page. This keeps the data displayed on the home page consistent with the data displayed on the detail page, effectively improves the accuracy of the data displayed on the application's pages, solves the problem that the detail page reached through data penetration and the home page display inconsistent data, and helps improve the user experience.
The present application also provides another embodiment, namely, a computer-readable storage medium storing computer-readable instructions executable by at least one processor to cause the at least one processor to perform the steps of the page data processing method as described above.
From the above description of the embodiments, it will be clear to those skilled in the art that the method of the above embodiments may be implemented by software plus a necessary general hardware platform, or alternatively by hardware alone, although in many cases the former is preferred. Based on this understanding, the technical solution of the present application, or the part of it that contributes over the prior art, may be embodied in the form of a software product stored in a storage medium (e.g. ROM/RAM, magnetic disk, optical disk) comprising instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, etc.) to perform the methods of the embodiments of the present application.
It is apparent that the above-described embodiments are only some, not all, of the embodiments of the present application; the preferred embodiments are shown in the drawings, which do not limit the scope of the patent claims. This application may be embodied in many different forms; these embodiments are provided so that the disclosure will be thorough and complete. Although the application has been described in detail with reference to the foregoing embodiments, those skilled in the art may still modify the technical solutions described in the foregoing embodiments or substitute equivalents for some of their features. Any equivalent structure made according to the content of the specification and drawings of the application, whether applied directly or indirectly in other related technical fields, likewise falls within the scope of protection of the application.

Claims (10)

1. A method for processing page data, comprising the steps of:
judging whether a click operation, triggered by a user, on target data in a primary page of an application program is received or not;
if yes, acquiring a detail link corresponding to the target data, and adding a preset cache parameter into the detail link;
calling a preset back-end interface service when loading a detail page corresponding to the detail link;
performing query processing, through the back-end interface service, on the front-end route contained in the cache parameter to obtain cache list data corresponding to the front-end route;
and processing the cache list data correspondingly based on the operation information contained in the cache parameter.
2. The page data processing method according to claim 1, wherein the step of performing query processing, through the back-end interface service, on the front-end route contained in the cache parameter to obtain cache list data corresponding to the front-end route specifically comprises:
acquiring the front-end route from the cache parameter;
acquiring a preset configuration table;
and querying the configuration table according to the front-end route through the back-end interface service, so as to obtain the cache list data corresponding to the front-end route from the configuration table.
3. The page data processing method according to claim 1, characterized in that the page data processing method further comprises:
acquiring the manuscript data size of the application program;
judging whether the size of the manuscript data is larger than a preset threshold value or not;
if the manuscript data size is larger than the preset threshold value, acquiring usage information of the user corresponding to the application program;
judging whether the application program is a commonly used application of the user based on the usage information;
if the application program is a commonly used application of the user, judging whether a cache cleaning processing operation, triggered by the user and corresponding to the application program, exists based on the usage information;
if the cache cleaning processing operation exists, acquiring the number of times the cache cleaning processing operation has been performed, and judging whether the number of times is greater than a preset count threshold;
if the number of times is greater than the count threshold, determining data to be cleaned from the manuscript data of the application program;
and cleaning the data to be cleaned.
4. The page data processing method as claimed in claim 3, wherein the step of obtaining the document data size of the application program specifically comprises:
acquiring the application size of the application program;
acquiring the occupied storage space corresponding to the application program;
and generating the manuscript data size of the application program based on the occupied storage space and the application size.
5. The method for processing page data according to claim 3, wherein the step of determining data to be cleaned from document data of the application program specifically comprises:
acquiring the data type of the cleaned data corresponding to the cache cleaning processing operation;
acquiring specified data corresponding to the data type from the manuscript data of the application program;
and taking the specified data as the data to be cleaned.
6. The page data processing method according to claim 3, further comprising, after the step of judging whether the application program is a commonly used application of the user based on the usage information:
if the application program is not a commonly used application of the user, acquiring the occupied storage space ratio corresponding to the application program;
judging whether the occupied storage space ratio is greater than a preset ratio threshold;
and if so, clearing the manuscript data of the application program.
7. The page data processing method according to claim 3, further comprising, after the step of judging whether a cache cleaning processing operation, triggered by the user and corresponding to the application program, exists based on the usage information:
if no cache cleaning processing operation triggered by the user and corresponding to the application program exists, acquiring the application name of the application program;
generating capacity early warning information corresponding to the application program based on the application name and the manuscript data size;
and displaying the capacity early warning information.
8. A page data processing apparatus, comprising:
the first judging module is used for judging whether a click operation, triggered by a user, on target data in a primary page of the application program is received or not;
the first acquisition module is used for acquiring, if yes, a detail link corresponding to the target data and adding a preset cache parameter into the detail link;
the calling module is used for calling a preset back-end interface service when loading the detail page corresponding to the detail link;
the query module is used for performing query processing, through the back-end interface service, on the front-end route contained in the cache parameter to obtain cache list data corresponding to the front-end route;
and the processing module is used for processing the cache list data correspondingly based on the operation information contained in the cache parameter.
9. A computer device comprising a memory having stored therein computer readable instructions which when executed by a processor implement the steps of the page data processing method of any of claims 1 to 7.
10. A computer readable storage medium having stored thereon computer readable instructions which when executed by a processor implement the steps of the page data processing method of any of claims 1 to 7.
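As a companion illustration of the cleanup path of claims 3 to 7, the TypeScript sketch below encodes the decision sequence those claims describe. The thresholds, the field names of AppUsageInfo and the rule that the manuscript data size is the sum of the occupied storage space and the application size are assumptions for illustration; the claims leave those specifics open.

// Sketch of the cleanup decision path of claims 3-7; thresholds, field names and
// helper behaviour are assumptions, not values taken from the application.

interface AppUsageInfo {
  appName: string;
  isCommonlyUsed: boolean;     // whether the application is a commonly used application of the user
  cacheCleanCount: number;     // number of user-triggered cache cleaning operations for this application
  appSizeMb: number;           // application size
  occupiedStorageMb: number;   // storage occupied by the application's manuscript data
  totalStorageMb: number;      // total device storage
}

const SIZE_THRESHOLD_MB = 500;    // assumed preset threshold for the manuscript data size
const CLEAN_COUNT_THRESHOLD = 3;  // assumed preset count threshold
const RATIO_THRESHOLD = 0.1;      // assumed preset ratio threshold for rarely used applications

function decideCleanup(app: AppUsageInfo): string {
  // Claim 4: the manuscript data size is generated from the occupied storage
  // space and the application size (assumed here to be their sum).
  const manuscriptDataSizeMb = app.occupiedStorageMb + app.appSizeMb;
  if (manuscriptDataSizeMb <= SIZE_THRESHOLD_MB) {
    return 'no action';
  }
  if (!app.isCommonlyUsed) {
    // Claim 6: for an application that is not commonly used, clear all manuscript
    // data once its share of the total storage exceeds the ratio threshold.
    const ratio = app.occupiedStorageMb / app.totalStorageMb;
    return ratio > RATIO_THRESHOLD ? 'clear all manuscript data' : 'no action';
  }
  if (app.cacheCleanCount === 0) {
    // Claim 7: no cache cleaning history -> generate and display capacity early
    // warning information based on the application name and the manuscript data size.
    return `warn: ${app.appName} holds ${manuscriptDataSizeMb} MB of manuscript data`;
  }
  if (app.cacheCleanCount > CLEAN_COUNT_THRESHOLD) {
    // Claims 3 and 5: clean the data whose type matches what the user cleaned before.
    return 'clean the data types the user previously cleaned';
  }
  return 'no action'; // the claims leave this in-between case unspecified
}

// Example: a commonly used application with large manuscript data but only one
// manual cleanup triggers an early warning rather than an automatic cleanup.
console.log(decideCleanup({
  appName: 'ExampleApp',
  isCommonlyUsed: true,
  cacheCleanCount: 1,
  appSizeMb: 120,
  occupiedStorageMb: 900,
  totalStorageMb: 64000,
}));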
CN202310695643.1A 2023-06-12 2023-06-12 Page data processing method and device, computer equipment and storage medium Pending CN116775186A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310695643.1A CN116775186A (en) 2023-06-12 2023-06-12 Page data processing method and device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310695643.1A CN116775186A (en) 2023-06-12 2023-06-12 Page data processing method and device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN116775186A true CN116775186A (en) 2023-09-19

Family

ID=87992352

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310695643.1A Pending CN116775186A (en) 2023-06-12 2023-06-12 Page data processing method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116775186A (en)

Similar Documents

Publication Publication Date Title
EP3244312B1 (en) A personal digital assistant
WO2018177251A1 (en) Application program processing method, computer device and storage medium
CN109829077B (en) Page display method, device and equipment
CN113220657B (en) Data processing method and device and computer equipment
CN112182004B (en) Method, device, computer equipment and storage medium for checking data in real time
CN110020273B (en) Method, device and system for generating thermodynamic diagram
CN113010542B (en) Service data processing method, device, computer equipment and storage medium
CN116956326A (en) Authority data processing method and device, computer equipment and storage medium
CN116775186A (en) Page data processing method and device, computer equipment and storage medium
CN113515713B (en) Webpage caching strategy generation method and device and webpage caching method and device
CN116796093A (en) Interface data setting method and device, computer equipment and storage medium
CN116821493A (en) Message pushing method, device, computer equipment and storage medium
CN116795882A (en) Data acquisition method, device, computer equipment and storage medium
CN116932486A (en) File generation method, device, computer equipment and storage medium
CN115186196A (en) Content recommendation method and device, computer equipment and storage medium
CN116842011A (en) Blood relationship analysis method, device, computer equipment and storage medium
CN116661763A (en) Front-end and back-end development management method and device, computer equipment and storage medium
CN116820443A (en) Data analysis method, device, computer equipment and storage medium
CN117251468A (en) Query processing method, device, computer equipment and storage medium
CN117076775A (en) Information data processing method, information data processing device, computer equipment and storage medium
CN116775649A (en) Data classified storage method and device, computer equipment and storage medium
CN117421207A (en) Intelligent evaluation influence point test method, intelligent evaluation influence point test device, computer equipment and storage medium
CN117170547A (en) Service platform processing method, device, equipment and storage medium thereof
CN116719854A (en) Data comparison method, device, computer equipment and storage medium
CN116775187A (en) Data display method, device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination