CN111488370B - List paging quick response system and method - Google Patents

List paging quick response system and method

Info

Publication number
CN111488370B
Authority
CN
China
Prior art keywords
paging
request
list
screening
condition
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010256172.0A
Other languages
Chinese (zh)
Other versions
CN111488370A (en)
Inventor
李�杰
邹初建
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou DPTech Technologies Co Ltd
Original Assignee
Hangzhou DPTech Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou DPTech Technologies Co Ltd filed Critical Hangzhou DPTech Technologies Co Ltd
Priority to CN202010256172.0A priority Critical patent/CN111488370B/en
Publication of CN111488370A publication Critical patent/CN111488370A/en
Application granted granted Critical
Publication of CN111488370B publication Critical patent/CN111488370B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24Querying
    • G06F16/242Query formulation
    • G06F16/2428Query predicate definition using graphical user interfaces, including menus and forms
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24Querying
    • G06F16/245Query processing
    • G06F16/2453Query optimisation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24Querying
    • G06F16/245Query processing
    • G06F16/2455Query execution
    • G06F16/24552Database cache management

Abstract

The present disclosure provides a list paging quick response system, comprising: a screening component that screens the current paging request sent by a client device and judges whether the screening condition of the current paging request is the same as that of the previous paging request; a query component that queries the required data set based on the screening condition of the current paging request when the screening component judges that the two conditions differ; a background paging query component that directly executes the next paging query under the same screening condition as the current request, either after the screening component judges the two conditions to be the same or after the query component has queried the required data set based on the current condition; a paging cache unit that caches the data set queried by the background paging query component; and a list paging sending component that, when the screening component judges the two conditions to be the same, directly reads the result from the paging cache unit and feeds it back to the client device that sent the current paging request.

Description

List paging quick response system and method
Technical Field
The present disclosure relates to a list paging quick response system and method.
Background
Big data refers to data sets that cannot be captured, managed, and processed by conventional software tools within a reasonable time frame: massive, fast-growing, and diversified information assets that require new processing modes to yield stronger decision-making power, insight discovery, and process optimization. With the rapid development of computer and network technology, the value of data as an information carrier is becoming ever more important.
Because of these particularities of big data, its presentation is no longer monotonous and simple, and various visualization techniques have been developed. An interactive interface is the channel for information exchange between a person and a computer: the user inputs information to and operates the computer through the interface, while the computer provides information to the user through the interface for reading, analysis, and judgment. Tabular, page-based data display is the oldest and most widely applied presentation mode, and it remains important in the big data context.
FIG. 1 is a flow chart of a commonly used method for presenting tabular data. As shown in fig. 1, in step S110 the client device sends a list paging query request to the server according to the user's query request and the list paging query settings, or upon a click on the [ next page ] button. List paging is a data presentation mode that presents the data on a page in the form of a table. The table may be fixed, for example by user settings, to show at most a certain amount of data at a time, while further data can be shown by turning pages. List paging is generally divided into true paging and false paging. In true paging, the paging action is performed by the server, which returns only the data to be presented on the current page each time. In false paging, the server returns all the data at once, the paging action is performed by the browser, and a fixed number of entries is presented each time. The list paging query request sent by the client device therefore generally includes information such as the list paging mode and the number of the currently requested list page.
Next, in step S120, the server determines from the received list paging query request which paging mode is required. In the true paging mode it queries the background for the data of the page whose number is given by the request; in the false paging mode it queries all the data at once, without paging, and returns everything to the client device so that the client device's browser performs the paging itself and displays a fixed number of entries each time.
If step S120 determines that the list page data uses true paging, the process passes to step S130. In step S130, the server background queries the requested page of list data according to the list paging query request and returns that page of data to the client device.
If step S120 determines that the list page data uses false paging, the process passes to step S140, where all the list data is queried in the background according to the list paging query request and all the queried data is returned to the client device.
Next, in step S150, the single page of list data obtained in step S130, or all the query data obtained in step S140, is page-rendered at the client device according to the paging mode, so that the query data is visually presented to the user through the user interface.
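The conventional flow above can be sketched in a few lines. This is an illustrative toy, not code from the patent: `fetch_page`, `DATASET`, and `PAGE_SIZE` are assumed names, and the "server" is just a function over an in-memory list.

```python
# Minimal sketch of the conventional list-paging flow (Fig. 1).
# All names (fetch_page, DATASET, PAGE_SIZE) are illustrative.

DATASET = list(range(95))   # stand-in for the queried data set
PAGE_SIZE = 10              # entries shown per list page

def fetch_page(page_number, true_paging=True):
    """True paging: the server returns only the requested page.
    False paging: the server returns everything and the browser pages it."""
    if true_paging:
        start = (page_number - 1) * PAGE_SIZE
        return DATASET[start:start + PAGE_SIZE]
    return list(DATASET)    # false paging: the whole result set at once

first_page = fetch_page(1)          # true paging: one page of 10 entries
everything = fetch_page(1, False)   # false paging: all 95 entries
```

In true paging each click on [ next page ] costs another round trip and another backend query, which is exactly the latency the disclosure sets out to hide.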
Interactive paging presentation under the current big data model mostly adopts the true paging mode, which not only responds quickly but also saves server and browser resources. In true paging, the client device generally initiates a paging query request, and on receiving it the server returns the data set currently to be displayed according to the search condition. When the client device requests the next page, the server queries the data required for that page according to the search condition and returns it.
FIG. 2 illustrates a flow chart of an interactive page rendering method using true pages. As shown in fig. 2, in the interactive page presentation method, a query request for first page data of a list is issued to a server by a client device in step S210.
Then, in step S220, the server queries the requested first page data according to the received client device query request, and returns the query result to the client device, so as to perform page rendering there and visually present to the user through the user interface.
Next, to obtain the query results for the second and/or subsequent pages of the list, steps S210 and S220 are repeated between the client device and the server in step S230, with requests for the data of the second and/or subsequent pages, until all the desired data has been obtained.
In this list page data query mode, the server responds to subsequent paging query requests purely passively: apart from the data set actively returned when the initial page (first page) loads, the server returns list paging query data only when a corresponding request arrives from the client device.
In general this is a conventional and reasonable implementation, but under the big data model the data base grows, so the probability of clicking the [ next page ] button is high. If this implementation is retained, the server's query response time becomes long, producing a less friendly interactive experience on the client device.
Therefore, a technical solution is needed that ensures a good interactive experience for users when the data base is large and the probability of clicking [ next page ] is high, while effectively reducing the resource consumption of both the client device and the server.
Disclosure of Invention
The present technical scheme is provided to solve the above technical problems. The aim is that, given the high probability of a [ next page ] click, after the server has retrieved the data set corresponding to the screening condition it automatically retrieves the next page of data in the background and puts it into a cache; when the client device requests the next page, the cached data set is returned immediately and the page after it is retrieved into the cache in turn, and so on. Further, when the search condition changes, the current page is retrieved afresh and the next page is cached again.
According to one aspect of the present disclosure, there is provided a list paging quick response system deployed in a server, including: a screening component that screens the current paging request sent by the client device and judges whether the screening condition of the current paging request is the same as that of the previous paging request; a query component that queries the required data set based on the screening condition of the current paging request when the screening component judges that the two conditions differ; a background paging query component that directly executes the next paging query under the same screening condition as the current request, either after the screening component judges the two conditions to be the same or after the query component has queried the required data set based on the current condition; a paging cache unit that caches the data set queried by the background paging query component; and a list paging sending component that, when the screening component judges the two conditions to be the same, directly reads the result from the paging cache unit and feeds it back to the client device that sent the current paging request.
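The interplay of the claimed components can be sketched as follows. This is a hedged toy model, not the patented implementation: the class and method names are mine, the "screening condition" is reduced to a modulus filter over an integer list, and the background query is simulated synchronously rather than in a real background thread.

```python
# Illustrative sketch of the claimed components: a screening check,
# a paging cache, and a prefetch of the next page after every response.

DATASET = list(range(100))  # stand-in for the backend data
PAGE_SIZE = 10

class ListPagingServer:
    def __init__(self):
        self.prev_condition = None   # screening condition of the previous request
        self.page_cache = None       # paging cache unit: the prefetched page
        self.next_page = 1           # page number held by the cache

    def _query(self, condition, page):
        """Query / background paging query component (a toy modulus filter)."""
        rows = [x for x in DATASET if x % condition == 0]
        start = (page - 1) * PAGE_SIZE
        return rows[start:start + PAGE_SIZE]

    def handle(self, condition):
        if condition == self.prev_condition and self.page_cache is not None:
            # Same screening condition: serve the cached page immediately.
            result, page = self.page_cache, self.next_page
        else:
            # Changed condition: remember it and query page 1 afresh.
            self.prev_condition = condition
            result, page = self._query(condition, 1), 1
        # In both cases, prefetch the following page "in the background".
        self.next_page = page + 1
        self.page_cache = self._query(condition, self.next_page)
        return result

srv = ListPagingServer()
page1 = srv.handle(2)   # changed condition: fresh query for page 1
page2 = srv.handle(2)   # same condition: served from the prefetched cache
```

The key property is that the expensive `_query` for page N+1 happens before the client asks for it, so a repeat request with an unchanged condition is answered from memory.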
According to the list paging quick response system of the present disclosure, when the screening condition of the previous paging request is blank, the screening component judges that the screening condition of the current paging request is different from that of the previous paging request.
The list paging quick response system according to the present disclosure further includes a data quantity accumulating component that accumulates the amount of data retrieved into the paging cache unit by the background paging query component each time; the server empties the paging cache unit when the accumulated amount equals the total size of the queried data set.
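The accumulating component can be sketched like this. Names and structure are assumptions of mine, and the equality test from the claim is implemented as `>=` for robustness against uneven page sizes; the claim itself only states equality.

```python
# Illustrative sketch of the data quantity accumulating component: count every
# item the background query caches, and empty the cache once the count reaches
# the total size of the queried data set (i.e. there is nothing left to prefetch).

class PagingCacheWithAccumulator:
    def __init__(self, total):
        self.total = total      # total size of the queried data set
        self.accumulated = 0    # items cached so far across all prefetches
        self.cache = []

    def cache_page(self, page_rows):
        self.cache = list(page_rows)
        self.accumulated += len(self.cache)
        if self.accumulated >= self.total:
            self.cache = []     # server empties the paging cache unit
            self.accumulated = 0
        return self.cache
```

With a 25-item data set and 10-item pages, the third (5-item) prefetch trips the threshold and the cache is flushed, ready for the next screening condition.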
According to the list paging quick response system of the present disclosure, when the screening component judges that the screening condition of the current paging request differs from that of the previous paging request, it saves the screening condition of the current paging request as the new stored screening condition.
The list page quick response system according to the present disclosure, wherein each page request further comprises a list page display setting in which the amount of data that can be presented per list page is set.
According to another aspect of the present disclosure, there is also provided a list paging quick response method, including: screening, by a screening component, the current paging request sent by the client device, and judging whether the screening condition of the current paging request is the same as that of the previous paging request; when the screening component judges that the conditions differ, querying, by a query component, the required data set based on the screening condition of the current paging request; when the screening component judges that the conditions are the same, or after the query component has queried the required data set based on the current condition, directly executing, by a background paging query component, the next paging query under the same screening condition as the current request; caching, by a paging cache unit, the data set queried by the background paging query component; and when the screening component judges that the conditions are the same, directly reading, by a list paging sending component, the result from the paging cache unit and feeding it back to the client device that sent the current paging request.
According to the list paging quick response method of the present disclosure, when the screening condition of the previous paging request is blank, the screening component judges that the screening condition of the current paging request is different from that of the previous paging request.
The list paging quick response method according to the present disclosure further includes: accumulating, by a data quantity accumulating component, the amount of data retrieved into the paging cache unit by the background paging query component each time, and notifying the server to empty the paging cache unit when the accumulated amount equals the total size of the queried data set.
According to the list paging quick response method of the present disclosure, when the screening component judges that the screening condition of the current paging request differs from that of the previous paging request, it saves the screening condition of the current paging request as the new stored screening condition.
According to the list paging quick response method, each paging request further comprises a list paging display setting, in which the amount of data that each list page can display is set.
According to another aspect of the present disclosure, there is further provided an interactive list paging quick response system including a client device and a server. The client device generates a list paging query request according to the list paging query screening condition input by the user and the list paging display setting, or upon a click on the [ next page ] button, sends the generated request to the server, and performs page rendering according to the list paging query result returned by the server to present the list data. The server compares the query screening condition in the currently received list paging query request with the previously stored or default screening condition and saves the latest one; it queries the first page of the requested list paging data according to the screening condition and list paging display setting included in the received request, returns the retrieval result to the client device for page rendering there, and automatically continues to retrieve the next page of data in the background, so that when a subsequent request from the client device changes neither the screening condition nor the list paging display setting but merely asks for the next page, the server returns the cached next page data to the client device directly.
Preferably, the list page display settings include the amount of data that can be presented per list page.
Preferably, the list paging query request includes only a click on the [ next page ] button.
Preferably, when receiving the client device's list paging query request for the first time, the server defaults the previously stored query screening condition to null.
According to one aspect of the present disclosure, there is provided a method of performing an interactive list paging quick response in an interactive list paging quick response system including a client device and a server, in which the client device generates a list paging query request according to the list paging query screening condition input by the user and the list paging display setting, or upon a click on the [ next page ] button, sends the generated request to the server, and performs page rendering according to the returned query result to present the list data, and in which the server compares the screening condition in the currently received request with the previously stored or default condition, saves the latest one, queries the first page of the requested list paging data according to the screening condition and display setting in the received request, returns the retrieval result to the client device for page rendering there, and automatically continues to retrieve the next page of data in the background, returning the cached next page directly when a subsequent request changes neither the condition nor the setting. The method comprises: 1) generating, by the client device, a list paging query request according to the user's query requirement and the list paging display setting, and sending the generated request to the server; 2) initializing the server by reserving in it a list paging query request storage area and a list paging query result storage cache area, wherein the request storage area stores at least the screening condition of a list paging query request, initialized to empty, and the result storage cache area stores at least one page of list paging query result data; 3) when the server receives a list paging query request from the client device, judging whether the query screening condition of the current request has changed; 4) if the screening condition in the request differs from the stored one, updating, by the server, the condition stored in the request storage area with the received condition, querying the first page of the requested list paging data according to the current condition, returning the first page retrieval result to the client device, and then automatically continuing to retrieve the next page of data in the background and storing it in the result storage cache area; or 4') if the screening condition in the request is the same as the stored one, returning, by the server, the next-page query result data stored in the result storage cache area directly to the client device without a fresh query, and automatically continuing to retrieve the page after it in the background; 5) receiving, by the client device, the data set returned in step 4) or 4') and performing page rendering, and then, if further query data exists, sending a [ next page ] click action instruction to the server, until the client device no longer requests next-page list paging data or the last page of data is reached.
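Steps 1) through 5) can be walked through end to end in a short simulation. This is an illustrative sketch under assumed names (`query`, `run_session`, the `log-N` records); the client/server round trips are collapsed into one loop and the background retrieval runs synchronously.

```python
# Self-contained walk-through of steps 1)-5): the client pages through a
# result set while the server always has the next page prefetched.

DATA = [f"log-{i}" for i in range(23)]   # 23 matching records -> 3 pages
PAGE = 10

def query(condition, page_no):           # step 4): the backend query
    rows = [d for d in DATA if condition in d]
    return rows[(page_no - 1) * PAGE: page_no * PAGE]

def run_session(condition):
    stored_condition = None              # step 2): stored condition starts empty
    cache, pages_served, page_no = None, [], 0
    while True:
        if condition != stored_condition:        # step 3)/4): condition changed
            stored_condition, page_no = condition, 1
            result = query(condition, 1)
        else:                                    # step 4'): serve from cache
            page_no += 1
            result = cache
        if not result:                           # step 5): past the last page
            break
        pages_served.append(result)
        cache = query(condition, page_no + 1)    # background prefetch
    return pages_served

pages = run_session("log")   # three pages: 10 + 10 + 3 entries
```

Note that only the very first page of a new screening condition is queried on the request path; every later page is handed over from the cache filled one step earlier.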
According to the above technical scheme, when the client device requests and renders the next page of data, the server already holds cached data, so the response speed is greatly improved and a good interactive experience on the client device is ensured.
Drawings
The disclosure may be better understood by describing exemplary embodiments thereof in conjunction with the accompanying drawings, in which:
FIG. 1 is a flow chart of a method of presenting tabular data in common use;
FIG. 2 illustrates a flow chart of an interactive page rendering method using true pages;
FIG. 3 illustrates a block diagram of an interactive list paging quick response system, according to one embodiment of the present disclosure;
FIG. 4 illustrates a flow chart of an interactive list paging quick response method according to one embodiment of the present disclosure; and
FIG. 5 illustrates a flow chart of an interactive list paging quick response method of querying a system operation log according to one embodiment of the present disclosure.
Detailed Description
In the following, specific embodiments of the present disclosure will be described. It should be noted that, for the sake of brevity, this specification cannot describe every feature of an actual implementation in detail. It should be appreciated that in the actual development of any implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another. Moreover, it should be appreciated that while such a development effort might be complex and time-consuming, it would nevertheless be a routine undertaking of design, fabrication, and manufacture for those of ordinary skill having the benefit of this disclosure.
Unless defined otherwise, technical or scientific terms used in the claims and specification have the ordinary meaning understood by one of ordinary skill in the art to which this disclosure belongs. The terms "first", "second", and the like in the description and claims do not denote any order, quantity, or importance, but are used to distinguish one element from another. The terms "a", "an", and the like do not denote a limitation of quantity, but denote the presence of at least one. The word "comprising", "comprises", or the like means that the element or item preceding the word covers the elements or items listed after the word and their equivalents, without excluding other elements or items. The terms "connected", "coupled", and the like are not limited to physical or mechanical connections, nor to direct or indirect connections.
In systems and methods for querying and presenting big data, presenting the queried data through list paging is a common approach. List paging is a data presentation mode that presents the data on a page in the form of a table. The table may be fixed, for example by user settings, to show at most a certain amount of data at a time, while further data can be shown by turning pages. Such a data query system typically involves direct interaction between the user-facing client device portion and the server portion that provides the query service. When query data is presented by list paging, the client device sends a list paging query request to the server according to the user's query request and the list paging query settings, or upon a click on the [ next page ] button. List paging is generally classified into true paging and false paging. In true paging, the paging action is performed by the server, which returns only the data to be presented on the current page each time. In false paging, the server returns all the data at once, the paging action is performed by the browser, and a fixed number of entries is presented each time. The list paging query request sent by the client device therefore generally includes information such as the list paging mode and the number of the currently requested list page.
Interactive paging presentation under the current big data model mostly adopts the true paging mode, which not only responds quickly but also saves server and browser resources. In accordance with one aspect of the present disclosure, the interactive list paging quick response system and method herein improve on the handling of list paging query requests in the client device's true paging mode.
Fig. 3 illustrates a block diagram of an interactive list paging quick response system according to one embodiment of the present disclosure. As shown in fig. 3, the interactive list paging quick response system includes a client device 310 and a server 300. A list paging quick response system 320 according to the present disclosure is deployed in the server and includes a screening component 321, a query component 322, a background paging query component 323, a paging cache unit 324, and a list paging sending component 325. The screening component 321 screens the current paging request sent by the client device 310 and judges whether its screening condition is the same as that of the previous paging request. The query component 322 queries the required data set based on the screening condition of the current paging request when the screening component 321 judges that this condition differs from that of the previous request. The background paging query component 323 directly executes the next paging query under the same screening condition as the current request, either after the screening component 321 judges the conditions to be the same or after the query component 322 has queried the required data set based on the current condition. The paging cache unit 324 caches the data set queried by the background paging query component 323. The list paging sending component 325, when the screening component 321 judges the conditions to be the same, directly reads the result from the paging cache unit 324 and feeds it back to the client device that sent the current paging request.
The client device 310 is configured to generate a list paging query request according to the list paging query screening condition input by the user and the list paging display setting (including, for example, the amount of data each list page can display), or upon a click on the [ next page ] button, and to send the generated request to the server 300, so that the server queries according to the request and returns a list paging query result; the client device then performs page rendering according to the returned result to present the list data. Further, when the user needs to browse the next page of the list paging data, the client device 310 sends a request for the next page of the query result to the server 300 in response to an action such as the user pressing the [ next page ] button, and page-renders the next page of the returned result to present the list data. This process continues until the user of the client device 310 no longer needs to browse further pages of the query result, i.e., until the client device 310 no longer sends list paging query requests.
On the other hand, with further reference to fig. 3, when receiving a list paging query request sent by the client device 310, the server 300 in the interactive list paging quick response system compares the query screening condition of the current request with the previously stored condition. If the comparison indicates that the screening condition in the current request has changed, the server replaces the stored condition with the condition in the current request, queries the first page of the requested list paging data according to the screening condition and the list paging display setting included in the received request, and returns the retrieval result to the client device 310 for page rendering there. That is, when the screening component 321 judges that the screening condition of the current paging request differs from that of the previous paging request, it saves the current request's screening condition as the new stored condition. In addition, after returning the retrieved first page data, the server automatically continues to retrieve the next page of data in the background in anticipation of the corresponding request from the client device 310, so that when a request from the client device 310 merely asks for the next page without changing the query screening condition or the list paging display setting, the server returns the cached next page data to the client device 310 and again automatically continues to retrieve the page after it in the background. This process continues until the client device 310 no longer sends a request for next-page list paging data or the last page of data is reached.
Accordingly, the system of the present disclosure may further include a data amount accumulating component 326 for accumulating the amount of data that the background paging query component 323 caches into the paging cache unit 324 on each query, so that the server 320 can flush the paging cache unit 324 when the accumulated amount of data equals the total size of the queried object data set.
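A minimal sketch of such an accumulating component follows, under the assumption that it simply counts cached rows against the known total of the queried data set (the interface is illustrative, not the patented one):

```python
class DataAmountAccumulator:
    """Sketch of data amount accumulating component 326 (assumed interface)."""

    def __init__(self, total):
        self.total = total   # total size of the queried object data set
        self.count = 0       # rows cached so far by the background query

    def add(self, amount):
        """Record one background query's cached amount.

        Returns True once the accumulated amount reaches the total,
        i.e. when the server may flush the paging cache unit."""
        self.count += amount
        return self.count >= self.total

acc = DataAmountAccumulator(total=5)
acc.add(2)                # 2 of 5 rows cached, no flush yet
acc.add(2)                # 4 of 5 rows cached, no flush yet
flush_now = acc.add(1)    # all 5 rows cached: cache may be flushed
```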
In accordance with one embodiment of the present disclosure, when the client device 310 does not change the screening condition but only requests the next page of query results, the query request may omit the screening condition. Accordingly, when the server 320 finds that the stored screening condition is empty (as it is before the first request has been processed), it treats the screening condition as changed by default. That is, the filtering component 321 determines that the filtering condition of the current paging request differs from that of the previous paging request whenever the filtering condition of the previous paging request is blank. Thus, when the client device 310 requests and renders the next page, the server 320 has already retrieved and stored the required data before the request is issued, which greatly increases the response speed of the server 320 and ensures a good interactive experience on the client device 310. Meanwhile, because the interactive list paging quick response system adopts a true (server-side) paging mode, the resources used by the client device 310 (more specifically, by its browser) are relieved, and because the server 320 performs the paging tasks in the background, the resource occupation of the server 320 is also alleviated to a certain extent.
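The comparison rule of this embodiment (an empty stored condition is treated as changed, while a request that omits the condition is treated as unchanged) reduces to a small predicate; representing a condition as a dict is an assumption made only for illustration:

```python
def screening_changed(saved_condition, request_condition):
    """Decide whether the screening condition counts as changed."""
    if saved_condition is None:      # nothing stored yet, e.g. the first request
        return True
    if request_condition is None:    # bare [next page] request omits the condition
        return False
    return request_condition != saved_condition
```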
Fig. 4 illustrates a flow chart of an interactive list paging quick response method according to one embodiment of the present disclosure. As shown in fig. 4, in step S410 the client device 310 (see fig. 3) generates a list paging query request according to the user's query request and the list paging display settings (including, for example, the amount of data each list page can display) or a click on the [next page] button, and transmits the generated request to the server 320 (see fig. 3) so that the server 320 queries according to the request and returns a list paging query result.
Next, in step S420, the server 320 (see fig. 3) performs initialization to maintain a list paging query request holding area and a list paging query result cache area in the server. The request holding area holds at least the filtering condition of the list paging query request, which is initialized to null. The result cache area holds at least one page of list paging query result data.
In step S430, when the server 320 receives a list paging query request from the client device 310, it determines whether the query filtering condition of the current request has changed. If the filtering condition in the request differs from the saved one, the process moves to step S460; otherwise it moves to step S440.
When the query filtering condition of the current list paging query request has changed from the previous one, the process proceeds to step S460. There, the server 320 updates the filtering condition stored in the request holding area with the one contained in the received request, queries the first page of the requested list paging data according to that condition, and returns the first-page result to the client device 310 (see fig. 3) for page rendering there. It then automatically continues to retrieve the next page of data in the background and stores it in the result cache area in anticipation of a corresponding request from the client device 310. As shown in fig. 4, step S450 immediately follows step S460 and is described in detail below.
In step S450, the client device 310 receives the data set returned by the server 320 and performs page rendering; then, if there is further page data to query, it sends the server an action instruction corresponding to a click on the [next page] button.
If it is instead determined in step S430 that the query screening condition of the current list paging query request has not changed from the previous one, the process proceeds to step S440. In this case the request merely asks for the next page, so the server 320 (see fig. 3) sends the next-page query result stored in the cache directly to the client device 310 (see fig. 3) without performing a new query; the client device 310 then renders this next page of the list paging query result to present the list data, while the server automatically continues to retrieve the following page in the background in anticipation of the next request from the client device 310.
As shown in fig. 4, step S450 also follows step S440. Steps S440 to S450 may be repeated until the client device 310 no longer sends a request for the next page of list paging data, or the last page of data is reached.
FIG. 5 illustrates a flow chart of an interactive list paging quick response method for querying a system operation log according to one embodiment of the present disclosure. In the method shown in fig. 5, in step S510, when the user initiates a list paging query of the system operation log data, the client device 310 (see fig. 3) generates a list paging query request for the system operation log according to the default filtering condition for querying the log, and transmits the request to the server 320 (see fig. 3) to start the system operation log query there.
Next, in step S520, the server 320 (see fig. 3) performs initialization to maintain a list paging query request holding area and a list paging query result cache area in the server. The request holding area holds at least the filtering condition of the list paging query request, which is initialized to null. The result cache area holds at least one page of list paging query result data.
In step S530, after receiving the list paging query request for the system operation log, the server 320 determines whether the filtering condition of the client device's log query has changed; when it has, the process moves to step S560.
It should be noted that, because the filtering condition defaults to null before the server 320 receives its first list paging query request from the client device 310, step S530 determines on that first request that the filtering condition has changed, so a data set query is performed and a response data set satisfying the filtering condition is obtained.
In step S560, the server 320 first updates the stored screening condition to the latest one, to facilitate the next comparison. The server 320 then initiates a background paging query and returns the current page of list paging data to the client device 310, which renders the returned data set. Next, the server 320 obtains the next page of query results in the background and places it in the cache, ready to be sent to the client device 310 in response to a further list paging query request that does not change the filtering condition. As shown in fig. 5, step S550 immediately follows step S560 and is described in detail below.
Returning to step S530, if the server determines that the filtering condition of the current system operation log query is consistent with that of the previous query, the process moves to step S540, where the previously cached next-page query result is returned to the client device 310. A background paging query is then performed according to the recorded filtering condition to obtain the following page, and that result is placed in the cache, ready to be sent to the client device 310 for page rendering upon a further list paging query request that does not change the filtering condition.
Next, in step S550, the client device 310 receives the data set returned by the server and performs page rendering. It then determines whether the current page is the last page and whether the query filtering condition has changed; if the currently rendered page is the last page and the filtering condition has not changed, the [next page] button is disabled so that no further list paging query requests are sent to the server, ending the list paging query.
If the client device determines in step S550 that the current page is not the last page and the query screening condition has not changed, the [ next page ] button is enabled to send a list paging query request to the server to request next page list data.
If the client device determines in step S550 that the query screening condition has changed, the [next page] button is likewise enabled and a list paging query request is sent to the server requesting new list paging data under the changed condition.
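Taken together, the client-side decision in step S550 amounts to one predicate (names assumed for illustration): the [next page] button is disabled only when the rendered page is the last one and the screening condition is unchanged.

```python
def next_button_enabled(is_last_page, condition_changed):
    """Step S550 sketch: keep requesting unless we are on the last page
    of an unchanged query."""
    return not (is_last_page and not condition_changed)
```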
In this way, when the client device requests the last page of data, the server's background paging task finds that there is no next page, no longer queries the data set, and directly empties the cache; the client device, for its part, sends no further next-page requests, and the process ends unless the client device issues a new request.
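The background task's last-page check falls out naturally if pages are produced by slicing the result set: past the end, the slice is empty, which corresponds to the emptied cache described above. A sketch under assumed names:

```python
def prefetch_next(rows, page_size, page_index):
    """Background paging task sketch: slice out one page of the result set.
    An empty slice signals that the last page has been passed, so nothing
    is left in the cache."""
    start = page_index * page_size
    return rows[start:start + page_size]
```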
In summary, the method mainly comprises the following steps. Based on the observation that the [next page] button is clicked with high probability, after the server retrieves the corresponding data set according to the screening condition, it automatically retrieves the next page of data in the background and places it in a cache. When the client device requests the next page, the cached data set is returned immediately, and the retrieval-and-cache operation is performed again for the following page, and so on. When the screening condition changes, the current page is queried afresh and the next page is cached anew.
In this process, when the client device requests and renders the next page of data, the server already holds the cached data, so the response speed is greatly improved and a good interactive experience on the client device is ensured. At the same time, the true (server-side) paging mode relieves the browser's resource usage, and running the paging task in the server background also alleviates the server's resource occupation to a certain extent.
While the basic principles of the present disclosure have been described above in connection with specific embodiments, it should be noted that all or any steps or portions of the methods and systems of the present disclosure can be implemented in hardware, firmware, software, or combinations thereof in any computing device (including processors, storage media, etc.) or network of computing devices, as would be apparent to one of ordinary skill in the art upon reading the present disclosure.
Thus, the objects of the present disclosure may also be achieved by running a program or set of programs on any computing device. The computing device may be a well-known general purpose device. Thus, the objects of the present disclosure may also be achieved by simply providing a program product containing program code for implementing the method or system. That is, such a program product also constitutes the present disclosure, and a storage medium storing such a program product also constitutes the present disclosure. It is apparent that the storage medium may be any known storage medium or any storage medium developed in the future.
It should also be noted that in the systems and methods of the present disclosure, it is apparent that portions or steps may be split and/or recombined. Such decomposition and/or recombination should be considered equivalent to the present disclosure. The steps of executing the series of processes may naturally be executed in chronological order in the order described, but are not necessarily executed in chronological order. Some steps may be performed in parallel or independently of each other.
The above detailed description should not be taken as limiting the scope of the present disclosure. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives can occur depending upon design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present disclosure are intended to be included within the scope of the present disclosure.

Claims (10)

1. A list paging quick response system deployed in a server, comprising:
the screening component is used for screening the current paging request sent by the client device and judging whether the screening condition of the current paging request is the same as the screening condition of the previous paging request or not;
the querying component queries a required data set based on the screening condition of the current paging request when the screening component judges that the screening condition of the current paging request is different from the screening condition of the previous paging request;
the background paging query component directly executes the next paging query, based on the same screening condition as that of the current paging request, after the screening component judges that the screening condition of the current paging request is the same as the screening condition of the previous paging request, or after the querying component queries the required data set based on the screening condition of the current paging request;
the paging cache unit caches the data set queried by the background paging query component; and
the list paging sending component directly reads the next page from the paging cache unit and feeds it back to the client device that sent the current paging request when the screening component judges that the screening condition of the current paging request is the same as the screening condition of the previous paging request.
2. The list paging quick response system of claim 1, wherein the filtering component determines that the filtering condition of the current paging request is different from the filtering condition of the previous paging request when the filtering condition of the previous paging request is blank.
3. The list paging quick response system of claim 1, further comprising: a data amount accumulating component that accumulates the amount of data queried into the paging cache unit by the background paging query component each time, the server emptying the paging cache unit when the accumulated amount of data is equal to the total amount of the queried object data set.
4. The list paging quick response system of claim 1, wherein the filtering component saves the filtering condition of the previous paging request as the filtering condition of the current paging request when it is determined that the filtering condition of the current paging request is different from the filtering condition of the previous paging request.
5. The list paging quick response system of claim 1, wherein each paging request further comprises a list paging display setting in which an amount of data that can be presented per list paging is set.
6. A list paging quick response method comprises the following steps:
screening, by a screening component, a current paging request sent by a client device, and judging whether the screening condition of the current paging request is the same as the screening condition of a previous paging request;
when the screening component judges that the screening condition of the current paging request is different from the screening condition of the previous paging request, the query component queries the required data set based on the screening condition of the current paging request;
when the screening component judges that the screening condition of the current paging request is the same as the screening condition of the previous paging request, or after the querying component queries the required data set based on the screening condition of the current paging request, directly executing, by a background paging query component, the next paging query based on the same screening condition as that of the current paging request;
caching the data set queried by the background paging query component through a paging caching unit; and
when the screening component judges that the screening condition of the current paging request is the same as the screening condition of the previous paging request, directly reading, by a list paging sending component, the next page from the paging cache unit and feeding it back to the client device that sent the current paging request.
7. The method of claim 6, wherein the filtering component determines that the filtering condition of the current page request is different from the filtering condition of the previous page request when the filtering condition of the previous page request is blank.
8. The list paging quick response method as claimed in claim 6, further comprising:
accumulating, by a data amount accumulating component, the amount of data queried into the paging cache unit by the background paging query component each time, and notifying the server to empty the paging cache unit when the accumulated amount of data is equal to the total amount of the queried object data set.
9. The method of claim 6, wherein the filtering component saves the filtering condition of the previous page request as the filtering condition of the current page request when it determines that the filtering condition of the current page request is different from the filtering condition of the previous page request.
10. The list page quick response method of claim 6, wherein each page request further comprises a list page display setting in which an amount of data that can be presented per list page is set.
CN202010256172.0A 2020-04-02 2020-04-02 List paging quick response system and method Active CN111488370B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010256172.0A CN111488370B (en) 2020-04-02 2020-04-02 List paging quick response system and method


Publications (2)

Publication Number Publication Date
CN111488370A CN111488370A (en) 2020-08-04
CN111488370B true CN111488370B (en) 2023-09-12

Family

ID=71791675

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010256172.0A Active CN111488370B (en) 2020-04-02 2020-04-02 List paging quick response system and method

Country Status (1)

Country Link
CN (1) CN111488370B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112699147A (en) * 2020-12-31 2021-04-23 京东数字科技控股股份有限公司 Paging query method, device, equipment and storage medium

Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5721956A (en) * 1995-05-15 1998-02-24 Lucent Technologies Inc. Method and apparatus for selective buffering of pages to provide continuous media data to multiple users
US7313656B1 (en) * 2004-12-27 2007-12-25 Emc Corporation Pre-fetch prediction method for disk drives
CN101848231A (en) * 2010-03-08 2010-09-29 深圳市同洲电子股份有限公司 Method and system for data transmission
CN101860449A (en) * 2009-04-09 2010-10-13 华为技术有限公司 Data query method, device and system
CN102222086A (en) * 2011-05-18 2011-10-19 广州市动景计算机科技有限公司 Webpage viewing method and webpage viewing device based on mobile terminal as well as mobile terminal
CN102880685A (en) * 2012-09-13 2013-01-16 北京航空航天大学 Method for interval and paging query of time-intensive B/S (Browser/Server) with large data size
CN103810173A (en) * 2012-11-06 2014-05-21 深圳市金蝶中间件有限公司 Paging data processing method and system
CN103995807A (en) * 2013-02-16 2014-08-20 长沙中兴软创软件有限公司 Massive data query and secondary processing method based on Web architecture
CN104067215A (en) * 2012-01-25 2014-09-24 微软公司 Presenting data driven forms
CN105045932A (en) * 2015-09-02 2015-11-11 南京邮电大学 Data paging inquiry method based on descending order storage
CN105468644A (en) * 2014-09-10 2016-04-06 阿里巴巴集团控股有限公司 Method and device for performing query in database
CN107016045A (en) * 2017-02-17 2017-08-04 阿里巴巴集团控股有限公司 A kind of method and device of paged data inquiry
WO2017146875A1 (en) * 2016-02-26 2017-08-31 Honeywell International Inc. System and method for smart event paging
CN107291718A (en) * 2016-03-30 2017-10-24 阿里巴巴集团控股有限公司 Page resource put-on method and device
CN108228663A (en) * 2016-12-21 2018-06-29 杭州海康威视数字技术股份有限公司 A kind of paging search method and device
CN108984623A (en) * 2018-06-14 2018-12-11 东软集团股份有限公司 Data query conditions generation method, device, storage medium and electronic equipment
CN109766487A (en) * 2018-12-26 2019-05-17 郑州云海信息技术有限公司 The method and device of page access anticipation is carried out based on middleware
CN109947827A (en) * 2018-12-27 2019-06-28 航天信息股份有限公司 A kind of response method and device of inquiry operation
CN110674369A (en) * 2019-09-23 2020-01-10 杭州迪普科技股份有限公司 Data query method and device
CN110737857A (en) * 2019-09-11 2020-01-31 苏州浪潮智能科技有限公司 back-end paging acceleration method, system, terminal and storage medium
CN110928900A (en) * 2018-09-17 2020-03-27 马上消费金融股份有限公司 Multi-table data query method, device, terminal and computer storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6925594B2 (en) * 2001-02-28 2005-08-02 International Business Machines Corporation Saving selected hyperlinks for retrieval of the hyperlinked documents upon selection of a finished reading button in a web browser
US7659905B2 (en) * 2006-02-22 2010-02-09 Ebay Inc. Method and system to pre-fetch data in a network
US9703829B2 (en) * 2011-12-26 2017-07-11 Hitachi, Ltd. Database system and database management method
US9195601B2 (en) * 2012-11-26 2015-11-24 International Business Machines Corporation Selective release-behind of pages based on repaging history in an information handling system
US10482083B2 (en) * 2015-10-07 2019-11-19 Capital One Services, Llc Automated sequential site navigation
US10817219B2 (en) * 2018-09-12 2020-10-27 Apple Inc. Memory access scheduling using a linked list


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Optimizing Web Data Query Performance Based on ASP.NET Caching and Paging Strategies; Dong Yihua; Computer Era (Issue 09); full text *

Also Published As

Publication number Publication date
CN111488370A (en) 2020-08-04

Similar Documents

Publication Publication Date Title
CN106844740B (en) Data pre-reading method based on memory object cache system
US9948531B2 (en) Predictive prefetching to reduce document generation times
CN109145020A (en) Information query method, from server, client and computer readable storage medium
EP3742306A1 (en) Data query method, apparatus and device
US20120060083A1 (en) Method for Use in Association With A Multi-Tab Interpretation and Rendering Function
CN111782692B (en) Frequency control method and device
CN102957712A (en) Method and system for loading website resources
CN109167840B (en) Task pushing method, node autonomous server and edge cache server
CN102307234A (en) Resource retrieval method based on mobile terminal
US20150379143A1 (en) Method and system for preparing website data in response to a webpage request
US11032394B1 (en) Caching techniques
CN106815260A (en) A kind of index establishing method and equipment
CN111488370B (en) List paging quick response system and method
CN110032578B (en) Mass data query caching method and device
CN111913917A (en) File processing method, device, equipment and medium
US9934068B2 (en) Data analysis system
CN110784498B (en) Personalized data disaster tolerance method and device
CN107181773A (en) Data storage and data managing method, the equipment of distributed memory system
CN110928900A (en) Multi-table data query method, device, terminal and computer storage medium
CN112541119A (en) Efficient and energy-saving small recommendation system
CN110222046B (en) List data processing method, device, server and storage medium
CN113806651A (en) Data caching method, device, server and storage medium
US20020007394A1 Retrieving and processing stored information using a distributed network of remote computers
AU2001250169A1 (en) Retrieving and processing stored information using a distributed network of remote computers
CN112488803A (en) Favorite storage access method and device, equipment and medium thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant