CN111143414A - Feedback method and device of cache data, electronic equipment and storage medium - Google Patents

Feedback method and device of cache data, electronic equipment and storage medium

Info

Publication number
CN111143414A
CN111143414A (application number CN201911366361.7A)
Authority
CN
China
Prior art keywords
data
feedback
cache
client
sequence
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911366361.7A
Other languages
Chinese (zh)
Inventor
付睿
戚海洋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuba Co Ltd
Original Assignee
Wuba Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuba Co Ltd filed Critical Wuba Co Ltd
Priority to CN201911366361.7A
Publication of CN111143414A
Legal status: Pending (current)

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20 - Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F 16/24 - Querying
    • G06F 16/245 - Query processing
    • G06F 16/2455 - Query execution
    • G06F 16/24552 - Database cache management

Landscapes

  • Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Information Transfer Between Computers (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The application provides a feedback method and device of cache data, electronic equipment and a storage medium. A server receives a data acquisition request sent by a client and calculates the difference between the request time and the time at which the client first sent the data acquisition request to obtain a request time interval. If the time interval is greater than the validity period of the cache data and less than the expiration period of the cache data, the server first sends first feedback data to the client and then sends second feedback data, where the second feedback data is the updated cache data. By returning the existing cache data, namely the first feedback data, first, the feedback method provided by the application bridges the time needed to update the data, so that the user obtains data without interruption and data-display waiting is avoided.

Description

Feedback method and device of cache data, electronic equipment and storage medium
Technical Field
The present application relates to the field of data processing, and in particular, to a feedback method and apparatus for cache data, an electronic device, and a storage medium.
Background
When using an application program, a user continuously obtains corresponding display data from the application's data storage medium as needed. For example, to obtain road-condition display information, the user sends a road-condition display information acquisition request to the application, and the application then obtains the corresponding map data, vehicle data and the like from its database according to the request. However, if the data for every request is obtained from the database, the number of database interactions increases, which affects the stability of the database. For highly concurrent data requests in particular, obtaining the relevant data from the database greatly increases the burden on the database and reduces data-acquisition efficiency.
To alleviate this problem, part of the previously used data is stored in a cache library, and when a user initiates a data request again, the data is obtained by accessing the cache library, which relieves the pressure on the database to some extent. However, because data updates start from the database, always obtaining data from the cache library easily leads to the acquired data not being the latest data. Conversely, if data is obtained from the cache only after the cache has been updated, the request has to wait, which reduces data-display efficiency and degrades the user experience.
Disclosure of Invention
The application provides a feedback method and device of cache data, electronic equipment and a storage medium, so as to avoid the problem of data display waiting.
In a first aspect, the present application provides a feedback method for cached data, where the method includes:
receiving a data acquisition request sent by a client, wherein the data acquisition request comprises request time;
calculating the difference value between the request time and the time when the client sends the data acquisition request for the first time to obtain a request time interval;
if the time interval is greater than the validity period of the cache data and less than the expiration period of the cache data, sending first feedback data and second feedback data to the client, wherein the first feedback data is the cache data, the second feedback data is updated cache data, the second feedback data is sent after the first feedback data, the sending time interval between the second feedback data and the first feedback data is less than or equal to the display time of the first feedback data at the client, the validity period of the cache data is the time limit after which the cache data becomes invalid, and the expiration period of the cache data is the time limit after which the cache data is cleared.
In a possible implementation manner of the first aspect of the embodiment of the present invention, the sending, to the client, the first feedback data and the second feedback data includes:
acquiring target data from a database according to the data acquisition request, wherein the target data is data matched with the data acquisition request in the database;
replacing the cache data with the target data to obtain updated cache data;
and sending the updated cache data as second feedback data to the client.
In a possible implementation manner of the first aspect of the embodiment of the present invention, the method further includes:
and if the time interval is less than or equal to the validity period of the cache data, sending the first feedback data to the client.
In a possible implementation manner of the first aspect of the embodiment of the present invention, the method further includes:
if the time interval is greater than or equal to the expiration period of the cache data, sorting preselected data in a database to obtain a first sequence, wherein the preselected data are data in the database which meet preset preliminary screening conditions;
and sequentially traversing each piece of preselected data in the first sequence in combination with the data acquisition request to determine third feedback data, wherein the third feedback data is the preselected data corresponding to the data acquisition request.
In a possible implementation manner of the first aspect of the embodiment of the present invention, the sorting the preselected data in the database to obtain the first sequence includes:
calculating the weight corresponding to each preset sequencing influence dimension by utilizing a sequencing algorithm in combination with the preset sequencing influence dimension;
calculating the average of the products of each piece of sub-data in each piece of preselected data and the corresponding weight to obtain the score of each piece of preselected data;
by ranking the scores, a first sequence consisting of the preselected data is obtained.
In a possible implementation manner of the first aspect of the embodiment of the present invention, after obtaining the first sequence, the method includes:
correspondingly generating an index from the first sequence, wherein the index and the first sequence have a one-to-one correspondence relationship;
and putting the index into a cache for a data processor to determine the corresponding first sequence according to the index.
In a possible implementation manner of the first aspect of the embodiment of the present invention, the method further includes:
and automatically creating a second sequence for an active client to use, with the expiration period of the cache data as the cycle, wherein the active client is a client whose frequency of sending the data acquisition request is greater than or equal to a preset frequency.
In a possible implementation manner of the first aspect of the embodiment of the present invention, before the calculating of the difference between the request time and the time at which the client first sends the data acquisition request, the method includes:
acquiring the total number of to-be-processed requests corresponding to the request moment, wherein the to-be-processed requests are data acquisition requests sent by each client;
and if the total number is greater than or equal to a preset number threshold, sending preset feedback data to each client, and processing each request to be processed one by one.
In a second aspect, the present application provides a feedback apparatus for caching data, the apparatus including:
the data acquisition request receiving module is used for receiving a data acquisition request sent by a client, wherein the data acquisition request comprises request time;
a difference value calculating module, configured to calculate a difference value between the request time and a time at which the client first sends the data acquisition request, so as to obtain a request time interval;
a feedback data sending module, configured to send first feedback data and second feedback data to the client if the time interval is greater than the validity period of the cache data and less than the expiration period of the cache data, where the first feedback data is the cache data, the second feedback data is updated cache data, the second feedback data is sent after the first feedback data, the sending time interval between the second feedback data and the first feedback data is less than or equal to the display time of the first feedback data at the client, the validity period of the cache data is the time limit after which the cache data becomes invalid, and the expiration period of the cache data is the time limit after which the cache data is cleared.
In a possible implementation manner of the second aspect of the embodiment of the present invention, the feedback data sending module includes:
the target data acquisition module is used for acquiring target data from a database according to the data acquisition request, wherein the target data is data matched with the data acquisition request in the database;
the replacing module is used for replacing the cache data by using the target data to obtain updated cache data;
and the second feedback data sending module is used for sending the updated cache data serving as second feedback data to the client.
In a possible implementation manner of the second aspect of the embodiment of the present invention, the apparatus further includes:
and the first feedback data sending module is used for sending the first feedback data to the client if the time interval is less than or equal to the validity period of the cache data.
In a possible implementation manner of the second aspect of the embodiment of the present invention, the apparatus further includes:
the first sorting module is used for sorting preselected data in a database to obtain a first sequence if the time interval is greater than or equal to the expiration period of the cache data, wherein the preselected data are data in the database which meet preset preliminary screening conditions;
and the third feedback data determining module is used for sequentially traversing each piece of preselected data in the first sequence in combination with the data acquisition request to determine third feedback data, wherein the third feedback data is the preselected data corresponding to the data acquisition request.
In a possible implementation manner of the second aspect of the embodiment of the present invention, the first sorting module includes:
the weight calculation module is used for calculating the weight corresponding to each preset sequencing influence dimension by utilizing a sequencing algorithm in combination with the preset sequencing influence dimension;
the summing-and-averaging calculation module is used for calculating the average of the products of each piece of sub-data in each piece of preselected data and the corresponding weight to obtain the score of each piece of preselected data;
and the first sequence acquisition module is used for obtaining a first sequence consisting of the preselected data by arranging the scores.
In a possible implementation manner of the second aspect of the embodiment of the present invention, the apparatus further includes:
the index generating module is used for correspondingly generating an index from the first sequence, and the index and the first sequence have a one-to-one correspondence relationship;
and the index cache module is used for putting the index into a cache so that the data processor determines the corresponding first sequence according to the index.
In a possible implementation manner of the second aspect of the embodiment of the present invention, the apparatus further includes:
and the second sequence creating module is used for automatically creating a second sequence for an active client to use, with the expiration period of the cache data as the cycle, wherein the active client is a client whose frequency of sending the data acquisition request is greater than or equal to a preset frequency.
In a possible implementation manner of the second aspect of the embodiment of the present invention, the apparatus further includes:
a request quantity obtaining module, configured to obtain a total quantity of to-be-processed requests corresponding to the request time, where the to-be-processed requests are data obtaining requests sent by each client;
and the preset feedback data sending module is used for sending preset feedback data to each client and processing each request to be processed one by one if the total number is greater than or equal to a preset number threshold.
In a third aspect, the present application provides an electronic device, comprising:
a processor, and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform any of the cached data feedback methods via execution of the executable instructions.
In a fourth aspect, the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements any of the above-described methods for feedback of cached data.
According to the above technical solutions, the application provides a feedback method and device of cache data, an electronic device and a storage medium. A server receives a data acquisition request sent by a client and calculates the difference between the request time and the time at which the client first sent the data acquisition request to obtain a request time interval. If the time interval is greater than the validity period of the cache data and less than the expiration period of the cache data, the server first sends first feedback data to the client and then sends second feedback data, where the second feedback data is the updated cache data. By returning the existing cache data, namely the first feedback data, first, the feedback method of the cache data provided by the application bridges the time needed to update the data, so that the user obtains data without interruption and data-display waiting is avoided.
Drawings
In order to describe the technical solutions of the present application more clearly, the drawings required by the embodiments are briefly introduced below. Those skilled in the art can derive other drawings from these drawings without creative effort.
Fig. 1 is a flowchart of a method for displaying cache data according to an embodiment of the present disclosure;
fig. 2 is a flowchart of a method for sending second feedback data according to an embodiment of the present application;
fig. 3 is a flowchart of a method for determining third feedback data according to an embodiment of the present application;
fig. 4 is a flowchart of a method for obtaining a first sequence according to an embodiment of the present application;
fig. 5 is a flowchart of an index generation method according to an embodiment of the present application;
fig. 6 is a flowchart of a method for requesting wait processing according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of a first embodiment of a feedback apparatus for buffering data according to an embodiment of the present invention;
fig. 8 is a schematic structural diagram of a second embodiment of a feedback apparatus for buffering data according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of a third embodiment of a feedback apparatus for buffering data according to an embodiment of the present invention;
fig. 10 is a schematic structural diagram of a fourth embodiment of a feedback apparatus for buffering data according to the present invention;
fig. 11 is a schematic structural diagram of a fifth embodiment of a feedback apparatus for buffering data according to an embodiment of the present invention;
fig. 12 is a schematic structural diagram of a sixth embodiment of a feedback apparatus for buffering data according to an embodiment of the present invention;
fig. 13 is a schematic structural diagram of a seventh embodiment of a feedback apparatus for buffering data according to an embodiment of the present invention;
fig. 14 is a schematic structural diagram of an eighth embodiment of a feedback apparatus for buffering data according to an embodiment of the present invention;
fig. 15 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely with reference to the accompanying drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Fig. 1 is a flowchart of a method for displaying cache data according to an embodiment of the present application, where as shown in fig. 1, the method includes:
s1, receiving a data acquisition request sent by the client, wherein the data acquisition request comprises a request time.
The server corresponding to the application program receives data acquisition requests generated by user operations at the client. Generally, a data acquisition request at least includes the request time, i.e., the time at which the request was generated, and may also include the specific requested content, user information, device information, and the like.
And S2, calculating the difference between the request time and the time when the client sends the data acquisition request for the first time to obtain the request time interval.
As described in the background, data can be obtained from the cache library only if related cache data already exists there, which means the data acquisition request is normally at least the second such request. The request may of course be the first data acquisition request, but in that case no relevant data is likely to be found in the cache library.
When the client sends a data acquisition request for the first time, the cache library does not yet contain data corresponding to the request, so the corresponding data must be obtained from the server's database and then placed into the cache library for subsequent requests. Taking the currently generated request as the current data acquisition request, there is necessarily a time difference between it and the first data acquisition request, i.e., a difference between the request times, which is the request time interval referred to in the embodiments of the present application.
S3, if the time interval is greater than the validity period of the cache data and less than the expiration period of the cache data, sending first feedback data and second feedback data to the client, where the first feedback data is the cache data, the second feedback data is the updated cache data, the second feedback data is sent after the first feedback data, the sending time interval between the second feedback data and the first feedback data is less than or equal to the display time of the first feedback data at the client, the validity period of the cache data is the time limit after which the cache data becomes invalid, and the expiration period of the cache data is the time limit after which the cache data is cleared.
In general, to ensure the timeliness of cached data, the data in the cache library must be deleted and updated. To this end, a validity period and an expiration period are set for the cache data; both periods start at the time when the data acquisition request was first generated. If the time interval is greater than the validity period and less than the expiration period, the current request was initiated before the cache data's deletion time, i.e., the cache data still exists in the cache library; however, because the validity period has been exceeded, the cache data needs to be updated, otherwise its timeliness cannot be guaranteed, so the feedback data finally sent to the client should be the updated cache data. Updating the cache data takes some time, and if no feedback is given to the client during that time, the user has to wait, which degrades the user experience. To solve this problem, in the embodiments of the present application the first feedback data, i.e., the existing cache data, is sent to the client first. This data exists in the cache library and corresponds to the data acquisition request; it merely lacks some timeliness, but feeding it back to the client first avoids waiting and still has a certain reference value for the user. While the user browses the first feedback data, the server updates the cache library from the database, obtains the second feedback data corresponding to the data acquisition request from the cache library, and feeds it back to the client so that the user can browse the up-to-date feedback data.
It should be noted that the sending interval between the second feedback data and the first feedback data is less than or equal to the display time of the first feedback data at the client, so no data blank occurs and the user can browse data at all times.
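The branching described above can be summarized in a short sketch. The following Python code is illustrative only and not part of the patent; CacheEntry, send_to_client and update_cache are hypothetical names introduced here, and the periods are expressed in seconds.

```python
import threading

class CacheEntry:
    """Cached data plus the timing metadata used to judge validity and expiration."""
    def __init__(self, data, first_request_time, validity_period, expiration_period):
        self.data = data
        self.first_request_time = first_request_time  # time of the first data acquisition request
        self.validity_period = validity_period        # seconds after which the data becomes invalid
        self.expiration_period = expiration_period    # seconds after which the data is cleared

def handle_request(entry, request_time, request, send_to_client, update_cache):
    """Branch on the request time interval, as in step S3."""
    interval = request_time - entry.first_request_time
    if interval <= entry.validity_period:
        # Cache is still valid: return it directly as the first feedback data.
        send_to_client(entry.data)
    elif interval < entry.expiration_period:
        # Stale but not yet cleared: send the existing cache immediately (first feedback data),
        # then update it in the background and send the result as the second feedback data
        # before the first feedback data's display time at the client elapses.
        send_to_client(entry.data)
        threading.Thread(target=lambda: send_to_client(update_cache(entry, request))).start()
    else:
        # Expired and cleared: fetch fresh data from the database; the patent determines the
        # third feedback data here via the first sequence (simplified to a direct update in this sketch).
        send_to_client(update_cache(entry, request))
```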
Further, as shown in fig. 2, a flowchart of a method for sending second feedback data provided in the embodiment of the present application is provided, where the method includes:
s211, acquiring target data from a database according to the data acquisition request, wherein the target data are data matched with the data acquisition request in the database;
s212, replacing the cache data with the target data to obtain updated cache data;
and S213, sending the updated cache data serving as second feedback data to the client.
As described above, the second feedback data is the updated cache data, and the cache data is updated from the database, so the second feedback data is also data obtained from the database. To match the data acquisition request, this data must be the target data in the database that matches the request. The invalid cache data in the cache library is then replaced with the acquired target data, so that the updated cache data can be sent to the client as the second feedback data; the client therefore uses the latest data, and the timeliness of the data obtained by the client is guaranteed.
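A minimal sketch of steps S211-S213 under the same assumptions as above; query_database is a hypothetical stand-in for the real database query and is not defined by the patent.

```python
import time

def query_database(request):
    """Hypothetical stand-in for the real database query; returns the target data matching the request."""
    return {"key": request["key"], "payload": "fresh data", "fetched_at": time.time()}

def update_cache(entry, request):
    """S211-S213: fetch matching target data, replace the stale cache data, return the update."""
    target_data = query_database(request)  # S211: target data matching the data acquisition request
    entry.data = target_data               # S212: replace the cache data to obtain updated cache data
    return entry.data                      # S213: sent to the client as the second feedback data
```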
In one implementation, if the time interval is less than or equal to the validity period of the cached data, the first feedback data is sent to the client.
In this case the cache data in the cache library is still timely and meets the client's requirement for data freshness, so the first feedback data, namely the cache data, can be sent directly to the client for display.
In one implementation manner, as shown in fig. 3, a flowchart of a method for determining third feedback data provided in an embodiment of the present application is provided, where the method includes:
s311, if the time interval is greater than or equal to the expiration date of the cache data, sequencing preselected data in a database to obtain a first sequence, wherein the preselected data are data which accord with preset preliminary screening conditions in the database;
s312, sequentially browsing each preselected data in the first sequence in combination with the data obtaining request to determine a third feedback data, where the third feedback data is the preselected data corresponding to the data obtaining request.
If the time interval is greater than or equal to the expiration date of the cache data, it indicates that the cache data needs to be deleted, at this time, the cache data corresponding to the data acquisition request cannot be acquired from the cache library, and at this time, corresponding feedback data needs to be acquired from the database according to the data acquisition request. At the moment, all data in the database are screened, and data which are not consistent with preset preliminary screening conditions are removed, so that preselected data are obtained, the range of the data to be inquired can be effectively reduced, and the efficiency of subsequently acquiring the data is improved. These preselected data are then sorted to obtain a first sequence. At this time, according to the arrangement sequence of each preselected data in the first sequence, each preselected data is matched, so that third feedback data matched with the data acquisition request is determined, each preselected data is sequentially matched according to the first sequence, data loss can be avoided, repeated matching of data can also be avoided, and the data matching efficiency is ensured.
Further, as shown in fig. 4, a flowchart of a method for obtaining a first sequence provided in an embodiment of the present application is provided, where the method includes:
s3111, combining preset ordering influence dimensions, and calculating weights corresponding to the preset ordering influence dimensions by using an ordering algorithm;
s3112, calculating an addition and an average value of products of each subdata in each piece of preselected data and the corresponding weight to obtain a score of each piece of preselected data;
s3113, obtaining a first sequence consisting of the preselected data by arranging the scores.
The ordering of the data can be influenced by the weights, i.e., data ranked earlier correspond to higher weights than data ranked later. Usually, a weight is calculated for each preset ordering-influence dimension, such as distance or cost. Each piece of preselected data usually consists of several pieces of sub-data; order data, for example, includes sub-data such as distance, unit price, fuel consumption and waiting time. Each piece of sub-data corresponds to a weight, and the score of the preselected data is obtained by multiplying each piece of sub-data by its weight and then averaging the products. The preselected data can then be ranked according to their scores to obtain the first sequence. The method for generating the first sequence provided by the embodiments of the present application can therefore comprehensively consider the weight factors that influence the ordering, so that the arrangement of the preselected data in the first sequence can be adjusted by adjusting the weight values, producing a first sequence whose order suits the data matching.
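A minimal sketch of S3111-S3113, assuming the per-dimension weights are already available (the patent obtains them from a ranking algorithm, which is not reproduced here); the dimension names and weight values below are made up for illustration.

```python
def score(item, weights):
    """S3112: average of each piece of sub-data multiplied by its dimension weight."""
    products = [value * weights[dimension] for dimension, value in item.items()]
    return sum(products) / len(products)

def build_first_sequence(preselected_data, weights):
    """S3113: rank the preselected data by score to form the first sequence."""
    return sorted(preselected_data, key=lambda item: score(item, weights), reverse=True)

# Hypothetical order data; sub-data dimensions and weight values are illustrative only.
weights = {"distance": 0.4, "unit_price": 0.3, "fuel": 0.2, "wait": 0.1}
orders = [
    {"distance": 2.0, "unit_price": 3.5, "fuel": 1.0, "wait": 5.0},
    {"distance": 1.0, "unit_price": 4.0, "fuel": 0.8, "wait": 2.0},
]
first_sequence = build_first_sequence(orders, weights)
```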
Further, as shown in fig. 5, a flowchart of an index generating method provided in an embodiment of the present application is shown, where the method includes:
s411, correspondingly generating an index from the first sequence, wherein the index and the first sequence have a one-to-one correspondence relationship;
s412, the index is placed into a cache, so that the data processor determines the corresponding first sequence according to the index.
Because the first sequence contains a large amount of preselected data, putting it directly into the cache library would occupy a large amount of memory and reduce the available space of the cache library. Instead, an index is generated for the first sequence and placed in the cache library; the corresponding first sequence can then be determined through the index, and since the index is far smaller than the first sequence, the occupation of the cache library is effectively reduced. Meanwhile, multiple first sequences can be generated according to actual requirements, and correspondingly the cache library can hold multiple indexes at the same time to locate the different first sequences.
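A minimal sketch of S411-S412 under stated assumptions: an in-memory dict stands in for the cache library, the full first sequence lives in a separate store, and the index is derived with a hash so that index and sequence correspond one to one.

```python
import hashlib
import json

sequence_store = {}  # hypothetical store for full first sequences, keyed by index
cache_library = {}   # hypothetical cache library; holds only the lightweight indexes

def cache_sequence_index(first_sequence, request_key):
    """S411-S412: generate an index for the first sequence and put only the index in the cache."""
    index = hashlib.md5(json.dumps(first_sequence, sort_keys=True).encode()).hexdigest()
    sequence_store[index] = first_sequence  # the bulky sequence stays outside the cache library
    cache_library[request_key] = index      # the cache holds the index, not the sequence itself
    return index

def resolve_first_sequence(request_key):
    """The data processor reads the index from the cache and resolves the corresponding first sequence."""
    index = cache_library.get(request_key)
    return sequence_store.get(index) if index is not None else None
```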
In one implementation manner, a second sequence is automatically created for an active client to use, with the expiration period of the cached data as the cycle, where the active client is a client that sends the data acquisition request with a frequency greater than or equal to a preset frequency.
Some active clients initiate data acquisition requests periodically. If the cache data were updated only after each such request was initiated, data-acquisition waiting could occur and affect the user experience. Therefore, a prejudgment is made for active clients: the cache data related to their data acquisition requests is updated periodically, i.e., the second sequence is created automatically, so that when an active client initiates a data acquisition request, the latest cache data can be found in the second sequence of the cache library at any time. The second sequence may be the same as or different from the first sequence.
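One possible way to drive this periodic pre-building is a timer whose cycle is the expiration period; the sketch below is an assumption about the mechanism rather than the patent's implementation, and build_sequence_for, request_counts and second_sequences are hypothetical names.

```python
import threading

def schedule_second_sequences(client_ids, expiration_period, request_counts, preset_frequency,
                              build_sequence_for, second_sequences):
    """Rebuild the second sequence for every active client once per expiration period."""
    def refresh():
        for client_id in client_ids:
            # An active client sends data acquisition requests at least as often as the preset frequency.
            if request_counts.get(client_id, 0) >= preset_frequency:
                second_sequences[client_id] = build_sequence_for(client_id)
        # Re-arm the timer so the refresh repeats, using the expiration period as the cycle.
        threading.Timer(expiration_period, refresh).start()
    refresh()
```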
In one implementation manner, as shown in fig. 6, there is provided a flowchart of a method for requesting a wait processing according to an embodiment of the present application, where the method includes:
s511, acquiring the total number of the to-be-processed requests corresponding to the request time, wherein the to-be-processed requests are data acquisition requests sent by each client;
and S512, if the total number is greater than or equal to a preset number threshold, sending preset feedback data to each client, and processing each request to be processed one by one.
For highly concurrent data acquisition requests, once the total number of pending requests at the request time exceeds a preset number threshold, the server's processing capacity for data acquisition requests is saturated, and continuing to process them would cause exceptions and errors. In this case fusing (circuit breaking) is applied, i.e., the excess data acquisition requests are not processed immediately. However, if no data were fed back to the client, the user experience would suffer, so preset feedback data is sent to the client instead; this feedback does not occupy the server's processing resources, yet the user still perceives a response, which preserves the experience. Meanwhile, the pending requests that cannot be processed at the current moment are deferred and processed one by one rather than discarded, so that the client's data acquisition requests are still handled effectively.
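A minimal sketch of S511-S512, assuming each pending request records the client it came from; send_to_client and process_request are assumed callables, and the preset feedback data is whatever lightweight placeholder the server chooses.

```python
from collections import deque

def handle_pending_requests(pending_requests, threshold, preset_feedback,
                            send_to_client, process_request):
    """S511-S512: fuse when the pending-request count reaches the threshold, then drain one by one."""
    if len(pending_requests) >= threshold:
        for request in pending_requests:
            # Cheap preset feedback so every client sees a response without loading the server.
            send_to_client(request["client_id"], preset_feedback)
        queue = deque(pending_requests)
        while queue:
            # Deferred one-by-one processing instead of discarding the requests.
            process_request(queue.popleft())
    else:
        for request in pending_requests:
            process_request(request)
```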
Fig. 7 is a schematic structural diagram of a first embodiment of a feedback apparatus for buffering data according to an embodiment of the present invention, where the apparatus includes: a data acquisition request receiving module 1, configured to receive a data acquisition request sent by a client, where the data acquisition request includes a request time; a difference value calculating module 2, configured to calculate a difference value between the request time and the time at which the client first sends the data acquisition request, so as to obtain a request time interval; and a feedback data sending module 3, configured to send first feedback data and second feedback data to the client if the time interval is greater than the validity period of the cache data and less than the expiration period of the cache data, where the first feedback data is the cache data, the second feedback data is updated cache data, the second feedback data is sent after the first feedback data, the sending time interval between the second feedback data and the first feedback data is less than or equal to the display time of the first feedback data at the client, the validity period of the cache data is the time limit after which the cache data becomes invalid, and the expiration period of the cache data is the time limit after which the cache data is cleared.
Fig. 8 is a schematic structural diagram of a second embodiment of a feedback apparatus for buffering data according to an embodiment of the present invention, where the feedback data sending module 3 includes: a target data obtaining module 31, configured to obtain target data from a database according to the data obtaining request, where the target data is data in the database that matches the data obtaining request; a replacing module 32, configured to replace the cache data with the target data to obtain updated cache data; and a second feedback data sending module 33, configured to send the updated cache data to the client as second feedback data.
Fig. 9 is a schematic structural diagram of a third embodiment of a feedback apparatus for cached data according to an embodiment of the present invention, where the apparatus further includes: a first feedback data sending module 4, configured to send the first feedback data to the client if the time interval is less than or equal to the validity period of the cached data.
Fig. 10 is a schematic structural diagram of a fourth embodiment of a feedback apparatus for cached data according to an embodiment of the present invention, where the apparatus further includes: the first sorting module 5 is configured to sort preselected data in a database to obtain a first sequence if the time interval is greater than or equal to the expiration period of the cached data, where the preselected data are data in the database that meet preset preliminary screening conditions; and a third feedback data determining module 6, configured to sequentially traverse each piece of preselected data in the first sequence in combination with the data acquisition request to determine third feedback data, where the third feedback data is the preselected data corresponding to the data acquisition request.
Fig. 11 is a schematic structural diagram of a fifth embodiment of a feedback apparatus for cache data according to an embodiment of the present invention, where the first sorting module 5 includes: the weight calculation module 51 is configured to calculate, by using a ranking algorithm, the weight corresponding to each preset ranking influence dimension in combination with the preset ranking influence dimensions; the summing-and-averaging calculation module 52 is configured to calculate the average of the products of each piece of sub-data in each piece of preselected data and the corresponding weight to obtain the score of each piece of preselected data; and a first sequence obtaining module 53, configured to obtain the first sequence composed of the preselected data by ranking the scores.
Fig. 12 is a schematic structural diagram of a sixth embodiment of a feedback apparatus for cached data according to an embodiment of the present invention, where the apparatus further includes: an index generating module 7, configured to generate an index corresponding to the first sequence, where the index and the first sequence have a one-to-one correspondence relationship; and the index cache module 8 is configured to put the index into a cache, so that the data processor determines the corresponding first sequence according to the index.
Fig. 13 is a schematic structural diagram of a seventh embodiment of a feedback apparatus for cache data according to an embodiment of the present invention, where the apparatus further includes: a second sequence creating module 9, configured to automatically create a second sequence for an active client to use, with the expiration period of the cached data as the cycle, where the active client is a client whose frequency of sending the data acquisition request is greater than or equal to a preset frequency.
Fig. 14 is a schematic structural diagram of an eighth embodiment of a feedback apparatus for cached data according to an embodiment of the present invention, where the apparatus further includes: a request quantity obtaining module 10, configured to obtain a total quantity of to-be-processed requests corresponding to the request time, where the to-be-processed requests are data obtaining requests sent by each client; and a preset feedback data sending module 11, configured to send preset feedback data to each client if the total number is greater than or equal to a preset number threshold, and process each to-be-processed request one by one.
Fig. 15 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present invention. The electronic device includes: a memory 101 and a processor 102;
a memory 101 for storing a computer program;
the processor 102 is configured to execute the computer program stored in the memory to implement the feedback method of the cached data in the above embodiments. Reference may be made in particular to the description relating to the method embodiments described above.
Alternatively, the memory 101 may be separate or integrated with the processor 102.
When the memory 101 is a device independent of the processor 102, the electronic apparatus may further include:
a bus 103 for connecting the memory 101 and the processor 102.
The electronic device provided in the embodiment of the present invention may be configured to execute any one of the cache data feedback methods shown in the above embodiments; its implementation and technical effects are similar to those of the method embodiments and are not repeated here.
An embodiment of the present invention further provides a readable storage medium, where a computer program is stored in the readable storage medium, and when at least one processor of a message sending apparatus executes the computer program, the message sending apparatus executes the feedback method for cached data according to any of the foregoing embodiments.
Those of ordinary skill in the art will understand that: all or a portion of the steps of implementing the above-described method embodiments may be performed by hardware associated with program instructions. The program described above may be stored in a computer-readable storage medium. When executed, the program performs steps comprising the method embodiments described above; and the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention.

Claims (18)

1. A feedback method for buffering data, the method comprising:
receiving a data acquisition request sent by a client, wherein the data acquisition request comprises request time;
calculating the difference value between the request time and the time when the client sends the data acquisition request for the first time to obtain a request time interval;
if the time interval is greater than the validity period of the cache data and less than the expiration period of the cache data, sending first feedback data and second feedback data to the client, wherein the first feedback data is the cache data, the second feedback data is updated cache data, the second feedback data is sent after the first feedback data, the sending time interval between the second feedback data and the first feedback data is less than or equal to the display time of the first feedback data at the client, the validity period of the cache data is the time limit after which the cache data becomes invalid, and the expiration period of the cache data is the time limit after which the cache data is cleared.
2. The method of claim 1, wherein sending the first feedback data and the second feedback data to the client comprises:
acquiring target data from a database according to the data acquisition request, wherein the target data is data matched with the data acquisition request in the database;
replacing the cache data with the target data to obtain updated cache data;
and sending the updated cache data as second feedback data to the client.
3. The method of claim 1, further comprising:
and if the time interval is less than or equal to the validity period of the cache data, sending the first feedback data to the client.
4. The method of claim 3, further comprising:
if the time interval is greater than or equal to the expiration period of the cache data, sorting preselected data in a database to obtain a first sequence, wherein the preselected data are data in the database which meet preset preliminary screening conditions;
and sequentially traversing each piece of preselected data in the first sequence in combination with the data acquisition request to determine third feedback data, wherein the third feedback data is the preselected data corresponding to the data acquisition request.
5. The method of claim 4, wherein sorting the preselected data in the database into a first sequence comprises:
calculating the weight corresponding to each preset sequencing influence dimension by utilizing a sequencing algorithm in combination with the preset sequencing influence dimension;
calculating the average of the products of each piece of sub-data in each piece of preselected data and the corresponding weight to obtain the score of each piece of preselected data;
by ranking the scores, a first sequence consisting of the preselected data is obtained.
6. The method of claim 5, after obtaining the first sequence, comprising:
correspondingly generating an index from the first sequence, wherein the index and the first sequence have a one-to-one correspondence relationship;
and putting the index into a cache for a data processor to determine the corresponding first sequence according to the index.
7. The method of claim 4, further comprising:
and automatically creating a second sequence for an active client to use, with the expiration period of the cache data as the cycle, wherein the active client is a client whose frequency of sending the data acquisition request is greater than or equal to a preset frequency.
8. The method according to any one of claims 1-7, wherein before the calculating of the difference between the request time and the time at which the client first sends the data acquisition request, the method further comprises:
acquiring the total number of to-be-processed requests corresponding to the request moment, wherein the to-be-processed requests are data acquisition requests sent by each client;
and if the total number is greater than or equal to a preset number threshold, sending preset feedback data to each client, and processing each request to be processed one by one.
9. A feedback apparatus for buffering data, the apparatus comprising:
the data acquisition request receiving module is used for receiving a data acquisition request sent by a client, wherein the data acquisition request comprises request time;
a difference value calculating module, configured to calculate a difference value between the request time and a time at which the client first sends the data acquisition request, so as to obtain a request time interval;
a feedback data sending module, configured to send first feedback data and second feedback data to the client if the time interval is greater than the validity period of the cache data and less than the expiration period of the cache data, where the first feedback data is the cache data, the second feedback data is updated cache data, the second feedback data is sent after the first feedback data, the sending time interval between the second feedback data and the first feedback data is less than or equal to the display time of the first feedback data at the client, the validity period of the cache data is the time limit after which the cache data becomes invalid, and the expiration period of the cache data is the time limit after which the cache data is cleared.
10. The apparatus of claim 9, wherein the feedback data sending module comprises:
the target data acquisition module is used for acquiring target data from a database according to the data acquisition request, wherein the target data is data matched with the data acquisition request in the database;
the replacing module is used for replacing the cache data by using the target data to obtain updated cache data;
and the second feedback data sending module is used for sending the updated cache data serving as second feedback data to the client.
11. The apparatus of claim 9, further comprising:
and the first feedback data sending module is used for sending the first feedback data to the client if the time interval is less than or equal to the validity period of the cache data.
12. The apparatus of claim 11, further comprising:
the first sorting module is used for sorting preselected data in a database to obtain a first sequence if the time interval is greater than or equal to the expiration period of the cache data, wherein the preselected data are data in the database which meet preset preliminary screening conditions;
and the third feedback data determining module is used for sequentially traversing each piece of preselected data in the first sequence in combination with the data acquisition request to determine third feedback data, wherein the third feedback data is the preselected data corresponding to the data acquisition request.
13. The apparatus of claim 12, wherein the first sorting module comprises:
the weight calculation module is used for calculating the weight corresponding to each preset sequencing influence dimension by utilizing a sequencing algorithm in combination with the preset sequencing influence dimension;
the summing-and-averaging calculation module is used for calculating the average of the products of each piece of sub-data in each piece of preselected data and the corresponding weight to obtain the score of each piece of preselected data;
and the first sequence acquisition module is used for obtaining a first sequence consisting of the preselected data by arranging the scores.
14. The apparatus of claim 13, further comprising:
the index generating module is used for correspondingly generating an index from the first sequence, and the index and the first sequence have a one-to-one correspondence relationship;
and the index cache module is used for putting the index into a cache so that the data processor determines the corresponding first sequence according to the index.
15. The apparatus of claim 12, further comprising:
and the second sequence creating module is used for automatically creating a second sequence for an active client to use, with the expiration period of the cache data as the cycle, wherein the active client is a client whose frequency of sending the data acquisition request is greater than or equal to a preset frequency.
16. The apparatus of any of claims 9-15, further comprising:
a request quantity obtaining module, configured to obtain a total quantity of to-be-processed requests corresponding to the request time, where the to-be-processed requests are data obtaining requests sent by each client;
and the preset feedback data sending module is used for sending preset feedback data to each client and processing each request to be processed one by one if the total number is greater than or equal to a preset number threshold.
17. An electronic device, characterized in that the electronic device comprises:
a processor, and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the feedback method for caching data according to any one of claims 1 to 8 via execution of the executable instructions.
18. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out a feedback method of caching data according to one of claims 1 to 8.
CN201911366361.7A 2019-12-26 2019-12-26 Feedback method and device of cache data, electronic equipment and storage medium Pending CN111143414A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911366361.7A CN111143414A (en) 2019-12-26 2019-12-26 Feedback method and device of cache data, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911366361.7A CN111143414A (en) 2019-12-26 2019-12-26 Feedback method and device of cache data, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN111143414A true CN111143414A (en) 2020-05-12

Family

ID=70520654

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911366361.7A Pending CN111143414A (en) 2019-12-26 2019-12-26 Feedback method and device of cache data, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111143414A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1940922A (en) * 2005-09-30 2007-04-04 腾讯科技(深圳)有限公司 Method and system for improving information search speed
US20080147974A1 (en) * 2006-12-18 2008-06-19 Yahoo! Inc. Multi-level caching system
CN104794249A (en) * 2015-05-15 2015-07-22 乐得科技有限公司 Realization method and realization device of database
CN106484633A (en) * 2016-10-08 2017-03-08 广州华多网络科技有限公司 A kind of data cached method and device
CN106708568A (en) * 2016-12-07 2017-05-24 微梦创科网络科技(中国)有限公司 Method and apparatus for paged loading of client contents
CN108228657A (en) * 2016-12-22 2018-06-29 沈阳美行科技有限公司 The implementation method and device of a kind of key search
CN108268476A (en) * 2016-12-30 2018-07-10 北京国双科技有限公司 Data query method and device

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112199898A (en) * 2020-11-11 2021-01-08 北京三维天地科技股份有限公司 Instrument and equipment fault prediction and health management algorithm based on big data
CN112199898B (en) * 2020-11-11 2021-06-15 北京三维天地科技股份有限公司 Instrument and equipment fault prediction and health management method based on big data

Similar Documents

Publication Publication Date Title
US11481657B2 (en) Content recommendation method, apparatus and system
KR102358604B1 (en) Convergence data processing method and information recommendation system
US11526570B2 (en) Page-based prediction of user intent
US10528559B2 (en) Information processing system, terminal, server, information processing method, recording medium, and program
US20200293589A1 (en) Optimizing listing efficiency and efficacy for a delivery coordination system
CN112000871A (en) Method, device and equipment for determining search result list and storage medium
CN106254417A (en) Data cache method, Apparatus and system
CN107391764B (en) Service data query method
CN108200127A (en) Data transmission method for uplink, device, server, terminal and storage medium
CN111143414A (en) Feedback method and device of cache data, electronic equipment and storage medium
CN113923529A (en) Live broadcast wheat connecting method, device, equipment and storage medium
US20120030021A1 (en) Selecting advertisements using same session queries
JP6111557B2 (en) Information provision system
CN110570271A (en) information recommendation method and device, electronic equipment and readable storage medium
US20150371236A1 (en) Issue response and prediction
CN107910044A (en) object push method and system
CN109597941B (en) Sorting method and device, electronic equipment and storage medium
CN107580038B (en) Expert recommendation method and system
CN115985448A (en) Method, device and equipment for determining medication data and distributing
CN110990701A (en) Book searching method, computing device and computer storage medium
CN111026956B (en) Data list processing method and device, electronic equipment and computer storage medium
CN114417083A (en) Material processing method, computing device and storage medium
CN115203591A (en) Point of interest data display method and device, storage medium and computer equipment
CN108182237B (en) Big data display method, system and related device
CN106910126B (en) Method and device for generating trend chart

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination