CN117009693A - http request front-end caching method and device, electronic equipment and readable medium - Google Patents

http request front-end caching method and device, electronic equipment and readable medium

Info

Publication number
CN117009693A
CN117009693A (application CN202310935172.7A)
Authority
CN
China
Prior art keywords
cache
http request
request
http
determining whether
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310935172.7A
Other languages
Chinese (zh)
Inventor
吴瑞
唐勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Inspur Communication Information System Co Ltd
Original Assignee
Inspur Communication Information System Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Inspur Communication Information System Co Ltd filed Critical Inspur Communication Information System Co Ltd
Priority to CN202310935172.7A priority Critical patent/CN117009693A/en
Publication of CN117009693A publication Critical patent/CN117009693A/en
Pending legal-status Critical Current

Links

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 - Details of database functions independent of the retrieved data types
    • G06F16/95 - Retrieval from the web
    • G06F16/957 - Browsing optimisation, e.g. caching or content distillation
    • G06F16/9574 - Browsing optimisation of access to content, e.g. by caching
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00 - Reducing energy consumption in communication networks
    • Y02D30/50 - Reducing energy consumption in communication networks in wire-line communication networks, e.g. low power modes or reduced link rate

Landscapes

  • Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The invention provides an http request front-end caching method, an http request front-end caching device, electronic equipment and a readable medium. The method comprises the following steps: intercepting an http request; determining whether the http request needs to be cached, and if so, generating a unique request feature code key from the url address and request parameters of the http request, searching the cache according to the key, and determining whether the http request hits the cache; when the cache is hit, reading the cache data and returning it as the result of the http request; and silently sending the http request, and silently updating the result of the http request into the cache when the result is normal. The scheme of the invention solves the problems of traditional browser http request caching, in which an interface with a long response time is either not cached (so the user waits) or, once cached, its page data cannot be updated in time, and improves the user's experience of the system.

Description

http request front-end caching method and device, electronic equipment and readable medium
Technical Field
The present invention relates to the field of cache processing, and in particular, to a method and apparatus for front-end caching of an http request, an electronic device, and a readable medium.
Background
In conventional front-end development, the cache types of http requests are divided into: forced caching, comparison caching and offline caching.
Because each of the three cache types has its own defects, traditional browser http request caching either leaves a long-latency interface uncached or, once an interface is cached, prevents page data from being updated in time, which degrades the user's experience of the system.
Therefore, a method is needed to solve the problems of page stalling and untimely data updates that the browser caching scheme may cause when an interface takes a long time to respond.
Disclosure of Invention
The embodiment of the invention provides a method, an apparatus, an electronic device and a readable medium for front-end caching of http requests, which are used to solve the problems of page stalling and untimely data updates that may arise under a browser caching scheme when an interface takes a long time to respond.
According to an aspect of the present invention, there is provided an http request front-end caching method, including:
intercepting an http request;
determining whether the http request needs to be cached, if so, generating a request unique feature code key according to the url address of the http request and the request parameter, searching in the cache according to the key, and determining whether the http request hits the cache;
when the cache is hit, reading cache data, and returning the cache data as a result of the http request;
and sending the http request in a silent mode, and updating the result of the http request in a silent mode into a cache when the result is normal.
Optionally, intercepting the http request includes:
an Axios instance is created by axios.create(), and a request interceptor is added using the interceptors.request.use() method.
Optionally, the determining whether the http request needs to be cached includes: determining whether the request parameters of the http request contain cache=true, or whether the url address of the http request is in a list of requests that need to be cached.
Optionally, the determining whether to hit in the cache includes:
generating a request unique feature code according to the url address and the request parameter of the http request;
searching the cache according to the unique feature code; when no cache entry is found, or the entry is not within the validity period, determining that the cache is missed; otherwise, determining that the cache is hit.
Optionally, the generating a request unique feature code according to the url address and the request parameter of the http request includes:
acquiring the url address and request parameters of the http request;
converting the request parameters into a string;
and calculating hash values of the url address and the request parameters by using a hash algorithm MD5 to obtain the unique feature code.
Optionally, whether the cache is within the validity period is determined by the following method:
determining whether the difference between the storage time of the cache entry and the current time exceeds a preset threshold; if so, the cache entry is not within the validity period.
Optionally, the silently sending the http request includes:
the http request is made using window.requestIdleCallback().
According to another aspect of the present invention, there is provided an http request front-end caching apparatus, including:
the intercepting unit is used for intercepting the http request;
the determining unit is used for determining whether the http request needs to be cached, if so, generating a request unique feature code key according to the url address of the http request and the request parameter, searching in the cache according to the key, and determining whether the http request hits the cache;
the reading unit is used for reading the cache data when the cache is hit, and returning the cache data as the result of the http request;
and the sending unit is used for silently sending the http request, and silently updating the result of the http request into the cache when the result is normal.
According to another aspect of the present invention, there is provided an electronic apparatus including:
at least one processor; and a memory communicatively coupled to the at least one processor; the memory stores a computer program executable by the at least one processor, so that the at least one processor can execute the http request front-end caching method according to any embodiment of the present invention.
According to another aspect of the present invention, there is provided a computer readable storage medium storing computer instructions for causing a processor to implement the http request front-end caching method according to any one of the embodiments of the present invention when executed.
The embodiment of the invention provides a method, an apparatus, an electronic device and a readable medium for front-end caching of http requests. The method intercepts an http request; determines whether the http request needs to be cached, and if so, generates a unique request feature code key from the url address and request parameters of the http request, searches the cache according to the key, and determines whether the http request hits the cache; when the cache is hit, reads the cache data and returns it as the result of the http request; and silently sends the http request, silently updating its result into the cache when the result is normal. The scheme of the invention solves the problems of traditional browser http request caching, in which an interface with a long response time is either not cached or, once cached, page data cannot be updated in time, and improves the user's experience of the system.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flowchart of an http request front-end caching method according to an embodiment of the present invention;
Fig. 2 is a schematic structural diagram of an http request front-end caching device according to an embodiment of the present invention;
Fig. 3 is a schematic structural diagram of an electronic device implementing an http request front-end caching method according to an embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments, and all other embodiments obtained by those skilled in the art without making any inventive effort based on the embodiments of the present invention are within the scope of protection of the present invention.
In conventional front-end development, the cache types of http requests are divided into: forced caching, comparison caching and offline caching.
Forced caching: the server controls the caching behavior of the resource by responding to the Cache-Control and expire fields of the header. If the resource is within the client's cache validity period, the client may directly obtain the resource from the cache without sending a request to the server.
Comparison caching: when the client sends a request, the server returns the Last-Modified or ETag field in the response header to the client as the identification of the resource. In subsequent requests the client checks whether the resource has changed by sending the If-Modified-Since (using the Last-Modified value) or If-None-Match (using the ETag value) header. If the resource is unchanged, the server returns a 304 Not Modified response and the client obtains the resource from the cache.
Offline caching: the offline caching is realized through a Service workbench technology, and can enable the webpage to continue to access and provide cached resources in an offline state. The Service Worker may intercept the network request and obtain a response from the cache, or send the request to the server when there are no resources in the cache.
Forced caching has the following disadvantages. The client cannot acquire updated resources in real time: because of the cache validity period, the client may not obtain updated resources on the server in time; even if the resources on the server have been updated, the client still uses the old version in the cache.
Cache invalidation problem: in some cases the server may require the client to reacquire a resource even though it is still within the cache validity period, which can lead to unnecessary network requests and delays.
Disadvantages of comparison caching. Additional network requests: comparison caching requires the client to send extra requests to the server to check whether the resource has been modified, which increases network traffic and latency and places a certain load on the server. Server resource consumption: the server needs to generate and compare resource identifiers (e.g. ETags), which may occupy the server's computing resources.
Offline caching has the following disadvantages. Cache consistency problem: the offline cache intercepts and handles network requests with a Service Worker, but if resources on the server change, the offline cache may not be updated in time, so the offline cached resources may become inconsistent with the resources actually on the server. Usage restrictions: the https protocol must be used for security, and older browsers such as IE are not supported.
Fig. 1 is a flowchart of an http request front-end caching method according to an embodiment of the present invention. As shown in fig. 1, the method includes:
s110, intercepting the http request.
In an embodiment of the present invention, intercepting an http request includes:
an Axios instance is created by axios.create(), and a request interceptor is added using the interceptors.request.use() method.
In the request interceptor, custom logic processing may be performed.
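A minimal sketch of this interception step, assuming an Axios-based front end; the instance name http, the base URL and the timeout are illustrative values, not part of the patent:

```typescript
import axios from "axios";

// Create a dedicated Axios instance for the application (base URL and timeout are assumed values).
const http = axios.create({ baseURL: "/api", timeout: 10000 });

// Register a request interceptor: every outgoing request passes through here,
// which is where the cache handling of steps S120-S140 can be attached.
http.interceptors.request.use(
  (config) => {
    // Custom logic (e.g. deciding whether this request should be cached) goes here.
    return config;
  },
  (error) => Promise.reject(error)
);
```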
S120, determining whether the http request needs to be cached, if so, generating a unique request feature code key according to the url address of the http request and the request parameter, searching in the cache according to the key, and determining whether the http request hits the cache.
In an embodiment of the present invention, the determining whether the http request needs to be cached includes: determining whether the request parameters of the http request contain cache=true, or whether the url address of the http request is in a list of requests that need to be cached.
In addition, if caching is not required, the http request is sent normally, the request result data is returned, and the flow ends. The check is sketched below.
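A sketch of this check; the cacheableUrls list and its entries are hypothetical stand-ins for the "request list needing to be cached":

```typescript
// Hypothetical list of interface urls that should always be cached.
const cacheableUrls: string[] = ["/api/model/detail", "/api/form/schema"];

// A request is treated as cacheable when its parameters carry cache=true
// or its url appears in the configured list.
function needsCache(url: string, params?: Record<string, unknown>): boolean {
  if (params && params.cache === true) return true;
  return cacheableUrls.includes(url);
}
```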
In an embodiment of the present invention, the determining whether to hit in the cache includes:
generating a request unique feature code according to the url address and the request parameter of the http request;
searching the cache according to the unique feature code; when no cache entry is found, or the entry is not within the validity period, determining that the cache is missed; otherwise, determining that the cache is hit.
A unique request feature code key is generated from the url address and request parameters of the http request, and the cache is searched according to the key. If no cache entry is found, or the entry is not within the validity period, the cache is missed: the http request is sent normally, the request result data is returned, and the flow ends.
Specifically, the request address url and the request parameters params of the http request are obtained, params is converted into a string using JSON.stringify(params), and the hash value of url+params is calculated using the MD5 hash algorithm; this hash value is the unique request feature code, as sketched below.
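A sketch of the key generation. The patent names the MD5 algorithm but not a specific library; the md5 npm package used here is an assumption:

```typescript
import md5 from "md5"; // assumed MD5 helper; any MD5 implementation would do

// Build the unique feature code (cache key) from the request url and its parameters.
function buildCacheKey(url: string, params?: Record<string, unknown>): string {
  const paramString = JSON.stringify(params ?? {});
  return md5(url + paramString);
}
```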
The cached data is read as follows: the data in the cache is read using the getItem(key) method of localForage.
In the embodiment of the invention, whether the cache entry is within the validity period is determined as follows:
it is determined whether the difference between the storage time of the cache entry and the current time exceeds a preset threshold; if so, the cache entry is not within the validity period. The lookup and validity check are sketched below.
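A sketch of the lookup and validity check using localForage. The CACHE_TTL_MS threshold and the shape of the stored record (data plus a storedAt timestamp) are assumptions; the patent only requires comparing the storage time with the current time:

```typescript
import localforage from "localforage";

const CACHE_TTL_MS = 5 * 60 * 1000; // assumed preset threshold (5 minutes)

interface CacheRecord<T = unknown> {
  data: T;          // the cached http result
  storedAt: number; // timestamp recorded when the entry was written
}

// Returns the cached data on a hit, or null when the entry is missing or expired.
async function readCache<T>(key: string): Promise<T | null> {
  const record = await localforage.getItem<CacheRecord<T>>(key);
  if (!record) return null;
  const expired = Date.now() - record.storedAt > CACHE_TTL_MS;
  return expired ? null : record.data;
}
```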
And S130, when the cache is hit, reading the cache data, and returning the cache data as the result of the http request.
S140, the http request is sent in a silent mode, and when the result of the http request is normal, the result is silently updated into the cache.
In the embodiment of the present invention, silently sending the http request includes: the http request is made using window.requestIdleCallback().
Sending the http request in a silent mode means performing the http request in the background during the program's idle time. The specific implementation is as follows:
the http request is made using window.requestIdleCallback(). requestIdleCallback() is a browser-provided method for executing callback functions when the browser is idle; it can be used to run non-time-critical tasks and make full use of the browser's idle time.
In addition, if the request result is normal, the result is silently updated into the cache; an abnormal request result is not specially processed. The result is updated into the cache using the setItem(key, value) method of localForage, as sketched below.
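A sketch of the silent refresh, combining window.requestIdleCallback() with a localForage write. The use of axios.get, the record shape and the decision to treat any resolved response as a "normal" result are assumptions; a real system might also check a business status code before writing back:

```typescript
import axios from "axios";
import localforage from "localforage";

// Re-issue the request in browser idle time and silently refresh the cache entry.
function silentRefresh(key: string, url: string, params?: Record<string, unknown>): void {
  window.requestIdleCallback(async () => {
    try {
      const response = await axios.get(url, { params });
      // Only a normal result is written back, together with its storage time.
      await localforage.setItem(key, { data: response.data, storedAt: Date.now() });
    } catch {
      // An abnormal result is deliberately ignored, as described above.
    }
  });
}
```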
According to the scheme, through lazy loading, only the resources required when the user opens the page are loaded, which reduces unnecessary network requests and bandwidth consumption, lowers server load and user traffic consumption, and improves the overall performance of the website. Meanwhile, through silent updating, the user obtains the latest version of the application content without any manual operation and without perceiving the update process, which improves the availability and timeliness of the application.
The invention mainly aims to solve the problem of page stalling caused by time-consuming interfaces. For example, when a user opens a form page, the page can only be rendered after the result of a model interface is returned; if the wait for the interface is slightly long, the user feels that the page is stuck. The invention solves the problems of traditional browser http request caching, in which a long-latency interface is either not cached (leading to long waits) or, once cached, page data cannot be updated in time, and improves the user's experience of the system.
Fig. 2 is a schematic structural diagram of an http request front-end caching device according to an embodiment of the present invention. As shown in fig. 2, the device includes:
an interception unit 210, configured to intercept the http request;
a determining unit 220, configured to determine whether the http request needs to be cached, and if so, generate a unique request feature code key according to a url address of the http request and a request parameter, and find the unique request feature code key in the cache according to the key, to determine whether the http request hits the cache;
a reading unit 230, configured to read the cache data when the cache is hit, and return the cache data as a result of the http request;
and the sending unit 240 is configured to silently send the http request, and to silently update the result of the http request into the cache when the result is normal.
It will be appreciated that the structure illustrated in the embodiments of the present invention does not constitute a specific limitation on the http request front-end caching device. In other embodiments of the invention, the http request front-end caching device may comprise more or fewer components than shown, some components may be combined or split, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The content of information interaction and execution process between the units in the device is based on the same conception as the embodiment of the method of the present invention, and specific content can be referred to the description in the embodiment of the method of the present invention, which is not repeated here.
Fig. 3 shows a schematic diagram of the structure of an electronic device 10 that may be used to implement an embodiment of the invention. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic equipment may also represent various forms of mobile devices, such as personal digital processing, cellular telephones, smartphones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed herein.
As shown in fig. 3, the electronic device 10 includes at least one processor 11, and a memory, such as a Read Only Memory (ROM) 12, a Random Access Memory (RAM) 13, etc., communicatively connected to the at least one processor 11, in which the memory stores a computer program executable by the at least one processor, and the processor 11 may perform various appropriate actions and processes according to the computer program stored in the Read Only Memory (ROM) 12 or the computer program loaded from the storage unit 18 into the Random Access Memory (RAM) 13. In the RAM 13, various programs and data required for the operation of the electronic device 10 may also be stored. The processor 11, the ROM 12 and the RAM 13 are connected to each other via a bus 14. An input/output (I/O) interface 15 is also connected to bus 14.
Various components in the electronic device 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, etc.; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, an optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the electronic device 10 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The processor 11 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of processor 11 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, digital Signal Processors (DSPs), and any suitable processor, controller, microcontroller, etc. The processor 11 performs the various methods and processes described above, such as the http request front-end caching method.
In some embodiments, the http request front-end caching method may be implemented as a computer program tangibly embodied on a computer-readable storage medium, such as the storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 10 via the ROM 12 and/or the communication unit 19. When the computer program is loaded into the RAM 13 and executed by the processor 11, one or more steps of the http request front-end caching method described above may be performed. Alternatively, in other embodiments, the processor 11 may be configured to perform the http request front-end caching method in any other suitable way (e.g. by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuit systems, field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), application specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs, which may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special purpose or general-purpose programmable processor that can receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for carrying out methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be implemented. The computer program may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) through which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a background component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such background, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), blockchain networks, and the internet.
The computing system may include clients and servers. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or a cloud host, and is a host product in a cloud computing service system, so that the defects of high management difficulty and weak service expansibility in the traditional physical hosts and VPS service are overcome.
It should be noted that not all the steps and modules in the above flowcharts and the system configuration diagrams are necessary, and some steps or modules may be omitted according to actual needs. The execution sequence of the steps is not fixed and can be adjusted as required. The system structure described in the above embodiments may be a physical structure or a logical structure, that is, some modules may be implemented by the same physical entity, or some modules may be implemented by multiple physical entities, or may be implemented jointly by some components in multiple independent devices.
In the above embodiments, the hardware unit may be mechanically or electrically implemented. For example, a hardware unit may include permanently dedicated circuitry or logic (e.g., a dedicated processor, FPGA, or ASIC) to perform the corresponding operations. The hardware unit may also include programmable logic or circuitry (e.g., a general-purpose processor or other programmable processor) that may be temporarily configured by software to perform the corresponding operations. The particular implementation (mechanical, or dedicated permanent, or temporarily set) may be determined based on cost and time considerations.
While the invention has been illustrated and described in detail in the drawings and in the preferred embodiments, the invention is not limited to the disclosed embodiments, and those skilled in the art will appreciate that many further embodiments of the invention can be obtained by combining the technical features of the different embodiments, and these still fall within the scope of the invention.

Claims (10)

  1. The http request front-end caching method is characterized by comprising the following steps:
    intercepting an http request;
    determining whether the http request needs to be cached, if so, generating a request unique feature code key according to the url address of the http request and the request parameter, searching in the cache according to the key, and determining whether the http request hits the cache;
    when the cache is hit, reading cache data, and returning the cache data as a result of the http request;
    and sending the http request in a silent mode, and updating the result of the http request in a silent mode into a cache when the result is normal.
  2. The method according to claim 1, wherein intercepting the http request comprises:
    an Axios instance is created by axios.create(), and a request interceptor is added using the interceptors.request.use() method.
  3. The method of claim 1, wherein the determining whether the http request needs to be cached comprises: determining whether the request parameters of the http request contain cache=true, or whether the url address of the http request is in a list of requests that need to be cached.
  4. The method of claim 1, wherein the determining whether to hit the cache comprises:
    generating a request unique feature code according to the url address and the request parameter of the http request;
    searching the cache according to the unique feature code; when no cache entry is found, or the entry is not within the validity period, determining that the cache is missed; otherwise, determining that the cache is hit.
  5. The method of claim 4, wherein the generating a request unique feature code from the url address and request parameters of the http request comprises:
    acquiring the url address and request parameters of the http request;
    converting the request parameters into a string;
    and calculating hash values of the url address and the request parameters by using a hash algorithm MD5 to obtain the unique feature code.
  6. The method of claim 4, wherein whether the cache is within the validity period is determined by:
    determining whether the difference between the storage time of the cache entry and the current time exceeds a preset threshold; if so, the cache entry is not within the validity period.
  7. The method of claim 1, wherein the silently sending the http request comprises: the http request is made using window.requestIdleCallback().
  8. An http request front-end caching apparatus, comprising:
    the intercepting unit is used for intercepting the http request;
    the determining unit is used for determining whether the http request needs to be cached, if so, generating a request unique feature code key according to the url address of the http request and the request parameter, searching in the cache according to the key, and determining whether the http request hits the cache;
    the reading unit is used for reading the cache data when the cache is hit, and returning the cache data as the result of the http request;
    and the sending unit is used for silently sending the http request, and silently updating the result of the http request into the cache when the result is normal.
  9. An electronic device, comprising: at least one processor; and a memory communicatively coupled to the at least one processor; the memory stores a computer program executable by the at least one processor, so that the at least one processor can execute the http request front-end caching method according to any embodiment of the present invention.
  10. A computer readable storage medium, where computer instructions are stored, the computer instructions being configured to cause a processor to implement the http request front-end caching method according to any one of the embodiments of the present invention when executed.
CN202310935172.7A 2023-07-28 2023-07-28 http request front-end caching method and device, electronic equipment and readable medium Pending CN117009693A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310935172.7A CN117009693A (en) 2023-07-28 2023-07-28 http request front-end caching method and device, electronic equipment and readable medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310935172.7A CN117009693A (en) 2023-07-28 2023-07-28 http request front-end caching method and device, electronic equipment and readable medium

Publications (1)

Publication Number Publication Date
CN117009693A true CN117009693A (en) 2023-11-07

Family

ID=88568429

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310935172.7A Pending CN117009693A (en) 2023-07-28 2023-07-28 http request front-end caching method and device, electronic equipment and readable medium

Country Status (1)

Country Link
CN (1) CN117009693A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117914942A (en) * 2024-03-20 2024-04-19 广东银基信息安全技术有限公司 Data request caching method and device, intelligent terminal and storage medium


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination