CN112468583A - Information processing method and terminal of API gateway - Google Patents

Information processing method and terminal of API gateway

Info

Publication number
CN112468583A
Authority
CN
China
Prior art keywords
information
cache
external
data
request
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011348685.0A
Other languages
Chinese (zh)
Other versions
CN112468583B (en)
Inventor
刘德建
黄正墙
郭玉湖
陈宏�
罗陈珑
吴仁海
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujian Tianquan Educational Technology Ltd
Original Assignee
Fujian Tianquan Educational Technology Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujian Tianquan Educational Technology Ltd
Priority to CN202011348685.0A
Publication of CN112468583A
Application granted
Publication of CN112468583B
Active legal status (current)
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/50 Network services
    • H04L 67/56 Provisioning of proxy services
    • H04L 67/568 Storing data temporarily at an intermediate stage, e.g. caching
    • H04L 67/5682 Policies or rules for updating, deleting or replacing the stored data
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 12/00 Data switching networks
    • H04L 12/66 Arrangements for connecting between networks having differing types of switching systems, e.g. gateways

Abstract

The invention discloses an information processing method and terminal for an API gateway. The method comprises: receiving external information, and obtaining, from a plurality of preset cache storage policies, a first cache storage policy corresponding to the interface of the external information, wherein the external information comprises external request information and external response information; obtaining first characteristic data from the external information according to the first cache storage policy, and matching the first characteristic data against the cache medium matching rule in the first cache storage policy to obtain a corresponding first cache medium; and performing the corresponding read or write of response information on the first cache medium according to the external information. The invention improves the performance of the API gateway reverse proxy in forwarding routed requests, and avoids the problems caused by using only a single cache medium, such as excessive memory usage with a memory cache or frequent disk reads and writes with a disk cache.

Description

Information processing method and terminal of API gateway
Technical Field
The invention relates to the technical field of computers, in particular to an information processing method and a terminal of an API gateway.
Background
The API gateway plays an important role in forwarding service traffic. To improve forwarding performance, the gateway can cache response entity content on the disk of the server where the gateway runs, using relevant request information as the cache key, or it can store the returned entity content in memory through a cache plug-in. Once caching is enabled, requests are served from the cache first rather than being forwarded to downstream services, which improves response speed and reduces the load on both the gateway and the downstream services.
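As a rough illustration of keying a cache on request information (the field choices and the use of a hash below are assumptions for the sketch, not details taken from this disclosure), a gateway might derive a cache key as follows:

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
)

// cacheKey builds a deterministic key from the request's domain, URL path and
// raw query string, so that identical requests map to the same cache entry.
func cacheKey(domain, urlPath, rawQuery string) string {
	sum := sha256.Sum256([]byte(domain + "|" + urlPath + "|" + rawQuery))
	return hex.EncodeToString(sum[:])
}

func main() {
	fmt.Println(cacheKey("api.example.com", "/v1/users", "page=1"))
}
```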
Current solutions support only a single strategy: when caching is allowed, the returned message is stored directly in the cache, and there is no way to evaluate and choose among different cache storage modes according to other policies.
The drawback of having only one strategy for selecting a cache storage medium is obvious. The mainstream approach stores cached files on disk; if the disk cache accumulates too many small files, disk read and write performance suffers, and for frequent requests whose results are small files, enabling the disk cache can perform even worse than leaving it disabled.
Another solution is to store the data directly in a memory cache, but memory is limited and far smaller than disk. As the cached content grows, memory usage rises and eventually interferes with route forwarding, the core capability of the API gateway.
Disclosure of Invention
The technical problem to be solved by the invention is to provide an information processing method and terminal for an API gateway that can select different cache storage policies according to data characteristics, thereby improving gateway performance.
In order to solve the above technical problem, the invention adopts the following technical scheme:
an information processing method of an API gateway comprises the following steps:
s1, receiving external information, and acquiring a first cache storage strategy corresponding to an interface of the external information from a plurality of preset cache storage strategies, wherein the external information comprises external request information and external response information;
s2, acquiring first characteristic data in external information according to a first cache storage strategy, and matching the first characteristic data with a cache medium matching rule in the first cache storage strategy to obtain a corresponding first cache medium;
and S3, performing corresponding reading or writing response information operation on the first cache medium according to the external information.
An information processing terminal of an API gateway, comprising a processor, a memory, and a computer program stored on the memory and operable on the processor, the processor implementing the following steps when executing the computer program:
s1, receiving external information, and acquiring a first cache storage strategy corresponding to an interface of the external information from a plurality of preset cache storage strategies, wherein the external information comprises external request information and external response information;
s2, acquiring first characteristic data in external information according to a first cache storage strategy, and matching the first characteristic data with a cache medium matching rule in the first cache storage strategy to obtain a corresponding first cache medium;
and S3, performing corresponding reading or writing response information operation on the first cache medium according to the external information.
The invention has the beneficial effects that: different cache storage policies are selected for different interfaces; the returned response data is stored in a suitable API gateway cache medium according to the configuration of the cache storage policy and the characteristic data of the response data; and for requests, the gateway proxy obtains the response entity data from the corresponding cache medium according to the cache storage policy and the characteristic data in the request, thereby improving the performance of the API gateway reverse proxy in forwarding routed requests. At the same time, the problems caused by using only a single cache medium, such as excessive memory usage with a memory cache or frequent disk reads and writes with a disk cache, are avoided.
Drawings
Fig. 1 is a flowchart of an information processing method of an API gateway according to an embodiment of the present invention;
fig. 2 is a structural diagram of an information processing terminal of an API gateway according to an embodiment of the present invention;
fig. 3 is a detailed flowchart of an information processing method of an API gateway according to an embodiment of the present invention;
description of reference numerals:
1. an information processing terminal of an API gateway; 2. a processor; 3. a memory.
Detailed Description
In order to explain technical contents, achieved objects, and effects of the present invention in detail, the following description is made with reference to the accompanying drawings in combination with the embodiments.
Referring to fig. 1 and fig. 3, an information processing method of an API gateway includes:
s1, receiving external information, and acquiring a first cache storage strategy corresponding to an interface of the external information from a plurality of preset cache storage strategies, wherein the external information comprises external request information and external response information;
s2, acquiring first characteristic data in external information according to a first cache storage strategy, and matching the first characteristic data with a cache medium matching rule in the first cache storage strategy to obtain a corresponding first cache medium;
and S3, performing corresponding reading or writing response information operation on the first cache medium according to the external information.
From the above description, the beneficial effects of the present invention are as follows: different cache storage policies are selected for different interfaces; the returned response data is stored in a suitable API gateway cache medium according to the configuration of the cache storage policy and the characteristic data of the response data; and for requests, the gateway proxy obtains the response entity data from the corresponding cache medium according to the cache storage policy and the characteristic data in the request, thereby improving the performance of the API gateway reverse proxy in forwarding routed requests. At the same time, the problems caused by using only a single cache medium, such as excessive memory usage with a memory cache or frequent disk reads and writes with a disk cache, are avoided.
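The following is a minimal sketch of the S1-S3 flow under assumed names; CachePolicy, CacheMedium and ExternalInfo are placeholders for the concepts described above, not terminology defined by this disclosure:

```go
package main

import "fmt"

// CacheMedium abstracts a storage backend such as a memory cache, a disk
// cache or redis.
type CacheMedium interface {
	Get(key string) ([]byte, bool)
	Set(key string, value []byte)
}

// memMedium is a toy in-memory medium used only to make the sketch runnable.
type memMedium struct{ data map[string][]byte }

func (m *memMedium) Get(k string) ([]byte, bool) { v, ok := m.data[k]; return v, ok }
func (m *memMedium) Set(k string, v []byte)      { m.data[k] = v }

// ExternalInfo stands in for external request or external response
// information; only the fields this sketch needs are modelled.
type ExternalInfo struct {
	IsRequest bool
	Key       string // cache key derived from the request
	Feature   string // first characteristic data, e.g. the request URL
	Body      []byte // response entity, for external response information
}

// CachePolicy is the per-interface cache storage policy of steps S1/S2: it
// maps characteristic data onto a concrete cache medium.
type CachePolicy struct {
	MatchMedium func(feature string) CacheMedium
}

// Process runs the S1-S3 flow for one piece of external information; for
// requests it returns the cached entity and whether it was found.
func Process(policies map[string]CachePolicy, iface string, info ExternalInfo) ([]byte, bool) {
	policy := policies[iface]                  // S1: policy chosen by interface
	medium := policy.MatchMedium(info.Feature) // S2: characteristic data -> cache medium
	if info.IsRequest {                        // S3: read for requests,
		return medium.Get(info.Key)
	}
	medium.Set(info.Key, info.Body) // S3: write for responses
	return nil, true
}

func main() {
	mem := &memMedium{data: map[string][]byte{}}
	policies := map[string]CachePolicy{
		"/v1/demo": {MatchMedium: func(string) CacheMedium { return mem }},
	}
	// Write a response entity, then read it back through the same flow.
	Process(policies, "/v1/demo", ExternalInfo{Key: "k1", Body: []byte("cached entity")})
	data, hit := Process(policies, "/v1/demo", ExternalInfo{IsRequest: true, Key: "k1"})
	fmt.Println(hit, string(data))
}
```

Running the sketch writes one response entity through the policy and then reads it back from the matched cache medium.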
Further, the first feature data of the external information in the step S2 includes a request parameter, a request domain name, a request URL, a return information header parameter, return data, and a return data size.
From the above description, the external information carries multiple kinds of characteristic data, which supports diversified storage policies and better matches the storage requirements.
Further, if the external information is external request information, the step S3 specifically includes:
judging whether the first cache medium has first response data corresponding to the external request information, if so, returning the first response data to the outside, otherwise, sending the external request information to a first server corresponding to a gateway to acquire the first response data, and storing the first response data into the first cache medium;
if the external information is external response information, the step S3 specifically includes:
and writing the external response information into the first cache medium.
As can be seen from the above description, different processing is performed according to the type of the external information and the corresponding cache storage policy, and when the required data cannot be obtained from the cache according to the cache storage policy, the external request is forwarded to the corresponding server in time.
Further, when the external information is external request information and the first characteristic data corresponding to the first cache storage policy is a returned data size, the process proceeds to step S4, where step S4 specifically includes:
s4, reading corresponding first response data from the corresponding cache media in sequence according to the preset cache media reading priority, if the first response data are successfully read, returning the first response data to the outside, and otherwise, sending the external request information to a first server corresponding to the gateway to obtain the first response data and storing the first response data into the first cache media.
As can be seen from the above description, when the first characteristic data corresponding to the cache storage policy is the size of the returned data, the size of the data that will be returned for the request cannot be known in advance, so the response data is read from the cache media according to the preset cache medium reading priority.
Further, the step S1 is specifically:
receiving external information, acquiring a first cache storage strategy corresponding to an interface of the external information from a plurality of preset cache storage strategies according to an interface type of the external information or a preset interface regular matching rule, and if the external information is external request information and first characteristic data corresponding to the first cache storage strategy is the request URL, executing step S21;
if the external information is external response information and the first feature data corresponding to the first cache storage policy is the size of the returned data, performing step S22;
the step S2 specifically includes:
s21, matching the request URL of the external information with a cache medium matching rule in the first cache storage strategy to obtain a first cache medium corresponding to the request URL, and entering the step S3;
s22, according to the cache medium matching rule in the first cache storage strategy, comparing the size of the returned data with a preset size threshold value in the first cache storage strategy, if the size of the returned data is larger than the preset size threshold value, selecting a corresponding first cache medium, otherwise, selecting a corresponding second cache medium, and executing the step S3.
As can be seen from the above description, the corresponding first cache storage policy may be obtained according to the interface type of the interface of the external information or a preset interface regular-expression matching rule; in the specific embodiments of the present invention, the request URL and the size of the returned data are used as the first characteristic data.
Referring to fig. 2, an information processing terminal of an API gateway includes a processor, a memory, and a computer program stored in the memory and operable on the processor, where the processor executes the computer program to implement the following steps:
s1, receiving external information, and acquiring a first cache storage strategy corresponding to an interface of the external information from a plurality of preset cache storage strategies, wherein the external information comprises external request information and external response information;
s2, acquiring first characteristic data in external information according to a first cache storage strategy, and matching the first characteristic data with a cache medium matching rule in the first cache storage strategy to obtain a corresponding first cache medium;
and S3, performing corresponding reading or writing response information operation on the first cache medium according to the external information.
From the above description, the beneficial effects of the present invention are as follows: different cache storage policies are selected for different interfaces; the returned response data is stored in a suitable API gateway cache medium according to the configuration of the cache storage policy and the characteristic data of the response data; and for requests, the gateway proxy obtains the response entity data from the corresponding cache medium according to the cache storage policy and the characteristic data in the request, thereby improving the performance of the API gateway reverse proxy in forwarding routed requests. At the same time, the problems caused by using only a single cache medium, such as excessive memory usage with a memory cache or frequent disk reads and writes with a disk cache, are avoided.
Further, the first feature data of the external information in the step S2 includes a request parameter, a request domain name, a request URL, a return information header parameter, return data, and a return data size.
From the above description, the external information carries multiple kinds of characteristic data, which supports diversified storage policies and better matches the storage requirements.
Further, if the external information is external request information, the step S3 specifically includes:
judging whether the first cache medium has first response data corresponding to the external request information, if so, returning the first response data to the outside, otherwise, sending the external request information to a first server corresponding to a gateway to acquire the first response data, and storing the first response data into the first cache medium;
if the external information is external response information, the step S3 specifically includes:
and writing the external response information into the first cache medium.
As can be seen from the above description, different processing is performed according to the type of the external information and the corresponding cache storage policy, and when the required data cannot be obtained from the cache according to the cache storage policy, the external request is forwarded to the corresponding server in time.
Further, when the external information is external request information and the first characteristic data corresponding to the first cache storage policy is a returned data size, the process proceeds to step S4, where step S4 specifically includes:
s4, reading corresponding first response data from the corresponding cache media in sequence according to the preset cache media reading priority, if the first response data are successfully read, returning the first response data to the outside, and otherwise, sending the external request information to a first server corresponding to the gateway to obtain the first response data and storing the first response data into the first cache media.
As can be seen from the above description, when the first characteristic data corresponding to the cache storage policy is the size of the returned data, the size of the data that will be returned for the request cannot be known in advance, so the response data is read from the cache media according to the preset cache medium reading priority.
Further, the step S1 is specifically:
receiving external information, acquiring a first cache storage strategy corresponding to an interface of the external information from a plurality of preset cache storage strategies according to an interface type of the external information or a preset interface regular matching rule, and if the external information is external request information and first characteristic data corresponding to the first cache storage strategy is the request URL, executing step S21;
if the external information is external response information and the first feature data corresponding to the first cache storage policy is the size of the returned data, performing step S22;
the step S2 specifically includes:
s21, matching the request URL of the external information with a cache medium matching rule in the first cache storage strategy to obtain a first cache medium corresponding to the request URL, and entering the step S3;
s22, according to the cache medium matching rule in the first cache storage strategy, comparing the size of the returned data with a preset size threshold value in the first cache storage strategy, if the size of the returned data is larger than the preset size threshold value, selecting a corresponding first cache medium, otherwise, selecting a corresponding second cache medium, and executing the step S3.
As can be seen from the above description, the corresponding first cache storage policy may be obtained according to the interface type of the interface of the external information or a preset interface regular-expression matching rule; in the specific embodiments of the present invention, the request URL and the size of the returned data are used as the first characteristic data.
Referring to fig. 1 and fig. 3, a first embodiment of the present invention is:
an information processing method of an API gateway comprises the following steps:
s1, receiving external information, and acquiring a first cache storage strategy corresponding to an interface of the external information from a plurality of preset cache storage strategies, wherein the external information comprises external request information and external response information;
s2, acquiring first characteristic data in external information according to a first cache storage strategy, and matching the first characteristic data with a cache medium matching rule in the first cache storage strategy to obtain a corresponding first cache medium;
the first characteristic data comprises a request parameter, a request domain name, a request URL, a return information header parameter, return data and a return data size.
In this embodiment, the step S1 specifically includes:
receiving external information, acquiring a first cache storage strategy corresponding to an interface of the external information from a plurality of preset cache storage strategies according to an interface type of the external information or a preset interface regular matching rule, and if the external information is external request information and first characteristic data corresponding to the first cache storage strategy is the request URL, executing step S21;
if the external information is external response information and the first feature data corresponding to the first cache storage policy is the size of the returned data, performing step S22;
the step S2 specifically includes:
s21, matching the request URL of the external information with a cache medium matching rule in the first cache storage strategy to obtain a first cache medium corresponding to the request URL, and entering the step S3;
s22, according to the cache medium matching rule in the first cache storage strategy, comparing the size of the returned data with a preset size threshold value in the first cache storage strategy, if the size of the returned data is larger than the preset size threshold value, selecting a corresponding first cache medium, otherwise, selecting a corresponding second cache medium, and executing the step S3.
In this embodiment, in order to give the interface better performance, a micro-service developer enables the gateway cache and configures a custom cache storage policy for it. Take as an example a policy that selects different cache media according to the size of the returned data: when configuring the policy, the developer sets the characteristic data to the returned data size, sets a specific threshold value, and defines the cache medium matching rule so that response data whose returned size is smaller than the threshold is stored in the memory cache medium and response data larger than the threshold is stored in the disk cache medium; the developer additionally configures the cache medium reading priority.
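A sketch of what such a developer-configured policy could look like in code; the type name, field names and threshold value below are assumptions made for illustration and are not terminology from this disclosure:

```go
package gateway

// SizePolicy is a sketch of the developer-configured policy described above.
type SizePolicy struct {
	ThresholdBytes int64    // returned data larger than this goes to the large medium
	SmallMedium    string   // medium for responses below the threshold, e.g. "memory"
	LargeMedium    string   // medium for responses above the threshold, e.g. "disk"
	ReadPriority   []string // order in which media are queried when the size is unknown
}

// ExamplePolicy mirrors the configuration in this embodiment: small responses
// cached in memory, large ones on disk, memory queried first on reads. The
// threshold value is an arbitrary example.
var ExamplePolicy = SizePolicy{
	ThresholdBytes: 64 * 1024,
	SmallMedium:    "memory",
	LargeMedium:    "disk",
	ReadPriority:   []string{"memory", "disk"},
}
```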
S3, performing corresponding reading or writing response information operation on the first cache medium according to the external information;
in this embodiment, when external response information enters the cache storage policy, the size of the returned data is checked: if it is larger than the preset threshold, the disk cache medium is selected for caching; otherwise, the memory cache medium is selected.
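A minimal sketch of this write-path decision, with the threshold passed in explicitly and illustrative medium names:

```go
package gateway

// ChooseWriteMedium is a sketch of the write-path decision described above:
// responses whose returned data size exceeds the configured threshold are
// cached on disk, smaller ones in memory.
func ChooseWriteMedium(returnedSize, thresholdBytes int64) string {
	if returnedSize > thresholdBytes {
		return "disk" // large payload: avoid inflating the memory cache
	}
	return "memory" // small payload: avoid many small files on disk
}
```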
If the external information is external request information, the step S3 specifically includes:
judging whether the first cache medium has first response data corresponding to the external request information, if so, returning the first response data to the outside, otherwise, sending the external request information to a first server corresponding to a gateway to acquire the first response data, and storing the first response data into the first cache medium;
if the external information is external response information, the step S3 specifically includes:
writing the external response information to the first cache medium;
when the external information is external request information and the first characteristic data corresponding to the first cache storage policy is a returned data size, the process proceeds to step S4, where step S4 specifically includes:
s4, reading corresponding first response data from the corresponding cache media in sequence according to the preset cache media reading priority, if the first response data are successfully read, returning the first response data to the outside, and otherwise, sending the external request information to a first server corresponding to the gateway to obtain the first response data and storing the first response data into the first cache media.
In this embodiment, after receiving a request, the API gateway obtains the cache storage policy corresponding to the requested interface. In this example the characteristic data corresponding to the cache storage policy is the returned data size, which is not yet known for the incoming request, so the cache is queried from the cache media in turn according to the reading priority configured in advance by the user.
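A sketch of this read path under assumed names; the CacheMedium interface, the upstream fetch function and the store callback are placeholders rather than parts of the disclosed gateway:

```go
package gateway

// CacheMedium abstracts one cache backend (memory, disk, redis, ...).
type CacheMedium interface {
	Get(key string) ([]byte, bool)
}

// ReadWithPriority is a sketch of the read path described above: the cache
// media are queried in the configured priority order, and only on a miss
// everywhere is the request forwarded to the upstream server. The store
// callback lets the caller apply the size-based policy to the fresh response.
func ReadWithPriority(media []CacheMedium, key string,
	fetchFromUpstream func(key string) ([]byte, error),
	store func(key string, data []byte)) ([]byte, error) {

	for _, m := range media {
		if data, ok := m.Get(key); ok {
			return data, nil // cache hit: no call to the downstream service
		}
	}
	data, err := fetchFromUpstream(key) // miss everywhere: go to the server behind the gateway
	if err != nil {
		return nil, err
	}
	store(key, data) // cache for subsequent requests
	return data, nil
}
```

Passing a store callback keeps the size-based write decision in one place, so a response fetched on a miss ends up in the medium the policy would have chosen for it.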
Referring to fig. 1 and fig. 3, a second embodiment of the present invention is:
based on the first embodiment, in this embodiment a user wants all caches for certain specific URLs to be stored in a redis cache medium. The user first configures a cache storage policy named redis_flexible; the rule in this policy is that as long as the path of the request URL contains flexible_store, the cached data of that external request is stored in redis.
In addition, the user needs to configure which interface type or interface regular expression is mapped to the cache storage policy named redis_flexible. After configuration, whenever the request interface of an external request matches the corresponding type or regular expression, the request enters the redis_flexible cache storage policy.
In this embodiment, after entering the redis_flexible cache storage policy, the gateway further checks whether the request satisfies the cache medium matching rule of that policy, namely whether the path of the request URL contains flexible_store; if it does, the result of the request is cached in the designated redis cache medium. Likewise, when the cache is read, the corresponding cache storage medium is queried first to see whether a cached entry exists.
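A sketch of the redis_flexible matching rule described in this embodiment; the function names are assumptions, and a real implementation would hand back a redis client handle rather than a medium name:

```go
package gateway

import "strings"

// matchesRedisFlexible is a sketch of the cache medium matching rule of the
// redis_flexible policy: the rule fires when the request URL path contains
// the marker "flexible_store".
func matchesRedisFlexible(requestPath string) bool {
	return strings.Contains(requestPath, "flexible_store")
}

// MediumForRequest returns the medium the redis_flexible policy selects for a
// request path; non-matching requests fall back to a caller-supplied default.
func MediumForRequest(requestPath, defaultMedium string) string {
	if matchesRedisFlexible(requestPath) {
		return "redis"
	}
	return defaultMedium
}
```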
Referring to fig. 2, a third embodiment of the present invention is:
an information processing terminal 1 of an API gateway comprises a processor 2, a memory 3 and a computer program stored on the memory 3 and operable on the processor 2, wherein the processor 2 implements the steps of the first or second embodiment when executing the computer program.
In summary, in the information processing method and terminal of the API gateway provided by the present invention, different cache storage policies are selected for different interfaces; the returned response data is stored in a suitable API gateway cache medium according to the configuration of the cache storage policy and the characteristic data of the response data; and for requests, the gateway proxy obtains the response entity data from the corresponding cache medium according to the cache storage policy and the characteristic data in the request, thereby improving the performance of the API gateway reverse proxy in forwarding routed requests. At the same time, the problems caused by using only a single cache medium, such as excessive memory usage with a memory cache or frequent disk reads and writes with a disk cache, are avoided.
The above description is only an embodiment of the present invention, and not intended to limit the scope of the present invention, and all equivalent changes made by using the contents of the present specification and the drawings, or applied directly or indirectly to the related technical fields, are included in the scope of the present invention.

Claims (10)

1. An information processing method of an API gateway is characterized by comprising the following steps:
s1, receiving external information, and acquiring a first cache storage strategy corresponding to an interface of the external information from a plurality of preset cache storage strategies, wherein the external information comprises external request information and external response information;
s2, acquiring first characteristic data in external information according to a first cache storage strategy, and matching the first characteristic data with a cache medium matching rule in the first cache storage strategy to obtain a corresponding first cache medium;
and S3, performing corresponding reading or writing response information operation on the first cache medium according to the external information.
2. The information processing method of the API gateway of claim 1, wherein the first feature data in step S2 includes a request parameter, a request domain name, a request URL, a return information header parameter, a return data, and a return data size.
3. The information processing method of the API gateway according to claim 1, wherein if the external information is external request information, the step S3 specifically includes:
judging whether the first cache medium has first response data corresponding to the external request information, if so, returning the first response data to the outside, otherwise, sending the external request information to a first server corresponding to a gateway to acquire the first response data, and storing the first response data into the first cache medium;
if the external information is external response information, the step S3 specifically includes:
and writing the external response information into the first cache medium.
4. The information processing method of the API gateway as recited in claim 1, wherein when the external information is external request information and the first characteristic data corresponding to the first cache storage policy is a returned data size, the method proceeds to step S4, and step S4 specifically includes:
s4, reading corresponding first response data from the corresponding cache media in sequence according to the preset cache media reading priority, if the first response data are successfully read, returning the first response data to the outside, and otherwise, sending the external request information to a first server corresponding to the gateway to obtain the first response data and storing the first response data into the first cache media.
5. The information processing method of the API gateway according to claim 1, wherein the step S1 specifically includes:
receiving external information, acquiring a first cache storage strategy corresponding to an interface of the external information from a plurality of preset cache storage strategies according to an interface type of the external information or a preset interface regular matching rule, and if the external information is external request information and first characteristic data corresponding to the first cache storage strategy is the request URL, executing step S21;
if the external information is external response information and the first feature data corresponding to the first cache storage policy is the size of the returned data, performing step S22;
the step S2 specifically includes:
s21, matching the request URL of the external information with a cache medium matching rule in the first cache storage strategy to obtain a first cache medium corresponding to the request URL, and entering the step S3;
s22, according to the cache medium matching rule in the first cache storage strategy, comparing the size of the returned data with a preset size threshold value in the first cache storage strategy, if the size of the returned data is larger than the preset size threshold value, selecting a corresponding first cache medium, otherwise, selecting a corresponding second cache medium, and executing the step S3.
6. An information processing terminal of an API gateway, comprising a processor, a memory, and a computer program stored on the memory and operable on the processor, wherein the processor implements the following steps when executing the computer program:
s1, receiving external information, and acquiring a first cache storage strategy corresponding to an interface of the external information from a plurality of preset cache storage strategies, wherein the external information comprises external request information and external response information;
s2, acquiring first characteristic data in external information according to a first cache storage strategy, and matching the first characteristic data with a cache medium matching rule in the first cache storage strategy to obtain a corresponding first cache medium;
and S3, performing corresponding reading or writing response information operation on the first cache medium according to the external information.
7. The information processing terminal of an API gateway as recited in claim 6, wherein said first characteristic data of said external information in said step S2 includes a request parameter, a request domain name, a request URL, a return information header parameter, return data, and a return data size.
8. The information processing terminal of an API gateway according to claim 6, wherein if the external information is external request information, said step S3 specifically includes:
judging whether first response data corresponding to the external request information exist in the first cache medium, if so, returning the first response data to the outside, otherwise, sending the external request information to a first server corresponding to a gateway to acquire the first response data, and storing the first response data into the first cache medium;
if the external information is external response information, the step S3 specifically includes:
and writing the external response information into the first cache medium.
9. The information processing terminal of an API gateway as recited in claim 6, wherein when the external information is external request information and the first characteristic data corresponding to the first cache storage policy is a returned data size, the process proceeds to step S4, and step S4 specifically includes:
s4, reading corresponding first response data from the corresponding cache media in sequence according to the preset cache media reading priority, if the first response data are successfully read, returning the first response data to the outside, and otherwise, sending the external request information to a first server corresponding to the gateway to obtain the first response data and storing the first response data into the first cache media.
10. The information processing terminal of an API gateway according to claim 6, wherein said step S1 is specifically:
receiving external information, acquiring a first cache storage strategy corresponding to an interface of the external information from a plurality of preset cache storage strategies according to an interface type of the external information or a preset interface regular matching rule, and if the external information is external request information and first characteristic data corresponding to the first cache storage strategy is the request URL, executing step S21;
if the external information is external response information and the first feature data corresponding to the first cache storage policy is the size of the returned data, performing step S22;
the step S2 specifically includes:
s21, matching the request URL of the external information with a cache medium matching rule in the first cache storage strategy to obtain a first cache medium corresponding to the request URL, and entering the step S3;
s22, according to the cache medium matching rule in the first cache storage strategy, comparing the size of the returned data with a preset size threshold value in the first cache storage strategy, if the size of the returned data is larger than the preset size threshold value, selecting a corresponding first cache medium, otherwise, selecting a corresponding second cache medium, and executing the step S3.
CN202011348685.0A 2020-11-26 2020-11-26 Information processing method and terminal of API gateway Active CN112468583B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011348685.0A CN112468583B (en) 2020-11-26 2020-11-26 Information processing method and terminal of API gateway

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011348685.0A CN112468583B (en) 2020-11-26 2020-11-26 Information processing method and terminal of API gateway

Publications (2)

Publication Number Publication Date
CN112468583A true CN112468583A (en) 2021-03-09
CN112468583B CN112468583B (en) 2023-09-15

Family

ID=74808577

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011348685.0A Active CN112468583B (en) 2020-11-26 2020-11-26 Information processing method and terminal of API gateway

Country Status (1)

Country Link
CN (1) CN112468583B (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6412045B1 (en) * 1995-05-23 2002-06-25 Lsi Logic Corporation Method for transferring data from a host computer to a storage media using selectable caching strategies
US20120221670A1 (en) * 2011-02-24 2012-08-30 Frydman Daniel Nathan Methods, circuits, devices, systems and associated computer executable code for caching content
CN107992432A (en) * 2017-11-28 2018-05-04 福建中金在线信息科技有限公司 The method and terminal device of a kind of data buffer storage
CN110851474A (en) * 2018-07-26 2020-02-28 深圳市优必选科技有限公司 Data query method, database middleware, data query device and storage medium
CN109324761A (en) * 2018-10-09 2019-02-12 郑州云海信息技术有限公司 A kind of data cache method, device, equipment and storage medium
CN111090449A (en) * 2018-10-24 2020-05-01 北京金山云网络技术有限公司 API service access method and device and electronic equipment
CN111294372A (en) * 2018-12-07 2020-06-16 北京京东尚科信息技术有限公司 Method, device and system for realizing cache in proxy server
CN110413543A (en) * 2019-06-17 2019-11-05 中国科学院信息工程研究所 A kind of API gateway guarantee service high availability method and system based on fusing and L2 cache
CN111930316A (en) * 2020-09-09 2020-11-13 上海七牛信息技术有限公司 Cache read-write system and method for content distribution network

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114253707A (en) * 2021-11-04 2022-03-29 华能信息技术有限公司 Micro-service request method based on API gateway
CN114253707B (en) * 2021-11-04 2024-03-12 华能信息技术有限公司 Micro-service request method based on API gateway

Also Published As

Publication number Publication date
CN112468583B (en) 2023-09-15

Similar Documents

Publication Publication Date Title
CN104331428B (en) The storage of a kind of small documents and big file and access method
US8086634B2 (en) Method and apparatus for improving file access performance of distributed storage system
CN109271355B (en) Method and device for cleaning log file cache
US7827280B2 (en) System and method for domain name filtering through the domain name system
US20140244727A1 (en) Method and apparatus for streaming multimedia content of server by using cache
US20030172172A1 (en) Method and system of performing transactions using shared resources and different applications
CN106326499B (en) A kind of data processing method and device
WO2014047193A2 (en) Mail indexing and searching using hierarchical caches
US9317470B1 (en) Method and system for incremental cache lookup and insertion
CN109766318A (en) File reading and device
CN100394404C (en) System and method for management of metadata
CN106681990A (en) Method for reading caching data under mobile cloud storage environment
TW437205B (en) An internet caching system and a method and an arrangement in such a system
CN112468583B (en) Information processing method and terminal of API gateway
CN105915619B (en) Take the cyberspace information service high-performance memory cache method of access temperature into account
CN107015978B (en) Webpage resource processing method and device
CN101576854A (en) File access method, device and system
CN110147345A (en) A kind of key assignments storage system and its working method based on RDMA
CN105208100B (en) A kind of processing method of interface data
WO2009082938A1 (en) A method, system and apparatus of affair control
JP6088853B2 (en) COMMUNICATION DEVICE, COMMUNICATION METHOD, AND COMMUNICATION PROGRAM
CN109117288B (en) Message optimization method for low-delay bypass
WO2020024709A1 (en) Method and apparatus for classifying and acquiring cache pages, and electronic device
WO2010031297A1 (en) Method of wireless application protocol (wap) gateway pull service and system thereof
CN108255898A (en) Page display method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant