CN113158097A - Network access processing method, device, equipment and system

Network access processing method, device, equipment and system

Info

Publication number: CN113158097A
Application number: CN202010012423.0A
Authority: CN (China)
Prior art keywords: cache, key value, query, network access, access request
Legal status: Pending (the legal status is an assumption and is not a legal conclusion)
Other languages: Chinese (zh)
Inventor: 黎海明
Original and current assignee: Guangzhou Tianxia Technology Co ltd
Priority and filing date: 2020-01-07
Publication date: 2021-07-23

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/957Browsing optimisation, e.g. caching or content distillation
    • G06F16/9574Browsing optimisation, e.g. caching or content distillation of access to content, e.g. by caching

Abstract

The present disclosure relates to a network access processing method, device, equipment and system. The network access processing method includes the following steps: obtaining a query key value of a network access request; querying a hierarchical cache according to the query key value; and returning a query result matching the query key value. The scheme provided by the disclosure can increase the user's access speed and improve the user experience.

Description

Network access processing method, device, equipment and system
Technical Field
The present disclosure relates to the field of mobile internet technologies, and in particular, to a method, an apparatus, a device, and a system for processing network access.
Background
At present, more and more users choose to rent a car when they travel, so online car rental platforms are developing rapidly.
An online car rental platform system can provide car rental services for a variety of user requirements. The services involve data in many dimensions of the car rental business, such as customer source, destination and number of rental days, and each added dimension greatly increases the amount of data. If the required data is fetched from the database and recomputed every time it is accessed, the data pressure on the system is high and the system's response time is slow.
Therefore, the network access processing method in the related art cannot respond to a user's access request in time, which reduces the user's access speed and degrades the user experience.
Disclosure of Invention
In order to overcome the problems in the related art, the present disclosure provides a method, an apparatus, a device, and a system for processing network access, which can improve the access speed of a user and improve the user experience.
According to a first aspect of the embodiments of the present disclosure, there is provided a network access processing method, including:
obtaining a query key value of a network access request;
querying in a hierarchical cache according to the query key value;
and returning a query result matched with the query key value.
In one embodiment, the querying in the hierarchical cache according to the query key value includes:
querying an Nth-level cache for a matching cache key value according to the query key value;
and when the matching cache key value is not found in the Nth-level cache, querying an (N+1)th-level cache for the matching cache key value, wherein the data range of the (N+1)th-level cache is larger than that of the Nth-level cache, and N is greater than or equal to 1.
In one embodiment, the obtaining a query key value of a network access request includes:
and acquiring a query key value determined according to a query condition in the network access request.
In one embodiment, the method further comprises:
and hierarchically storing the cache according to the network access record.
In one embodiment, the hierarchically storing the cache according to the network access record includes:
analyzing the network access requests in the network access records to determine cache key values;
and hierarchically storing the caches of the network access records to different storage locations according to the different data ranges corresponding to the cache key values, and setting cache IDs.
In an embodiment, after the caches of the network access records are hierarchically stored to different storage locations according to the different data ranges corresponding to the cache key values and the cache IDs are set, the method further includes:
deleting caches that have the same cache key value and a cache ID smaller than the maximum cache ID.
According to a second aspect of the embodiments of the present disclosure, there is provided a network access processing apparatus including:
the information acquisition module is used for acquiring a query key value of the network access request;
the cache query module is used for querying in the hierarchical cache according to the query key value acquired by the information acquisition module;
and the result processing module is used for returning the query result which is queried by the cache query module and is matched with the query key value.
In one embodiment, the apparatus further comprises:
and the hierarchical cache module is used for hierarchically storing the cache according to the network access record.
In one embodiment, the cache query module comprises:
the first query submodule is used for querying an Nth-level cache for a matching cache key value according to the query key value;
and the second query submodule is used for querying the (N+1)th-level cache for the matching cache key value when the first query submodule does not find the matching cache key value in the Nth-level cache, wherein the data range of the (N+1)th-level cache is larger than that of the Nth-level cache, and N is greater than or equal to 1.
According to a third aspect of the embodiments of the present disclosure, there is provided a server apparatus including:
a processor; and
a memory having executable code stored thereon, which when executed by the processor, causes the processor to perform the method as described above.
According to a fourth aspect of the embodiments of the present disclosure, there is provided a network access processing system including:
the client device is used for sending a network access request to the server device;
the server device is used for acquiring a query key value of a network access request of the client device; querying in a hierarchical cache according to the query key value; and returning a query result matched with the query key value.
According to a fifth aspect of embodiments of the present disclosure, there is provided a non-transitory machine-readable storage medium having stored thereon executable code, which when executed by a processor of an electronic device, causes the processor to perform the method as described above.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
according to the technical scheme provided by the embodiment of the disclosure, after a network access request is received, a query key value of the network access request is obtained, then query is carried out in a hierarchical cache according to the query key value, and a query result matched with the query key value is returned. Because the cache is used for hierarchical storage, not all access requests can obtain data from the same cache, the query content can be obtained more quickly, so that the access requests of the user can be responded in time, the access speed of the user is increased, and the user experience is improved.
Further, in the embodiment of the present disclosure, a matching cache key value is first queried in an nth level cache according to the query key value, and when the matching cache key value is not queried in the nth level cache, the matching cache key value is then queried in an N +1 th level cache, where a data range of the N +1 th level cache is larger than a data range of the nth level cache, so that a query and a matching can be performed first in a cache with a smaller data range, if the matching cache key value is queried, a query result can be directly returned, and only when the matching cache key value is not queried, the query and the matching are performed in a cache with a larger data range, so that query content can be obtained more quickly, and an access request of a user can be responded in time.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The above and other objects, features and advantages of the present disclosure will become more apparent by describing in greater detail exemplary embodiments thereof with reference to the attached drawings, in which like reference numerals generally represent like parts throughout.
Fig. 1 is a flow diagram illustrating a method of network access processing according to an exemplary embodiment of the present disclosure;
fig. 2 is another schematic diagram illustrating a flow of a network access processing method according to an exemplary embodiment of the present disclosure;
fig. 3 is another schematic diagram illustrating a flow of a network access processing method according to an exemplary embodiment of the present disclosure;
fig. 4 is a schematic structural diagram illustrating a network access processing device according to an exemplary embodiment of the present disclosure;
fig. 5 is another schematic diagram illustrating a structure of a network access processing device according to an exemplary embodiment of the present disclosure;
FIG. 6 is a block diagram illustrating a network access processing system according to an exemplary embodiment of the present disclosure;
FIG. 7 is a schematic diagram illustrating a computing device, according to an example embodiment of the present disclosure.
Detailed Description
Preferred embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While the preferred embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in this disclosure and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It should be understood that although the terms "first," "second," "third," etc. may be used in this disclosure to describe various information, these information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present disclosure, "a plurality" means two or more unless specifically limited otherwise.
The present disclosure provides a network access processing method, which can improve the access speed of a user and enhance the user experience.
Technical solutions of embodiments of the present disclosure are described in detail below with reference to the accompanying drawings.
Fig. 1 is a flowchart illustrating a network access processing method according to an exemplary embodiment of the present disclosure.
Referring to fig. 1, the method includes:
in step 101, a query key value of a network access request is obtained.
This step may obtain a query key value determined from the query condition in the network access request.
In step 102, a query is performed in a hierarchical cache according to the query key value.
In this step, an Nth-level cache may be queried for a matching cache key value according to the query key value; when no matching cache key value is found in the Nth-level cache, the (N+1)th-level cache is queried for the matching cache key value, where the data range of the (N+1)th-level cache is larger than that of the Nth-level cache, and N is greater than or equal to 1.
In step 103, a query result matching the query key value is returned.
In this step, after a query result matching the query key value is found in the hierarchical cache, that query result is returned.
It can be seen from this embodiment that, because the cache is stored hierarchically, not all access requests fetch data from the same cache, so the queried content can be obtained more quickly; the user's access request can be responded to in time, which increases the user's access speed and improves the user experience.
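For illustration only, the following Python sketch shows the flow of Fig. 1, assuming each cache level is an in-memory dictionary that maps cache key values to query results and that the levels are ordered from the smallest data range to the largest. The function name, the data layout and the sample key values are assumptions made for this sketch, not part of the disclosure.

# A minimal sketch of steps 101-103, under the assumptions stated above.
def query_hierarchical_cache(levels, query_key):
    """Search level N first; fall back to level N+1 when no match is found."""
    for level in levels:          # levels[0] has the smallest data range
        result = level.get(query_key)
        if result is not None:
            return result         # a matching cache key value was found
    return None                   # no level contains a matching entry

# Usage example with a two-level hierarchy (sample data only).
level_1 = {"USA-Los Angeles-A": {"offers": ["offer-1", "offer-2"]}}
level_2 = {"ALL-Los Angeles-A": {"offers": ["offer-1", "offer-2", "offer-3"]}}
print(query_hierarchical_cache([level_1, level_2], "USA-Los Angeles-A"))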
Fig. 2 is another schematic diagram illustrating a flow of a network access processing method according to an exemplary embodiment of the disclosure. Fig. 2 presents the solution of the present disclosure in more detail with respect to fig. 1.
Referring to fig. 2, the method includes:
in step 201, the cache is hierarchically stored according to the network access record.
This step includes: analyzing the network access requests in the network access records to determine cache key values; hierarchically storing the caches of the network access records to different storage locations according to the different data ranges corresponding to the cache key values, and setting cache IDs; and deleting caches that have the same cache key value and a cache ID smaller than the maximum cache ID.
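As a concrete illustration of step 201, the sketch below stores one access record into a two-level cache: the record is parsed into a cache key value, the level is chosen by the data range of that key, an incrementing cache ID is set, and an older entry with the same key and a smaller cache ID is replaced. The record fields ("source", "destination", "supplier", "result") and the level-selection rule shown are assumptions for this example only.

import itertools

_next_cache_id = itertools.count(1)   # cache IDs are assigned incrementally

def store_access_record(levels, record):
    """Sketch of step 201: derive the cache key value, pick a level by data range,
    set a cache ID and let the newest entry supersede older ones for that key."""
    cache_key = "-".join(str(record[f]) for f in ("source", "destination", "supplier"))
    # Assumed preset rule: an individual customer source goes to level 1 (smaller
    # data range); the general condition "ALL" sources goes to level 2 (larger range).
    level = levels[0] if record["source"] != "ALL" else levels[1]
    entry = {"cache_id": next(_next_cache_id), "result": record["result"]}
    current = level.get(cache_key)
    if current is None or current["cache_id"] < entry["cache_id"]:
        level[cache_key] = entry      # the older entry (smaller cache ID) is dropped

# Usage example (sample data only).
levels = [{}, {}]
store_access_record(levels, {"source": "USA", "destination": "Los Angeles",
                             "supplier": "A", "result": {"offers": []}})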
In step 202, after receiving a new network access request, a query key value of the network access request is obtained.
In this step, after receiving a new network access request, a query key value determined according to a query condition in the network access request may be obtained.
In step 203, a query is performed in the hierarchical cache according to the query key value.
In this step, an Nth-level cache may be queried for a matching cache key value according to the query key value; when no matching cache key value is found in the Nth-level cache, the (N+1)th-level cache is queried for the matching cache key value, where the data range of the (N+1)th-level cache is larger than that of the Nth-level cache, and N is greater than or equal to 1.
In step 204, query results that match the query key value are returned.
In this step, after a query result matching the query key value is found in the hierarchical cache, that query result is returned.
It can be seen from this embodiment that a matching cache key value is first queried in the Nth-level cache according to the query key value, and only when no matching cache key value is found in the Nth-level cache is the (N+1)th-level cache queried, where the data range of the (N+1)th-level cache is larger than that of the Nth-level cache. Query and matching are therefore performed first in the cache with the smaller data range; if a matching cache key value is found, the query result can be returned directly, and only when no match is found is the query performed in the cache with the larger data range. In this way the queried content can be obtained more quickly, and the user's access request can be responded to in time.
Fig. 3 is another schematic diagram illustrating a flow of a network access processing method according to an exemplary embodiment of the disclosure. Fig. 3 presents the solution of the present disclosure in more detail than Figs. 1 and 2. In Fig. 3 the scheme is described through the interaction between the server and the user client, taking a two-level cache query as an example.
Referring to fig. 3, the method includes:
in step 301, the server obtains a network access record of the user client.
Users in various regions can send network access requests to the server through user clients to query the various information they need, such as car rental information.
In step 302, the server stores the cache of the network access record in a hierarchical manner according to a preset cache hierarchical storage rule.
A cache HEAD system may be provided in the server for storing the cache. It should be noted that the cache HEAD system may also be provided independently from the server, and the present disclosure is not limited thereto.
In this step, the server analyzes the network access requests in the network access records to determine cache key values; according to the different data ranges corresponding to the cache key values and the preset hierarchical rules, it hierarchically stores the caches of the network access records to different storage locations and sets cache IDs. In addition, caches with the same cache key value and a cache ID smaller than the maximum cache ID may be deleted to save cache space.
For example, if analysis of a network access request in the network access records shows that the query conditions are customer source USA, destination Los Angeles and supplier A, the cache key value is the combination customer source-destination-supplier, which forms the cache key value "USA-Los Angeles-A". The data range of this cache key value covers only the data whose customer source is the USA, whose destination is Los Angeles and whose supplier is A. If the preset hierarchical rule defines the data range of an individual customer source as the first-level cache, the cache of this network access record is stored at the storage location of the first-level cache and a cache ID is set. Cache IDs are generally assigned incrementally in sequence.
It should be noted that if, in the customer source-destination-supplier combination, the customer source is not an individual source such as the USA but all sources, the data range is large; if the preset hierarchical rule defines the data range of all sources as the second-level cache, the cache of that network access record is stored at the storage location of the second-level cache and a cache ID is set. The hierarchical rule of the present disclosure may be set according to actual needs, for example, but not limited to, classification by customer source range or destination range.
In the scheme of the disclosure, the data range of the second-level cache is larger than that of the first-level cache. It should be noted that the present disclosure is only illustrated with two cache levels, but is not limited to this; the cache may also be divided into three levels, four levels, and so on, according to actual needs. In other words, a plurality of cache levels may be set, with the data range of the (N+1)th-level cache larger than the data range of the Nth-level cache, where N ≥ 1.
It should be noted that hierarchical storage may include both adding caches and deleting caches.
When a new cache is added, the newly set cache ID corresponds to the cache key value. If the customer source is a single country or the destination is a single city, the cache is stored in the first-level cache according to the preset hierarchical rule; if the customer source is a general condition, for example all customer sources or all destinations, the cache is stored in the second-level cache according to the preset hierarchical rule. To free cache space for new caches, caches may be deleted periodically or at any time; in both the first-level and the second-level cache, caches with the same cache key value and a cache ID smaller than the maximum cache ID may be deleted. The cache deletion rules are not limited by this disclosure.
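The deletion rule just described can be sketched as follows, assuming a cache level is kept as a list of entries that each carry their cache key value and cache ID; for every cache key value only the entry with the largest cache ID is kept. The entry layout is an assumption made for illustration.

def prune_level(entries):
    """Delete entries that share a cache key value with another entry but have a
    smaller cache ID, keeping only the entry with the maximum cache ID per key."""
    newest = {}
    for entry in entries:   # entry: {"key": ..., "cache_id": ..., "result": ...}
        kept = newest.get(entry["key"])
        if kept is None or kept["cache_id"] < entry["cache_id"]:
            newest[entry["key"]] = entry
    return list(newest.values())

# Example: the entry with cache_id 3 supersedes the entry with cache_id 1.
level = [
    {"key": "USA-Los Angeles-A", "cache_id": 1, "result": "stale"},
    {"key": "USA-Los Angeles-A", "cache_id": 3, "result": "fresh"},
]
level = prune_level(level)   # only the cache_id 3 entry remains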
In step 303, the server receives a network access request sent by a user client.
In this step, the server may receive network access requests sent by the respective user clients. For example, users in different countries may log in to a Web page or an APP interface through a user client and enter query conditions to obtain information, for example car rental information; a network access request containing the query conditions is then sent to the server.
In step 304, the server determines a query key value according to the query condition in the network access request.
The server analyzes the query conditions in the network access request of the user client, for example the customer source, destination, supplier and other elements, and combines them to form a query key value. For example, if the customer source is the USA, the destination is Los Angeles and the supplier is A, the query key value "USA-Los Angeles-A" can be formed. It should be noted that the above combination of query key values is only illustrative and is not limiting.
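The combination of query conditions into a query key value can be written as a small helper; the field names and the "-" separator are assumptions chosen to match the example above.

def build_query_key(conditions):
    """Combine customer source, destination and supplier into one query key value."""
    return "-".join(str(conditions[field]) for field in ("source", "destination", "supplier"))

# With source USA, destination Los Angeles and supplier A this yields "USA-Los Angeles-A".
key = build_query_key({"source": "USA", "destination": "Los Angeles", "supplier": "A"})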
In step 305, the server determines, according to the query key value, whether a matching cache key value can be found in the first-level cache; if so, the method proceeds to step 307, and if not, it proceeds to step 306.
According to the query key value, the server first checks whether the first-level cache contains a cache key value matching the query key value. If a matching cache key value is found in the first-level cache, the method proceeds to step 307; otherwise it proceeds to step 306.
For example, based on the query key value "USA-Los Angeles-A", the first-level cache is queried for a cache key value "USA-Los Angeles-A" matching the query key value.
That is, in this step, the first-level cache is first searched for a cache key value corresponding to "customer source USA + destination Los Angeles + supplier A".
In step 306, the server determines whether a matching cache key value is queried in the second-level cache according to the query key value, if so, step 308 is performed, and if not, step 309 is performed.
Because no cache key value matching the query key value was found in the first-level cache, the server determines, according to the query key value, whether a matching cache key value can be found in the second-level cache. If a matching cache key value is found in the second-level cache, the method proceeds to step 308; otherwise it proceeds to step 309.
In step 307, a query result matching the query key value is returned according to the first level cache.
Because a cache key value matching the query key value was found in the first-level cache, the query result matching the query key value is returned from the first-level cache.
In step 308, a query result matching the query key value is returned according to the second level cache.
Because a cache key value matching the query key value was found in the second-level cache, the query result matching the query key value is returned from the second-level cache.
In step 309, query results that do not match the query key value are returned.
Because no cache key value matching the query key value was found in the second-level cache, a query result indicating that there is no match for the query key value is returned.
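The branching of steps 305 to 309 can be summarized by the sketch below, again assuming the two cache levels are dictionaries keyed by cache key value; the shape of the returned result for a miss is an assumption of this example.

def handle_query(first_level, second_level, query_key):
    """Steps 305-309: try the first-level cache, then the second-level cache,
    and report a miss when neither level holds a matching cache key value."""
    if query_key in first_level:            # step 305 hit -> step 307
        return {"matched": True, "result": first_level[query_key]}
    if query_key in second_level:           # step 306 hit -> step 308
        return {"matched": True, "result": second_level[query_key]}
    return {"matched": False, "result": None}   # step 309: no matching key value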
By applying the scheme of the present disclosure, hierarchical cache rules can be set according to customer source and supplier, and a multi-dimensional hierarchical mechanism can be adopted, so that queries hit the cache levels from top to bottom, from the smallest data range to the largest. Because the cache is stored hierarchically, not all access requests fetch data from the same cache, so the queried content can be obtained more quickly; the user's access request can be responded to in time, which increases the user's access speed and improves the user experience.
Corresponding to the embodiment of the application function implementation method, the disclosure also provides a network access processing device, terminal equipment and corresponding embodiments.
Fig. 4 is a schematic structural diagram illustrating a network access processing device according to an exemplary embodiment of the present disclosure.
Referring to fig. 4, a network access processing apparatus according to an embodiment of the present disclosure includes: an information acquisition module 41, a cache query module 42, and a result processing module 43.
An information obtaining module 41, configured to obtain a query key value of the network access request. The information obtaining module 41 may obtain a query key value determined according to a query condition in the network access request.
A cache query module 42, configured to query the hierarchical cache according to the query key value obtained by the information obtaining module 41. The cache query module 42 may query an Nth-level cache for a matching cache key value according to the query key value; when no matching cache key value is found in the Nth-level cache, it queries the (N+1)th-level cache for the matching cache key value, where the data range of the (N+1)th-level cache is larger than that of the Nth-level cache, and N is greater than or equal to 1.
And the result processing module 43 is configured to return the query result that is queried by the cache query module 42 and matches the query key value.
It can be seen from this embodiment that, in the technical solution provided by the embodiments of the present disclosure, after a network access request is received, a query key value of the network access request is obtained, a query is performed in a hierarchical cache according to the query key value, and a query result matching the query key value is returned. Because the cache is stored hierarchically, not all access requests fetch data from the same cache, so the queried content can be obtained more quickly; the user's access request can be responded to in time, which increases the user's access speed and improves the user experience.
Fig. 5 is another schematic diagram illustrating a structure of a network access processing apparatus according to an exemplary embodiment of the present disclosure.
Referring to fig. 5, a network access processing apparatus according to an embodiment of the present disclosure includes: an information acquisition module 41, a cache query module 42, a result processing module 43, and a hierarchical cache module 44.
The functions of the information obtaining module 41, the cache querying module 42 and the result processing module 43 can be referred to the description in fig. 4.
The hierarchical cache module 44 is configured to hierarchically store the cache according to the network access record. The hierarchical cache module 44 may analyze the network access requests in the network access records to determine cache key values, hierarchically store the caches of the network access records to different storage locations according to the different data ranges corresponding to the cache key values, and set cache IDs.
In one embodiment, the cache query module 42 may include: a first query submodule 421 and a second query submodule 422.
The first query submodule 421 is configured to query the nth level cache for a matching cache key value according to the query key value.
The second query submodule 422 is configured to query the (N+1)th-level cache for the matching cache key value when the first query submodule 421 does not find the matching cache key value in the Nth-level cache, where the data range of the (N+1)th-level cache is larger than that of the Nth-level cache, and N is greater than or equal to 1.
Taking a preset two-level hierarchy as an example, when the first query submodule 421 does not find a matching cache key value in the first-level cache, the second query submodule 422 queries the second-level cache for the matching cache key value. It should be noted that the present disclosure is only illustrated with two cache levels, but is not limited to this; the cache may also be divided into three levels, four levels, and so on, according to actual needs.
Fig. 6 is a schematic diagram illustrating a structure of a network access processing system according to an exemplary embodiment of the present disclosure.
Referring to fig. 6, a network access processing system according to an embodiment of the present disclosure includes: client device 61, server device 62.
A client device 61 for sending a network access request to the server device 62.
A server device 62 for obtaining a query key value of the network access request of the client device 61; querying in a hierarchical cache according to the query key value; and returning a query result matched with the query key value.
Wherein a more detailed structure and function of the server device 62 can be seen in the description of the network access processing means in fig. 4-5.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
FIG. 7 is a schematic diagram illustrating a computing device, according to an example embodiment of the present disclosure. The computing device may be, but is not limited to, a server device.
Referring to fig. 7, computing device 700 includes memory 710 and processor 720.
Processor 720 may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, discrete hardware components, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like.
The memory 710 may include various types of storage units, such as system memory, read-only memory (ROM), and permanent storage. The ROM may store static data or instructions required by the processor 720 or other modules of the computer. The permanent storage device may be a readable and writable storage device, and may be a non-volatile storage device that does not lose stored instructions and data even after the computer is powered off. In some embodiments, a mass storage device (e.g., a magnetic or optical disk, or flash memory) is used as the permanent storage device. In other embodiments, the permanent storage may be a removable storage device (e.g., a floppy disk or an optical drive). The system memory may be a readable and writable memory device or a volatile readable and writable memory device, such as dynamic random access memory; it may store instructions and data that some or all of the processors require at runtime. In addition, the memory 710 may include any combination of computer-readable storage media, including various types of semiconductor memory chips (DRAM, SRAM, SDRAM, flash memory, programmable read-only memory) and magnetic and/or optical disks. In some embodiments, the memory 710 may include a readable and/or writable removable storage device, such as a compact disc (CD), a read-only digital versatile disc (e.g., DVD-ROM, dual-layer DVD-ROM), a read-only Blu-ray disc, an ultra-density optical disc, a flash memory card (e.g., an SD card, a mini SD card, a Micro-SD card, etc.), a magnetic floppy disk, or the like. Computer-readable storage media do not contain carrier waves or transitory electronic signals transmitted by wireless or wired means.
The memory 710 has stored thereon executable code that, when processed by the processor 720, may cause the processor 720 to perform some or all of the methods described above.
The aspects of the present disclosure have been described in detail above with reference to the accompanying drawings. In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments. Those skilled in the art should also appreciate that the acts and modules referred to in the specification are not necessarily required by the disclosure. In addition, it can be understood that steps in the method of the embodiment of the present disclosure may be sequentially adjusted, combined, and deleted according to actual needs, and modules in the device of the embodiment of the present disclosure may be combined, divided, and deleted according to actual needs.
Furthermore, the method according to the present disclosure may also be implemented as a computer program or computer program product comprising computer program code instructions for performing some or all of the steps of the above-described method of the present disclosure.
Alternatively, the present disclosure may also be embodied as a non-transitory machine-readable storage medium (or computer-readable storage medium, or machine-readable storage medium) having stored thereon executable code (or a computer program, or computer instruction code) that, when executed by a processor of an electronic device (or computing device, server, or the like), causes the processor to perform some or all of the various steps of the above-described method according to the present disclosure.
Those of skill would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the disclosure herein may be implemented as electronic hardware, computer software, or combinations of both.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems and methods according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Having described embodiments of the present disclosure, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein is chosen in order to best explain the principles of the embodiments, the practical application, or improvements made to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (10)

1. A network access processing method, comprising:
obtaining a query key value of a network access request;
querying in a hierarchical cache according to the query key value;
and returning a query result matched with the query key value.
2. The method of claim 1, wherein querying in a hierarchical cache according to the query key value comprises:
querying an Nth-level cache for a matching cache key value according to the query key value;
and when the matching cache key value is not found in the Nth-level cache, querying an (N+1)th-level cache for the matching cache key value, wherein the data range of the (N+1)th-level cache is larger than that of the Nth-level cache, and N is greater than or equal to 1.
3. The method of claim 1, wherein obtaining the query key value of the network access request comprises:
and acquiring a query key value determined according to a query condition in the network access request.
4. The method of claim 1, further comprising:
and hierarchically storing the cache according to the network access record.
5. The method of claim 4, wherein the hierarchically storing the cache according to the network access record comprises:
analyzing the network access requests in the network access records to determine cache key values;
and hierarchically storing the caches of the network access records to different storage locations according to the different data ranges corresponding to the cache key values, and setting cache IDs.
6. The method of claim 5, wherein after the caches of the network access records are hierarchically stored to different storage locations according to the different data ranges corresponding to the cache key values and the cache IDs are set, the method further comprises:
deleting caches that have the same cache key value and a cache ID smaller than the maximum cache ID.
7. A network access processing apparatus, comprising:
the information acquisition module is used for acquiring a query key value of the network access request;
the cache query module is used for querying in the hierarchical cache according to the query key value acquired by the information acquisition module;
and the result processing module is used for returning the query result which is queried by the cache query module and is matched with the query key value.
8. The apparatus of claim 7, further comprising:
and the hierarchical cache module is used for hierarchically storing the cache according to the network access record.
9. A server device, comprising:
a processor; and
a memory having executable code stored thereon, which when executed by the processor, causes the processor to perform the method of any of claims 1-6.
10. A network access processing system, comprising:
the client device is used for sending a network access request to the server device;
the server device is used for acquiring a query key value of a network access request of the client device; querying in a hierarchical cache according to the query key value; and returning a query result matched with the query key value.
CN202010012423.0A (filed 2020-01-07, priority date 2020-01-07) Network access processing method, device, equipment and system. Status: Pending. Published as CN113158097A.

Priority Applications (1)

Application Number: CN202010012423.0A
Priority Date: 2020-01-07
Filing Date: 2020-01-07
Title: Network access processing method, device, equipment and system

Applications Claiming Priority (1)

Application Number: CN202010012423.0A
Priority Date: 2020-01-07
Filing Date: 2020-01-07
Title: Network access processing method, device, equipment and system

Publications (1)

Publication Number: CN113158097A
Publication Date: 2021-07-23

Family

ID=76881326

Family Applications (1)

Application Number: CN202010012423.0A
Title: Network access processing method, device, equipment and system
Priority Date: 2020-01-07
Filing Date: 2020-01-07

Country Status (1)

Country Link: CN (1) CN113158097A (en)


Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102479207A (en) * 2010-11-29 2012-05-30 阿里巴巴集团控股有限公司 Information search method, system and device
CN102541924A (en) * 2010-12-21 2012-07-04 中国移动通信集团公司 Retrieval information caching method and search engine system
CN102955786A (en) * 2011-08-22 2013-03-06 北大方正集团有限公司 Method and system for caching and distributing dynamic webpage data
CN102915380A (en) * 2012-11-19 2013-02-06 北京奇虎科技有限公司 Method and system for carrying out searching on data
CN103853727A (en) * 2012-11-29 2014-06-11 深圳中兴力维技术有限公司 Method and system for improving large data volume query performance
CN105447171A (en) * 2015-12-07 2016-03-30 北京奇虎科技有限公司 Data caching method and apparatus
CN105530127A (en) * 2015-12-10 2016-04-27 北京奇虎科技有限公司 Method for processing network access request by proxy server and proxy server
CN108132958A (en) * 2016-12-01 2018-06-08 阿里巴巴集团控股有限公司 A kind of multi-level buffer data storage, inquiry, scheduling and processing method and processing device
CN109299087A (en) * 2018-08-14 2019-02-01 中国平安财产保险股份有限公司 Data cache method, device, computer equipment and storage medium
CN109388656A (en) * 2018-09-04 2019-02-26 中国建设银行股份有限公司 Data processing method and system, device and storage medium based on multi-level buffer

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Liu Bing (刘冰), "Design and Implementation of an Oil Pipeline Inspection Management System Based on ArcGIS Server", Chemical Engineering & Equipment, no. 5, pages 126-127 *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination