CN108780458A - Page cache processing method, device, and server - Google Patents

Page cache processing method, device, and server Download PDF

Info

Publication number
CN108780458A
Authority
CN
China
Prior art keywords
access instruction
cache
page
content
server
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201780015720.7A
Other languages
Chinese (zh)
Inventor
Zhou Jianping (周建平)
Chen Yukang (陈于康)
Ma Haibo (马海波)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Shenzhen Dajiang Innovations Technology Co Ltd
Original Assignee
Shenzhen Dajiang Innovations Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Dajiang Innovations Technology Co Ltd filed Critical Shenzhen Dajiang Innovations Technology Co Ltd
Publication of CN108780458A publication Critical patent/CN108780458A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/46Multiprogramming arrangements
    • G06F9/54Interprogram communication
    • G06F9/544Buffers; Shared memory; Pipes

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

An embodiment of the present invention provides a page cache processing method, device, backend application server, and server. After receiving an access instruction sent by a user terminal, the method first checks whether the shared memory holds a cache matching the access instruction; if so, the cached content is returned to the user terminal. If not, it checks whether the cache server holds a cache matching the access instruction; if so, that cached content is returned to the user terminal. If neither holds a match, the content corresponding to the access instruction is retrieved from the backend application server and the retrieved data is returned to the user terminal. This improves the response speed of the system and increases its data throughput.

Description

Page cache processing method, device, and server
Technical field
This application relates to the technical field of computer networks, and more particularly to a page cache processing method, device, and server.
Background technology
In the prior art, when a user needs to access a web address, the user terminal sends an access instruction to the server corresponding to the website; when the website's server receives the instruction, it retrieves web data matching the access instruction from a backend application server and returns it to the user terminal.
For a server with a large access volume, each access request takes a certain processing time, so under heavy traffic and high concurrency the server's data throughput is low and a quick response to access instructions is difficult to achieve.
Summary of the invention
In view of this, the present invention provides a page cache processing method, device, and server.
To achieve the above object, the present invention provides the following technical solutions:
A page cache processing method, comprising:
obtaining an access instruction;
when a cache corresponding to the access instruction exists in shared memory, returning the cached content;
if no cache corresponding to the access instruction exists in the shared memory, determining whether a cache corresponding to the access instruction exists in a cache server;
when a cache corresponding to the access instruction exists in the cache server, returning the cached content;
if no cache corresponding to the access instruction exists in the cache server, retrieving the content matching the access instruction from a backend application server and returning it.
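The lookup order described by the steps above can be sketched in Python; the dict-backed tiers and the function name `handle_access` are illustrative assumptions, not part of the claimed method:

```python
def handle_access(key, shared_memory, cache_server, backend):
    """Resolve `key` by consulting shared memory first, then the
    cache server, then the backend application server."""
    if key in shared_memory:        # hit in shared memory: answer immediately
        return shared_memory[key]
    if key in cache_server:         # hit in the cache server
        return cache_server[key]
    return backend[key]             # fall back to the backend application server
```

For example, with an empty `shared_memory`, a `cache_server` holding a copy of the page, and the backend holding the original, the cache server's copy is returned without touching the backend.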
A page cache processing device, comprising:
an instruction acquisition unit, configured to obtain an access instruction;
a shared memory unit, configured to return cached content when a cache corresponding to the access instruction exists in shared memory;
a first judging unit, configured to determine, if no cache corresponding to the access instruction exists in the shared memory, whether a cache corresponding to the access instruction exists in a cache server;
a cache unit, configured to return cached content when a cache corresponding to the access instruction exists in the cache server;
a backend data retrieval unit, configured to retrieve the content matching the access instruction from a backend application server and return it if no cache corresponding to the access instruction exists in the cache server.
A server, equipped with the page cache processing device described in any one of the above.
As can be seen from the above technical solutions, compared with the prior art, the embodiments of the present invention provide a page cache processing method, device, and server. After receiving an access instruction sent by a user terminal, the method first checks whether the shared memory holds a cache matching the access instruction; if so, the cached content is returned to the user terminal. If not, it checks whether the cache server holds a matching cache; if so, that cached content is returned. If neither holds a match, the content corresponding to the access instruction is retrieved from the backend application server and the retrieved data is returned to the user terminal. In this way, when a user accesses a web page, the corresponding cached content is preferentially served from the shared memory and the cache server, and data corresponding to the access instruction is retrieved from the backend application server only when neither the shared memory nor the cache server holds it. This improves the response speed of the system and increases its data throughput.
Description of the drawings
In order to explain the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only embodiments of the present invention; for those of ordinary skill in the art, other drawings can be obtained from the provided drawings without creative effort.
Fig. 1 is a schematic flowchart of a page cache processing method disclosed in an embodiment of the present application;
Fig. 2 is a schematic flowchart of a page cache processing method provided by another embodiment of the present application;
Fig. 3 is a schematic structural diagram of a page cache processing device provided by an embodiment of the present application.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in the embodiments. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.
In order to improve the response speed of Web application frameworks under heavy traffic and high concurrency and to increase their throughput, the present application discloses a page cache processing method, device, and server.
As shown in Fig. 1, the specific operation flow of the page cache processing method disclosed in an embodiment of the present application may include:
Step S101: obtain an access instruction;
The access instruction is generated from an access request sent by a user terminal, which may be a computer, a mobile phone, or the like. When a user accesses a web page through the user terminal, an access request is sent to the server side of the page. The access request may include the URL (Uniform Resource Locator) being accessed and a country; in this method, a corresponding key can be generated from the URL and the country, and the generated key serves as the access instruction. The server side can then retrieve the target page matching the URL and the country according to this key.
Taking an e-commerce page as an example, visiting users may come from countries all over the world, and the common language and script of each country differ. Therefore, to allow users from every country to browse pages normally, the technical solution disclosed in the present application may map a single URL to multiple pages, each page corresponding to a different country, or to a combination of country and language. A unique key is generated from the URL and country of the access request, or from the URL, country, and language; this unique key determines the target page for the access request.
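A minimal sketch of such key generation, assuming a simple delimiter-joined string as the key format (the patent does not specify one):

```python
def make_cache_key(url, country_id, language=None):
    """Build a unique lookup key from the URL and country ID, optionally
    including a language, so one URL maps to per-country page variants."""
    parts = [url, country_id]
    if language is not None:
        parts.append(language)
    return "|".join(parts)
```

With this format, `make_cache_key("https://shop.example.com/item/1", "FR", "fr")` and the same URL with country `"US"` produce distinct keys, so each country/language variant of the page is cached separately.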
Step S102: determine whether a cache matching the access instruction is stored in shared memory; if so, execute step S103; otherwise, execute step S104;
The shared memory can be implemented using Web caching technology. A Web cache is a copy of a web resource (HTML page, data, etc.) kept between the server and the client. The cache can keep copies of output content based on incoming requests; when the next request arrives, its URL and country are obtained, and the two requests (the next request and the previous one) are compared to see whether they share the same URL. If the URL is the same, the cache can, according to its caching policy, either answer the access request directly with the copy or forward the request to the server again.
Based on Web caching, in the technical solution disclosed in the embodiments of the present application the shared memory stores copies of the pages corresponding to certain access instructions, and these page copies serve as the cached content of the shared memory; that is, in this application the copies cached in shared memory are complete page contents. When an access instruction sent by a user terminal is received, it is determined whether the shared memory holds a copy of the page targeted by the access instruction, i.e., whether a copy of the page corresponding to the URL and country of the access instruction exists in the shared memory. If it does, step S103 is executed: the copy of the page can be sent directly to the user terminal without any data exchange with the server.
In the technical solution disclosed in the embodiments of the present application, the cached content in the shared memory can be configured according to user needs. For example, it may include copies of pages accessed within a recent preset time period, copies of pages whose access frequency exceeds a set value, copies of preset specific pages, or copies of pages whose visit count in a recent period exceeds a preset value.
Step S103: extract the cache matching the access instruction from the shared memory and return the cache to the user terminal;
When a cache corresponding to the access instruction exists in shared memory, the corresponding cache is retrieved directly from the shared memory and returned to the user terminal as the response data for the access instruction.
Further, when the cached content in the shared memory has not been updated for a long period, or has become invalid for other reasons, the user may receive no response or erroneous content. In view of this, when cached content matching the access instruction exists in the shared memory, it is determined whether that cached content is still valid: if so, the cached content is returned to the user terminal; if the cached content has expired, it is treated as absent from the shared memory and step S104 is executed.
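One way to realize this validity check is to store an expiry timestamp alongside each copy; the `(content, expires_at)` tuple layout below is an assumption for illustration, not something the patent prescribes:

```python
import time

def get_if_valid(shared_memory, key, now=None):
    """Return the cached copy for `key` only if it has not expired;
    an expired or missing entry is treated as a miss (step S104)."""
    now = time.time() if now is None else now
    entry = shared_memory.get(key)
    if entry is None:
        return None
    content, expires_at = entry
    if now >= expires_at:
        del shared_memory[key]   # drop the stale copy so it is not served again
        return None
    return content
```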
Step S104: determine whether a cache corresponding to the access instruction exists in the cache server; if so, execute step S105; otherwise, execute step S106;
In this step, when no page copy corresponding to the access instruction exists in the shared memory, the cached content in the cache server is screened to determine whether a page copy matching the access instruction exists among the cache server's contents, and the corresponding subsequent step is executed according to the result.
In the technical solution disclosed in the embodiments of the present application, the cache server may be a Redis database. The cached content in the Redis database can be configured according to user needs; for example, it may include copies of pages accessed within a recent preset time period, copies of pages whose access frequency exceeds a set value, copies of preset specific pages, or copies of pages whose visit count in a recent period exceeds a preset value.
Step S105: extract the cache matching the access instruction from the cache server and return the cache to the user terminal;
Further, when the cached content in the cache server has not been updated for a long period, or has become invalid for other reasons, the user may receive no response or erroneous content. In view of this, when a cache corresponding to the access instruction exists in the cache server, it is determined whether that cache is still valid: if so, the cached content is returned to the user terminal; if not, it is treated as absent from the cache server and step S106 is executed.
Step S106: retrieve the content matching the access instruction from the backend application server and return it to the user terminal;
All page contents are stored in the backend application server. When no page copy matching the access instruction is found in either the shared memory or the cache server, the page corresponding to the access instruction can be retrieved directly from the database in the backend application server and returned to the user terminal; at the same time, the returned content can also be written into the cache server or the shared memory.
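The origin fetch with write-back can be sketched as follows; whether the result is written to the cache server, the shared memory, or both is left as a configuration choice in this sketch:

```python
def fetch_from_backend(key, backend, cache_server, shared_memory=None):
    """Retrieve the page from the backend application server and write it
    back into the cache tier(s) so later requests hit a cache instead."""
    content = backend[key]
    cache_server[key] = content
    if shared_memory is not None:     # optionally also warm the shared memory
        shared_memory[key] = content
    return content
```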
In the above scheme, after an access instruction sent by a user terminal is received, it is first checked whether the shared memory holds a cache matching the access instruction; if so, the cached content is returned to the user terminal. If not, it is checked whether the cache server holds a matching cache; if so, that cached content is returned. If neither holds a match, the corresponding content is retrieved from the backend application server and the retrieved data is returned to the user terminal. In this way, when accessing a page the user is preferentially served cached content from the shared memory and the cache server, and data corresponding to the access instruction is retrieved from the backend application server only when no corresponding cached content exists in either tier; this improves the response speed of the system and increases its data throughput.
In the technical solution disclosed in the above embodiments, to make it easier for overseas users to access pages and to improve their access efficiency, a shared memory and a cache server can be set up in advance for each country. The shared memory and cache server of a country accept only access instructions from users of that country: when a user of that country accesses a page, the access instruction is first sent to that shared memory; when the shared memory holds no corresponding cache, the access instruction is sent to the cache server; and when the cache server also holds no corresponding cache, the access instruction is then sent to the backend application server. Moreover, to achieve fast data transfer and improve access speed, dedicated network lines may be used between the shared memory, the cache server, and the backend application server; that is, they communicate over dedicated data transmission channels.
In the technical solution disclosed in the embodiments of the present application, the cached content in the shared memory and the cache server can be set according to different rules. That is, the server side caches data in layers: for example, static page data is stored in both the shared memory and the cache server. The shared memory is consulted first, then the cache server; if neither the shared memory nor the cache server hits for the access instruction, the request falls back to the origin, i.e., the backend application server.
Preferably, in the technical solution disclosed in the above embodiments, after the content matching the access instruction has been retrieved from the backend application server, in order to return the content directly to the user terminal the next time the same access instruction is received, the method may further include:
writing the content matching the access instruction retrieved from the backend application server into the cache server. In this case, the data content stored in the cache server is the content accessed within a recent preset time period.
Preferably, in the technical solution disclosed in the above embodiments, the cached content in the shared memory may be hot data, i.e., data content whose access frequency exceeds a set value. To keep the hot data up to date, in the method disclosed in the above embodiments, referring to Fig. 2, when the accessed content is returned to the user terminal from the cache server or the backend application server, the method may further include:
Step S201: count the access frequency of the content corresponding to the received access instruction;
Step S202: determine whether the access frequency exceeds the set value; if so, continue;
Step S203: store the accessed content into the shared memory.
When counting the access frequency of accessed content, the specific process may be: analyze the received access instructions and obtain the target page corresponding to each; specifically, the target page can be determined from the URL and country, or the URL, country, and language, included in the access instruction. Once the target page is determined, it is checked whether the target page exists in a preset access list: if so, its visit count is incremented by 1; if not, the target page is added to the preset access list and its visit count is incremented by 1. The access frequency of each target page is then calculated from the timestamps of the received access instructions; the access frequency of a target page is the access frequency of the accessed content. When the access frequency is judged to exceed the set value, after the content corresponding to the access instruction has been returned to the user terminal by the cache server or the backend application server, the content matching the access instruction can also be written into the shared memory. Since a user access instruction passes through the shared memory before being forwarded to the backend application server, in the embodiments of the present application the preset access list may be kept at the shared memory end.
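The counting and promotion logic just described can be sketched as follows; the plain-dict access list and the threshold semantics (strictly greater than the set value) are assumptions for illustration:

```python
def record_access(access_list, page_key):
    """Increment the visit count for `page_key`, adding it to the preset
    access list on first sight, and return the new count."""
    access_list[page_key] = access_list.get(page_key, 0) + 1
    return access_list[page_key]

def promote_if_hot(shared_memory, access_list, page_key, content, threshold):
    """Copy `content` into shared memory once the page's count exceeds
    the threshold; return True if it was promoted."""
    if access_list.get(page_key, 0) > threshold:
        shared_memory[page_key] = content
        return True
    return False
```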
Preferably, to allow the shared memory to evict non-hot data in time, the above method may further include: counting in real time the access frequency of each cached item in the shared memory, and when the access frequency of a certain cached content is detected to be below the set value, removing that cached content from the shared memory.
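Correspondingly, the eviction of cold entries might look like this sketch, assuming a separate frequency map keyed the same way as the shared memory:

```python
def evict_cold_entries(shared_memory, access_freq, min_freq):
    """Remove cached pages whose observed access frequency fell below
    `min_freq`, returning the keys that were evicted."""
    cold = [k for k in shared_memory if access_freq.get(k, 0) < min_freq]
    for k in cold:
        del shared_memory[k]
    return cold
```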
In actual use, the operation and maintenance personnel of the backend application server may update or adjust certain configured contents in a page; for example, they may need to launch new products or take certain old products offline. Alternatively, they may set up timed tasks in the background, for example to periodically show delivery reminders or reservation reminders on a page, so that the configured contents of pages stored in the backend application server are adjusted according to the timed tasks. To update the pages cached in the shared memory and the cache server in time, the above method may further include:
When the data content stored in the backend application server changes, the corresponding page contents change as well. If the contents in the shared memory and the cache server were not updated at that point, users would receive wrong data from them. Therefore, to provide a better user experience, the technical solution disclosed in the embodiments of the present application may also monitor the backend application database in real time and determine whether its content has been updated. If so, the pages corresponding to the updated content in the backend application database are computed from the preset mapping relations between the model and the pages, and the page contents are updated; it is then checked whether those pages are stored in the shared memory and/or the cache server, and if so, the pages stored in the shared memory and the cache server are updated according to the pages updated by the backend application server.
The above page-update approach is illustrated below using an e-commerce page:
A preset relationship model is stored in the backend application server; the relationship model records the mapping relations between commodities and pages and between commodities themselves. By recording the model relations, when the parameters of a certain commodity in a stored page are adjusted, the cached data of the associated pages that reference it need to be updated. At that point, the page contents of the pages corresponding to the commodity stored in the backend application server can be adjusted through the model relations. It is then checked whether the updated pages exist in the shared memory and/or the cache server; if so, the pages stored in the shared memory and the cache server are updated according to the pages updated by the backend application server.
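Under the assumption that the relationship model is a mapping from a commodity ID to the set of page keys that reference it (the patent describes the mapping but not its data layout), the propagation step might be sketched as:

```python
def propagate_update(product_id, model_relations, shared_memory, cache_server, render):
    """Re-render every page that references the changed commodity and
    overwrite any stale copies held in either cache tier."""
    updated = []
    for page_key in sorted(model_relations.get(product_id, ())):
        fresh = render(page_key)               # rebuild the page at the backend
        if page_key in shared_memory:
            shared_memory[page_key] = fresh
        if page_key in cache_server:
            cache_server[page_key] = fresh
        updated.append(page_key)
    return updated
```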
Corresponding to the above method, the present application also discloses a page cache processing device. Referring to Fig. 3, the device can be applied at the shared memory and cache server end. Specifically, the device may include:
an instruction acquisition unit 100, configured to obtain an access instruction;
a shared memory unit 200, configured to return cached content when a cache corresponding to the access instruction exists in shared memory;
a first judging unit 300, configured to determine, if no cache corresponding to the access instruction exists in the shared memory, whether a cache corresponding to the access instruction exists in the cache server;
a cache unit 400, configured to return cached content when a cache corresponding to the access instruction exists in the cache server;
a backend data retrieval unit 500, configured to send the user access instruction to the backend application server end if no cache corresponding to the access instruction exists in the cache server, so that the content matching the access instruction is retrieved from the backend application server and returned.
Preferably, after the content matching the access instruction has been retrieved from the backend application server and returned, the backend data retrieval unit 500 is further configured to:
write the content matching the access instruction retrieved from the backend application server into the cache server.
Preferably, after the content matching the access instruction has been retrieved from the backend application server and returned, the backend data retrieval unit is further configured to:
write the content matching the access instruction retrieved from the backend application server into the shared memory.
Preferably, the instruction acquisition unit 100 is specifically configured to:
parse the access request input by the user terminal to obtain the URL and country ID, or the URL, country ID, and language, of the requested access; generate a corresponding key from the URL and country ID, or from the URL, country ID, and language; and use the key as the access instruction.
Preferably, the above device further includes:
a hot page cache unit, configured to store pages meeting a set condition into the shared memory. Specifically, the hot page cache unit is configured to: count in real time the access frequency of the content corresponding to each received access instruction, and when the access frequency of certain content is judged to exceed the set value, store that content into the shared memory. When counting the access frequency of accessed content, the specific process may be: analyze the received access instructions and obtain the target page corresponding to each; specifically, the target page can be determined from the URL and country, or the URL, country ID, and language, included in the access instruction. Once the target page is determined, it is checked whether the target page exists in a preset access list: if so, its visit count is incremented by 1; if not, the target page is added to the preset access list and its visit count is incremented by 1. The access frequency of each target page is then calculated from the timestamps of the received access instructions; the access frequency of a target page is the access frequency of the accessed content. When the access frequency is judged to exceed the set value, after the content corresponding to the access instruction has been returned to the user terminal by the cache server or the backend application server, the content matching the access instruction can also be written into the shared memory.
The hot page cache unit can also be configured to count in real time the access frequency of each cached item in the shared memory, and when the access frequency of a certain cached content is detected to be below the set value, remove that cached content from the shared memory.
Preferably, corresponding to the above method, the shared memory unit is specifically configured to:
when cached content corresponding to the access instruction exists in shared memory, determine whether the cached content corresponding to the access instruction is valid; if so, return the cached content; if not, treat the shared memory as holding no cached content corresponding to the access instruction.
Preferably, corresponding to the above method, the cache unit is specifically configured to:
when a cache corresponding to the access instruction exists in the cache server, determine whether the cache corresponding to the access instruction is valid; if so, return the cached content; if not, treat the cache server as holding no cache corresponding to the access instruction.
Corresponding to the above method, the above device may further include:
an updating unit, configured to determine whether the content in the backend application database has been updated; if so, compute the pages corresponding to the updated content in the backend application database from the preset mapping relations between the model and the pages, and refresh the caches in the shared memory and the cache server with the pages corresponding to the updated content.
Corresponding to the above device, the present application also discloses a server equipped with the page cache processing device described in any of the above.
Finally, it should be noted that, herein, relational terms such as first and second are used merely to distinguish one entity or operation from another, and do not necessarily require or imply any actual relationship or order between these entities or operations. Moreover, the terms "include", "comprise", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device including a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or device that includes the element.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for the identical or similar parts the embodiments can be referred to one another.
The above description of the disclosed embodiments enables those skilled in the art to implement or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be implemented in other embodiments without departing from the spirit or scope of the present application. Therefore, the present application is not intended to be limited to the embodiments shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (23)

1. A page cache processing method, characterized by comprising:
obtaining an access instruction;
when a cache corresponding to the access instruction exists in a shared memory, returning the cached content;
if no cache corresponding to the access instruction exists in the shared memory, determining whether a cache corresponding to the access instruction exists in a cache server;
when a cache corresponding to the access instruction exists in the cache server, returning the cached content;
if no cache corresponding to the access instruction exists in the cache server, retrieving, via a back-end application server, the content matching the access instruction and returning it.
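The three-tier lookup of claim 1, together with the write-back of claims 2 and 3, can be sketched as follows. This is a minimal illustration, not the claimed implementation: the names `shared_memory`, `cache_server`, and `backend_fetch` are assumptions, standing in for an in-process shared-memory cache, a remote cache server, and the back-end application server respectively.

```python
def handle_access(access_instruction, shared_memory, cache_server, backend_fetch):
    """Tiered lookup: shared memory -> cache server -> back-end application server."""
    # Tier 1: shared memory (fastest, in-process).
    content = shared_memory.get(access_instruction)
    if content is not None:
        return content
    # Tier 2: dedicated cache server (e.g. a remote key-value store).
    content = cache_server.get(access_instruction)
    if content is not None:
        return content
    # Tier 3: retrieve from the back-end application server, then write the
    # result back into both cache tiers (the write-back of claims 2 and 3).
    content = backend_fetch(access_instruction)
    cache_server[access_instruction] = content
    shared_memory[access_instruction] = content
    return content
```

With plain dictionaries standing in for the two cache tiers, a miss at both tiers falls through to the back end and populates both caches on the way back.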
2. The page cache processing method according to claim 1, characterized in that, after the content matching the access instruction is retrieved and returned via the back-end application server, the method further comprises:
writing the content matching the access instruction retrieved by the back-end application server into the cache server.
3. The page cache processing method according to claim 1, characterized in that, after the content matching the access instruction is retrieved and returned via the back-end application server, the method further comprises:
writing the content matching the access instruction retrieved by the back-end application server into the shared memory.
4. The page cache processing method according to claim 1, characterized in that obtaining the access instruction comprises:
obtaining the accessed URL and country ID, generating a corresponding key from the URL and country ID, and using the key as the access instruction.
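One way to derive such a key is sketched below. The MD5 digest is an illustrative choice only; claim 4 requires a key generated from the URL and country ID but does not prescribe a particular derivation.

```python
import hashlib

def make_access_instruction(url: str, country_id: str) -> str:
    """Derive a cache key from the accessed URL and the country ID (claim 4).

    The country ID is folded into the key so that the same URL accessed
    from different countries maps to different cache entries.
    """
    raw = f"{country_id}:{url}"
    return hashlib.md5(raw.encode("utf-8")).hexdigest()
```

The same URL yields different keys per country, which is what lets each country's cache server and shared memory hold its own copy of a page (claim 9).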
5. The page cache processing method according to claim 1, characterized in that pages whose access frequency satisfies a set condition are stored in the shared memory.
6. The page cache processing method according to claim 5, characterized in that the set condition is: the access frequency exceeds a preset value.
7. The page cache processing method according to claim 1, characterized in that returning the cached content when a cache corresponding to the access instruction exists in the shared memory specifically comprises:
when a cache corresponding to the access instruction exists in the shared memory, determining whether the cache corresponding to the access instruction is valid; if so, returning the cached content; if not, determining that no cache corresponding to the access instruction exists in the shared memory.
8. The page cache processing method according to claim 1, characterized in that returning the cached content when a cache corresponding to the access instruction exists in the cache server specifically comprises:
when a cache corresponding to the access instruction exists in the cache server, determining whether the cache corresponding to the access instruction is valid; if so, returning the cached content; if not, determining that no cache corresponding to the access instruction exists in the cache server.
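The validity check of claims 7 and 8 can be illustrated with a time-to-live scheme: an expired entry is treated exactly as if no cache existed, which triggers the fall-through to the next tier. The TTL mechanism is an assumption for illustration; the claims only require some validity test.

```python
import time

class TimedCache:
    """Cache whose get() returns an entry only while it is still valid
    (claims 7 and 8); an invalid entry behaves as a cache miss."""

    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self.store = {}  # key -> (content, expiry timestamp)

    def put(self, key, content, now=None):
        now = time.time() if now is None else now
        self.store[key] = (content, now + self.ttl)

    def get(self, key, now=None):
        now = time.time() if now is None else now
        entry = self.store.get(key)
        if entry is None:
            return None
        content, expiry = entry
        if now >= expiry:
            # Invalid: drop the entry and report "no cache", so the
            # caller falls through to the next tier.
            del self.store[key]
            return None
        return content
```

Passing `now` explicitly makes the expiry behaviour deterministic for testing; in production the wall clock would be used.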
9. The page cache method according to claim 1, characterized in that each cache server and shared memory corresponds to one country, that is, each cache server and shared memory is used to respond to access instructions from users in the corresponding country.
10. The page cache method according to claim 1, characterized in that each cache server and shared memory exchanges data with the back-end application server over a dedicated line.
11. The page cache method according to claim 1, characterized by further comprising:
determining whether content in the back-end application database has been updated; if so, calculating, according to a preset mapping relationship between models and pages, the page corresponding to the updated content in the back-end application database, and refreshing the caches in the shared memory and the cache server according to the updated content and the page corresponding to the updated content.
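The refresh step of claim 11 can be sketched as follows: walk the preset model-to-page mapping to find pages affected by a database update, re-render them, and overwrite both cache tiers. All names (`model_to_pages`, `render_page`, and the dictionary-backed tiers) are illustrative assumptions.

```python
def refresh_on_update(updated_models, model_to_pages,
                      shared_memory, cache_server, render_page):
    """Refresh both cache tiers for pages affected by a back-end
    database update, using the preset model->page mapping (claim 11)."""
    for model in updated_models:
        for page_key in model_to_pages.get(model, ()):
            content = render_page(page_key)    # re-render from updated content
            shared_memory[page_key] = content  # refresh shared memory
            cache_server[page_key] = content   # refresh cache server
```

Because the caches are overwritten rather than merely invalidated, the next access at either tier is a hit on fresh content.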
12. A page cache processing apparatus, characterized by comprising:
an instruction obtaining unit, configured to obtain an access instruction;
a shared memory unit, configured to return the cached content when a cache corresponding to the access instruction exists in the shared memory;
a first judging unit, configured to determine whether a cache corresponding to the access instruction exists in a cache server if no cache corresponding to the access instruction exists in the shared memory;
a cache unit, configured to return the cached content when a cache corresponding to the access instruction exists in the cache server;
a back-end data retrieval unit, configured to retrieve, via a back-end application server, the content matching the access instruction and return it if no cache corresponding to the access instruction exists in the cache server.
13. The page cache processing apparatus according to claim 12, characterized in that, after retrieving and returning the content matching the access instruction via the back-end application server, the back-end data retrieval unit is further configured to:
write the content matching the access instruction retrieved by the back-end application server into the cache server.
14. The page cache processing apparatus according to claim 12, characterized in that, after retrieving and returning the content matching the access instruction via the back-end application server, the back-end data retrieval unit is further configured to:
write the content matching the access instruction retrieved by the back-end application server into the shared memory.
15. The page cache processing apparatus according to claim 12, characterized in that the instruction obtaining unit is specifically configured to:
obtain the accessed URL and country ID, generate a corresponding key from the URL and country ID, and use the key as the access instruction.
16. The page cache processing apparatus according to claim 15, characterized by further comprising:
a hot page caching unit, configured to store pages satisfying a set condition into the shared memory.
17. The page cache processing apparatus according to claim 16, characterized in that the hot page caching unit is specifically configured to:
store pages whose access frequency exceeds a preset value into the shared memory.
18. The page cache processing apparatus according to claim 12, characterized in that the shared memory unit is specifically configured to:
when a cache corresponding to the access instruction exists in the shared memory, determine whether the cache corresponding to the access instruction is valid; if so, return the cached content; if not, determine that no cache corresponding to the access instruction exists in the shared memory.
19. The page cache processing apparatus according to claim 12, characterized in that the cache unit is specifically configured to:
when a cache corresponding to the access instruction exists in the cache server, determine whether the cache corresponding to the access instruction is valid; if so, return the cached content; if not, determine that no cache corresponding to the access instruction exists in the cache server.
20. The page cache processing apparatus according to claim 12, characterized in that there are a plurality of shared memory units and cache units, and each shared memory unit and cache unit corresponds to one country, that is, each shared memory unit and cache unit is used to respond to access instructions from users in the corresponding country.
21. The page cache apparatus according to claim 12, characterized in that each shared memory unit and cache unit exchanges data with the back-end data retrieval unit over a dedicated line.
22. The page cache apparatus according to claim 12, characterized by further comprising:
an updating unit, configured to determine whether content in the back-end application database has been updated, and if so, to calculate, according to a preset mapping relationship between models and pages, the page corresponding to the updated content in the back-end application database, and to refresh the caches in the shared memory and the cache server according to the updated content and the page corresponding to the updated content.
23. A server, characterized by comprising the page cache processing apparatus according to any one of claims 12 to 22.
CN201780015720.7A 2017-12-08 2017-12-08 A kind of page cache processing method, device and server Pending CN108780458A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/115189 WO2019109326A1 (en) 2017-12-08 2017-12-08 Page cache processing method and device, and server

Publications (1)

Publication Number Publication Date
CN108780458A true CN108780458A (en) 2018-11-09

Family

ID=64034061

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780015720.7A Pending CN108780458A (en) 2017-12-08 2017-12-08 A kind of page cache processing method, device and server

Country Status (2)

Country Link
CN (1) CN108780458A (en)
WO (1) WO2019109326A1 (en)


Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6877025B2 (en) * 2000-12-18 2005-04-05 International Business Machines Corp. Integrated JSP and command cache for web applications with dynamic content
CN101154230B (en) * 2006-09-30 2010-08-18 中兴通讯股份有限公司 Responding method for large data volume specified searching web pages
CN102368258B (en) * 2011-09-30 2014-11-26 广州市动景计算机科技有限公司 Webpage page caching management method and system
CN106210022A (en) * 2016-06-29 2016-12-07 天涯社区网络科技股份有限公司 A kind of system and method for processing forum's height concurrent data requests

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102117309A (en) * 2010-01-06 2011-07-06 卓望数码技术(深圳)有限公司 Data caching system and data query method
CN107305576A (en) * 2016-04-25 2017-10-31 北京京东尚科信息技术有限公司 The pseudo- static treatment method and apparatus of the page

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109933585A (en) * 2019-02-22 2019-06-25 京东数字科技控股有限公司 Data query method and data query system
CN109933585B (en) * 2019-02-22 2021-11-02 京东数字科技控股有限公司 Data query method and data query system
CN111291079A (en) * 2020-02-20 2020-06-16 京东数字科技控股有限公司 Data query method and device
WO2021164487A1 (en) * 2020-02-20 2021-08-26 京东数字科技控股股份有限公司 Data query method and apparatus
CN112035479A (en) * 2020-08-31 2020-12-04 平安医疗健康管理股份有限公司 Medicine database access method and device and computer equipment
CN114281434A (en) * 2021-12-15 2022-04-05 创优数字科技(广东)有限公司 Applet user information management method, device, computer equipment and storage medium
CN114281434B (en) * 2021-12-15 2022-11-29 创优数字科技(广东)有限公司 Applet user information management method, apparatus, computer device and storage medium
CN117615013A (en) * 2024-01-19 2024-02-27 杭州优云科技股份有限公司 File searching method, device, equipment and readable storage medium
CN117615013B (en) * 2024-01-19 2024-04-19 杭州优云科技股份有限公司 File searching method, device, equipment and readable storage medium

Also Published As

Publication number Publication date
WO2019109326A1 (en) 2019-06-13


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20181109
