CN105512251A - Page cache method and device - Google Patents

Page cache method and device

Info

Publication number
CN105512251A
Authority
CN
China
Prior art keywords
page
currently accessed
cache area
priority value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510867028.XA
Other languages
Chinese (zh)
Other versions
CN105512251B (en)
Inventor
谢海洋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201510867028.XA priority Critical patent/CN105512251B/en
Publication of CN105512251A publication Critical patent/CN105512251A/en
Application granted granted Critical
Publication of CN105512251B publication Critical patent/CN105512251B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/957Browsing optimisation, e.g. caching or content distillation
    • G06F16/9574Browsing optimisation, e.g. caching or content distillation of access to content, e.g. by caching

Landscapes

  • Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Memory System Of A Hierarchy Structure (AREA)

Abstract

The invention discloses a page cache method and device. The method comprises the following steps: obtaining historical access status data of the page currently accessed on a terminal; calculating a priority value of the currently accessed page according to its historical access status data; and determining the pages cached in a cache area according to the priority value of the currently accessed page and the free cache resources of the cache area. The method caches pages according to the habits of different users and makes effective use of limited cache resources, thereby improving the average page loading speed and the page switching speed and improving the user experience.

Description

Page cache method and device
Technical field
Embodiments of the present invention relate to the field of communication technology, and in particular to a page cache method and device.
Background technology
On current mobile terminals, applications are increasingly diverse, their page layouts are increasingly complex, and the number of pages a user keeps open keeps growing. As a result, application performance degrades: memory usage increases and page switching becomes slower and slower.
In the prior art, the problem of slow page switching is addressed mainly in two ways. The first is to push pages directly onto the system page stack, leaving the page cache to be maintained by the operating system of the mobile terminal; when system resources run low, the operating system may destroy some pages to release resources. With this approach, however, the time at which a cached page is destroyed is unpredictable, and when the user needs to switch to a certain page, that page may already have been destroyed by the operating system. The second is to cache pages with a First In First Out (FIFO) algorithm; because cache resources are limited and complex pages take a long time to render, this approach can severely affect page switching speed.
Summary of the invention
The present invention provides a page cache method and device, so as to make full use of cache resources and improve page switching speed.
In a first aspect, an embodiment of the present invention provides a page cache method, comprising:
obtaining the historical access status data of the page currently accessed on a terminal;
calculating a priority value of the currently accessed page according to the historical access status data of the currently accessed page; and
determining the pages cached in a cache area according to the priority value of the currently accessed page and the free cache resources of the cache area.
In a second aspect, an embodiment of the present invention further provides a page cache device, comprising:
a historical access status data acquisition module, configured to obtain the historical access status data of the page currently accessed on a terminal;
a computing module, configured to calculate the priority value of the currently accessed page according to the historical access status data of the currently accessed page; and
a cached page determination module, configured to determine the pages cached in a cache area according to the priority value of the currently accessed page and the free cache resources of the cache area.
The present invention obtains the historical access status data of the page currently accessed on a terminal, calculates the priority value of the currently accessed page from those data, and determines the pages cached in the cache area according to that priority value and the free cache resources of the cache area. Because the historical access status data of the currently accessed page reflect the user's page-opening habits, the above method can cache pages according to the habits of different users, make efficient use of the limited cache resources, improve the average page loading speed and the page switching speed, and improve the user experience.
Brief description of the drawings
Fig. 1 is a flowchart of a page cache method provided by Embodiment 1 of the present invention;
Fig. 2 is a flowchart of a page cache method provided by Embodiment 2 of the present invention;
Fig. 3 is a schematic diagram of the mechanism of a page cache method provided by Embodiment 3 of the present invention;
Fig. 4 is a schematic structural diagram of a page cache device provided by Embodiment 4 of the present invention.
Detailed description of the embodiments
The present invention is described in further detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here serve only to explain the present invention and do not limit it. It should also be noted that, for ease of description, the drawings show only the parts related to the present invention rather than the entire structure.
Before the exemplary embodiments are discussed in more detail, it should be mentioned that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart describes the operations (or steps) as a sequential process, many of the operations can be performed in parallel, concurrently, or simultaneously. In addition, the order of the operations can be rearranged. A process may be terminated when its operations are completed, but may also have additional steps not included in the drawings. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, and so on.
It should also be mentioned that, in some alternative implementations, the functions/actions mentioned may occur in an order different from that indicated in the drawings. For example, depending on the functions/actions involved, figures shown in succession may in fact be executed substantially simultaneously, or sometimes in the reverse order.
Embodiment 1
Fig. 1 is a flowchart of a page cache method provided by Embodiment 1 of the present invention. This embodiment is applicable to caching the pages of an application on a mobile terminal. The method can be performed by a page cache device, which can be implemented in hardware and/or software, and specifically comprises the following operations:
S110. Obtain the historical access status data of the page currently accessed on the terminal.
The historical access status data of the currently accessed page reflect the user's page access habits as well as the differences between the habits of different users. Applications on today's terminals provide more and more functions, yet a given user may use only a few of them. In a map application, for example, a user who travels frequently on business may use the hotel and route pages more, a car owner may use the navigation and route pages more, and some users may often open the public transport page. By obtaining the historical access status data of the page currently accessed on the terminal, this operation can learn the behavioral habits and needs of different users.
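For illustration only, the following is a minimal Kotlin sketch of the bookkeeping behind operation S110. The names (AccessStats, AccessHistoryRecorder, recordAccess) and the structure are assumptions, not something the patent prescribes; the patent only requires that the per-page access frequency and average render time be recorded locally on the terminal.

```kotlin
// Hypothetical sketch of operation S110; names and structure are assumptions.
data class AccessStats(
    var accessCount: Int = 0,          // An: how many times the page was opened
    var avgRenderTimeMs: Double = 0.0  // Tn: average render time of the page, in ms
)

class AccessHistoryRecorder {
    private val statsByPage = mutableMapOf<String, AccessStats>()

    /** Record one access to [pageId] that took [renderTimeMs] ms to render. */
    fun recordAccess(pageId: String, renderTimeMs: Long) {
        val s = statsByPage.getOrPut(pageId) { AccessStats() }
        // Incrementally update the running average render time, then the count.
        s.avgRenderTimeMs =
            (s.avgRenderTimeMs * s.accessCount + renderTimeMs) / (s.accessCount + 1)
        s.accessCount += 1
    }

    /** Historical access status data of the currently accessed page. */
    fun statsFor(pageId: String): AccessStats =
        statsByPage.getOrPut(pageId) { AccessStats() }
}
```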
S120. Calculate the priority value of the currently accessed page according to its historical access status data.
The priority value of the currently accessed page is calculated from the historical access status data obtained in operation S110; for example, a page that the user opens often but that loads slowly can be given a higher priority value.
S130. Determine the pages cached in the cache area according to the priority value of the currently accessed page and the free cache resources of the cache area.
The priority value reflects the caching priority of a page; for example, pages with higher priority values can be cached in the cache area preferentially.
In this embodiment of the present invention, the priority value of the currently accessed page is calculated from its historical access status data, and the pages cached in the cache area are determined according to that priority value and the free cache resources of the cache area. Because the historical access status data of the currently accessed page reflect the user's page-opening habits, the method can cache pages according to the habits of different users and make efficient use of the limited cache resources. Since the pages cached in the cache area are determined by priority values calculated from historical access status data, the user can load a cached page directly when switching pages, which improves the average page loading speed and the page switching speed and improves the user experience. In addition, the method runs entirely on the local terminal: it needs no support from a server side and no complex interaction, so the process is simple.
On the basis of the above embodiment, optionally, a cached page comprises: a user interface and/or page data.
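As an illustration of what a cache entry might hold, here is a hypothetical Kotlin type; the patent only states that a cached page comprises a user interface and/or page data, so the field names and the resource-size field are assumptions added for the later sketches.

```kotlin
// Hypothetical cache entry; only "user interface and/or page data" comes from the patent.
data class CachedPage(
    val pageId: String,
    val userInterface: Any? = null,   // e.g. a retained view hierarchy (assumed)
    val pageData: Any? = null,        // e.g. parsed model data for the page (assumed)
    val resourceSize: Long = 0,       // cache resources the page occupies, e.g. bytes (assumed)
    var priorityValue: Double = 0.0   // Qn = An * Tn, updated on each access
)
```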
On the basis of the above embodiment, optionally, the historical access status data of the currently accessed page comprise: the access frequency and the average render time of the currently accessed page during historical accesses on the terminal.
The function pages within an application on a terminal differ greatly from one another: some pages carry a lot of data but have a simple user interface, some have a complicated user interface but little data, and some have both a large user interface and a large amount of data. These differences mean that some pages load much more slowly than the application's average response speed while others respond very quickly; in other words, each page has a different average render time. This embodiment therefore obtains both the access frequency and the average render time of the currently accessed page during historical accesses on the terminal, so as to reflect the user's page access habits.
Specifically, the access frequency and the corresponding average render time of the currently accessed page can be taken over a historical window of one day or one month on the terminal. For example, the access frequency of the currently accessed page during historical accesses may be its access frequency on the previous day, together with its average render time on the terminal on the previous day.
It should be noted that the historical access status data of the currently accessed page may alternatively be the access frequency and average render time of the currently accessed page including the current access on the terminal. For example, the access frequency including the current access is the previous day's access frequency plus 1, and the average render time is the mean of the render times of the previous day's accesses and the current access; that is, the statistics of access frequency and average render time include this current access.
Preferably, calculating the priority value of the currently accessed page according to its historical access status data comprises: taking the product of the access frequency and the average render time of the currently accessed page during historical accesses on the terminal as the priority value of the currently accessed page. The larger the access frequency and the average render time, the larger the priority value, and the more preferentially the page is cached in the cache area. This preferred approach calculates the priority value from the user's habits: the access frequency and average render time of the currently accessed page are recorded during historical accesses, the priority value is calculated by jointly analyzing the user's page-opening habits and the page's average render time and is recorded locally, and caching is then performed according to that priority value. Pages that are time-consuming to render and that the user opens frequently are cached preferentially, which effectively improves the cost-effectiveness of caching, makes full use of the cache space, and maximizes the access fluency of each application on the terminal. It should be noted that the product of the access frequency and the average render time including the current access may also be used as the priority value of the currently accessed page.
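A one-line sketch of this preferred calculation, reusing the hypothetical AccessStats type from the earlier sketch: the priority value is simply the product of access frequency and average render time.

```kotlin
// Priority value Qn = An * Tn (access frequency times average render time).
fun priorityOf(stats: AccessStats): Double =
    stats.accessCount * stats.avgRenderTimeMs

// Example usage with the hypothetical recorder:
// val q = priorityOf(recorder.statsFor("map/hotel"))
```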
Embodiment 2
Fig. 2 is a flowchart of a page cache method provided by Embodiment 2 of the present invention. As shown in Fig. 2, the method comprises:
S210. Obtain the historical access status data of the page currently accessed on the terminal.
S220. Calculate the priority value of the currently accessed page according to its historical access status data.
S230. When the free cache resources of the cache area are less than the resources required by the currently accessed page, compare the priority value of the currently accessed page with the priority values of the pages cached in the cache area; according to the comparison result, release at least one cached page in the cache area whose priority value is less than that of the currently accessed page, and add the resources of the currently accessed page to the cache area.
When the free cache resources of the cache area are less than the resources required by the currently accessed page, the cache area is approaching saturation. The priority value of the currently accessed page is therefore compared with the priority values of the cached pages; according to the comparison result, at least one cached page whose priority value is less than that of the currently accessed page is released, and the resources of the currently accessed page are added to the cache area.
Specifically, when the priority value of the currently accessed page is greater than the priority value of at least one cached page in the cache area, the caching priority of those cached pages is lower than that of the currently accessed page, so at least one cached page whose priority value is less than that of the currently accessed page is released. This embodiment of the present invention does not limit the number of cached pages released; as long as the resources of the released cached pages are greater than or equal to the resources required by the currently accessed page, the resources of the currently accessed page are then added to the cache area.
In this embodiment of the present invention, the pages cached in the cache area are determined from the calculated priority value of the currently accessed page: pages with higher priority values are cached preferentially, and when system cache resources are insufficient, pages with lower priority values are released first. This makes efficient use of the limited cache resources, improves the average page loading speed and the page switching speed, and improves the user experience.
It should be noted that when the free cache resources of the cache area are greater than or equal to the resources required by the currently accessed page, the currently accessed page is cached directly into the cache area.
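The caching decision of S230, together with the direct-caching case just noted, could look like the following sketch. It builds on the hypothetical CachedPage type above; the class name, the byte-based accounting, and the "give up if not enough lower-priority pages can be released" behavior are assumptions, not the patent's prescribed implementation.

```kotlin
// Hypothetical sketch of the caching decision (S230 plus the direct-caching case).
class PageCache(private val capacityBytes: Long) {
    private val entries = mutableMapOf<String, CachedPage>()
    private val usedBytes get() = entries.values.sumOf { it.resourceSize }
    private val freeBytes get() = capacityBytes - usedBytes

    /** Try to cache [page]; returns true if it ends up in the cache area. */
    fun offer(page: CachedPage): Boolean {
        if (freeBytes >= page.resourceSize) {   // enough free cache resources:
            entries[page.pageId] = page         // cache the page directly
            return true
        }
        // Not enough room: consider only cached pages with a lower priority value,
        // lowest priority first.
        val victims = entries.values
            .filter { it.priorityValue < page.priorityValue }
            .sortedBy { it.priorityValue }
        var reclaimable = freeBytes
        val toEvict = mutableListOf<CachedPage>()
        for (v in victims) {
            if (reclaimable >= page.resourceSize) break
            toEvict += v
            reclaimable += v.resourceSize
        }
        if (reclaimable < page.resourceSize) return false  // cannot make room; keep cache as is
        toEvict.forEach { entries.remove(it.pageId) }      // release lower-priority pages
        entries[page.pageId] = page                        // add the currently accessed page
        return true
    }

    /** Query whether a page is cached (used when switching pages). */
    fun lookup(pageId: String): CachedPage? = entries[pageId]
}
```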
On the basis of the above embodiment, optionally, after the pages cached in the cache area are determined according to the priority value of the currently accessed page and the free cache resources of the cache area, the method further comprises: updating the historical access status data and the priority value corresponding to each cached page in the cache area.
The historical access status data and the priority value corresponding to each cached page are recorded locally, so that the priority values of the cached pages can be compared conveniently on subsequent page accesses.
On the basis of the above embodiment, optionally, the method further comprises: when a page switching instruction is obtained, querying whether the target page corresponding to the page switching instruction is cached in the cache area, and if so, loading the target page cached in the cache area.
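A sketch of this optional page-switching path, under the same assumptions as above; showCachedPage and renderPageFromScratch are placeholder names standing in for whatever the application does on a hit or a miss.

```kotlin
// Placeholder actions; a real application would attach the cached UI or rebuild the page.
fun showCachedPage(page: CachedPage) = println("loaded ${page.pageId} from cache")
fun renderPageFromScratch(pageId: String) = println("rendering $pageId from scratch")

// Hypothetical page-switching path: query the cache area first, fall back to a full render.
fun onPageSwitch(cache: PageCache, targetPageId: String) {
    val hit = cache.lookup(targetPageId)
    if (hit != null) showCachedPage(hit) else renderPageFromScratch(targetPageId)
}
```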
Embodiment 3
Fig. 3 is a schematic diagram of the mechanism of a page cache method provided by Embodiment 3 of the present invention. As shown in Fig. 3, when the user accesses pages on the terminal, the historical access status data of each page are recorded, for example the access frequency An of each page and its average render time Tn, and are stored locally. The priority value of each page is then calculated as Qn = An * Tn, where n is a positive integer. The cache policy is then applied: pages with high priority values are cached preferentially, and when the cache resources of the cache area are insufficient, pages with low priority values are released first. When the user switches pages, the cache area is queried; on a hit, the corresponding cached page is accessed directly.
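As a purely illustrative, end-to-end use of the sketches above, with made-up figures: if the hotel page of a map application was opened 12 times with an average render time of 800 ms, its priority value is Q = 12 × 800 = 9600, whereas a settings page opened 3 times at 200 ms would score only 600; under this policy the hotel page is cached preferentially and the settings page is released first when cache resources run short.

```kotlin
// Hypothetical end-to-end usage of the sketches above; all figures are made up.
fun main() {
    val recorder = AccessHistoryRecorder()
    val cache = PageCache(capacityBytes = 4L * 1024 * 1024)

    // Suppose the hotel page was opened 12 times, each render taking about 800 ms.
    repeat(12) { recorder.recordAccess("map/hotel", renderTimeMs = 800L) }
    val stats = recorder.statsFor("map/hotel")

    val hotelPage = CachedPage(
        pageId = "map/hotel",
        resourceSize = 512L * 1024,
        priorityValue = priorityOf(stats)   // Q = A * T = 12 * 800 = 9600
    )
    cache.offer(hotelPage)                  // cached, evicting lower-priority pages if needed

    // Switching back to the hotel page now hits the cache and loads it directly.
    onPageSwitch(cache, "map/hotel")
}
```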
In this embodiment of the present invention, the pages cached in the cache area are determined from the calculated priority value of the currently accessed page: pages with higher priority values are cached preferentially, and when system cache resources are insufficient, pages with lower priority values are released first. This makes efficient use of the limited cache resources, improves the average page loading speed and the page switching speed, and improves the user experience.
Embodiment 4
Fig. 4 is a schematic structural diagram of a page cache device provided by Embodiment 4 of the present invention. As shown in Fig. 4, the device comprises:
a historical access status data acquisition module 41, configured to obtain the historical access status data of the page currently accessed on a terminal;
a computing module 42, configured to calculate the priority value of the currently accessed page according to the historical access status data of the currently accessed page; and
a cached page determination module 43, configured to determine the pages cached in a cache area according to the priority value of the currently accessed page and the free cache resources of the cache area.
This embodiment of the present invention obtains the historical access status data of the page currently accessed on a terminal, calculates the priority value of the currently accessed page from those data, and determines the pages cached in the cache area according to that priority value and the free cache resources of the cache area. Because the historical access status data of the currently accessed page reflect the user's page-opening habits, the device can cache pages according to the habits of different users, make efficient use of the limited cache resources, improve the average page loading speed and the page switching speed, and improve the user experience.
On the basis of the above embodiment, optionally, the historical access status data of the currently accessed page comprise: the access frequency and the average render time of the currently accessed page during historical accesses on the terminal.
Optionally, the historical access status data of the currently accessed page may also comprise: the access frequency and the average render time of the currently accessed page including the current access on the terminal.
Further, the computing module is specifically configured to: take the product of the access frequency and the average render time of the currently accessed page during historical accesses on the terminal as the priority value of the currently accessed page;
or take the product of the access frequency and the average render time of the currently accessed page including the current access on the terminal as the priority value of the currently accessed page.
Further, the cached page determination module comprises: a priority value comparing unit, configured to compare, when the free cache resources of the cache area are less than the resources required by the currently accessed page, the priority value of the currently accessed page with the priority values of the pages cached in the cache area; and
a cached page determining unit, configured to release, according to the comparison result, at least one cached page in the cache area whose priority value is less than that of the currently accessed page, and to add the resources of the currently accessed page to the cache area.
It should be noted that when the free cache resources of the cache area are greater than or equal to the resources required by the currently accessed page, the currently accessed page is cached directly into the cache area.
On the basis of the above embodiment, optionally, the device further comprises: an update module, configured to update, after the pages cached in the cache area are determined according to the priority value of the currently accessed page and the free cache resources of the cache area, the historical access status data and the priority value corresponding to each cached page in the cache area.
Further, the device further comprises: a loading module, configured to query, when a page switching instruction is obtained, whether the target page corresponding to the page switching instruction is cached in the cache area, and if so, to load the target page cached in the cache area.
Preferably, a cached page comprises: a user interface and/or page data.
The above device can perform the method provided by any embodiment of the present invention and has the functional modules and beneficial effects corresponding to the performed method.
Note that the above are only preferred embodiments of the present invention and the technical principles applied. Those skilled in the art will appreciate that the present invention is not limited to the specific embodiments described here; various obvious changes, readjustments, and substitutions can be made without departing from the protection scope of the present invention. Therefore, although the present invention has been described in further detail through the above embodiments, it is not limited to the above embodiments and may include other equivalent embodiments without departing from the concept of the present invention; the scope of the present invention is determined by the appended claims.

Claims (15)

1. A page cache method, characterized by comprising:
obtaining the historical access status data of the page currently accessed on a terminal;
calculating a priority value of the currently accessed page according to the historical access status data of the currently accessed page; and
determining the pages cached in a cache area according to the priority value of the currently accessed page and the free cache resources of the cache area.
2. The page cache method according to claim 1, characterized in that the historical access status data of the currently accessed page comprise:
the access frequency and the average render time of the currently accessed page during historical accesses on the terminal.
3. The page cache method according to claim 1, characterized in that the historical access status data of the currently accessed page comprise: the access frequency and the average render time of the currently accessed page including the current access on the terminal.
4. The page cache method according to claim 2, characterized in that calculating the priority value of the currently accessed page according to the historical access status data of the currently accessed page comprises:
taking the product of the access frequency and the average render time of the currently accessed page during historical accesses on the terminal as the priority value of the currently accessed page.
5. The page cache method according to claim 1, characterized in that determining the pages cached in the cache area according to the priority value of the currently accessed page and the free cache resources of the cache area comprises:
when the free cache resources of the cache area are less than the resources required by the currently accessed page, comparing the priority value of the currently accessed page with the priority values of the pages cached in the cache area; and
according to the comparison result, releasing at least one cached page in the cache area whose priority value is less than that of the currently accessed page, and adding the resources of the currently accessed page to the cache area.
6. The page cache method according to claim 1, characterized in that, after determining the pages cached in the cache area according to the priority value of the currently accessed page and the free cache resources of the cache area, the method further comprises:
updating the historical access status data and the priority value corresponding to each cached page in the cache area.
7. The page cache method according to claim 1, characterized by further comprising:
when a page switching instruction is obtained, querying whether the target page corresponding to the page switching instruction is cached in the cache area, and if so, loading the target page cached in the cache area.
8. The page cache method according to any one of claims 1 to 7, characterized in that a cached page comprises: a user interface and/or page data.
9. A page cache device, characterized by comprising:
a historical access status data acquisition module, configured to obtain the historical access status data of the page currently accessed on a terminal;
a computing module, configured to calculate the priority value of the currently accessed page according to the historical access status data of the currently accessed page; and
a cached page determination module, configured to determine the pages cached in a cache area according to the priority value of the currently accessed page and the free cache resources of the cache area.
10. The device according to claim 9, characterized in that the historical access status data of the currently accessed page comprise:
the access frequency and the average render time of the currently accessed page during historical accesses on the terminal.
11. The device according to claim 10, characterized in that the historical access status data of the currently accessed page comprise: the access frequency and the average render time of the currently accessed page including the current access on the terminal.
12. The device according to claim 10, characterized in that the computing module is specifically configured to:
take the product of the access frequency and the average render time of the currently accessed page during historical accesses on the terminal as the priority value of the currently accessed page.
13. The device according to claim 9, characterized in that the cached page determination module comprises:
a priority value comparing unit, configured to compare, when the free cache resources of the cache area are less than the resources required by the currently accessed page, the priority value of the currently accessed page with the priority values of the pages cached in the cache area; and
a cached page determining unit, configured to release, according to the comparison result, at least one cached page in the cache area whose priority value is less than that of the currently accessed page, and to add the resources of the currently accessed page to the cache area.
14. The device according to claim 9, characterized by further comprising:
an update module, configured to update, after the pages cached in the cache area are determined according to the priority value of the currently accessed page and the free cache resources of the cache area, the historical access status data and the priority value corresponding to each cached page in the cache area.
15. The device according to claim 9, characterized by further comprising:
a loading module, configured to query, when a page switching instruction is obtained, whether the target page corresponding to the page switching instruction is cached in the cache area, and if so, to load the target page cached in the cache area.
CN201510867028.XA 2015-12-01 2015-12-01 Page cache method and device Active CN105512251B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510867028.XA CN105512251B (en) 2015-12-01 2015-12-01 Page cache method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510867028.XA CN105512251B (en) 2015-12-01 2015-12-01 Page cache method and device

Publications (2)

Publication Number Publication Date
CN105512251A true CN105512251A (en) 2016-04-20
CN105512251B CN105512251B (en) 2019-09-10

Family

ID=55720233

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510867028.XA Active CN105512251B (en) 2015-12-01 2015-12-01 Page cache method and device

Country Status (1)

Country Link
CN (1) CN105512251B (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106383878A (en) * 2016-09-12 2017-02-08 杭州迪普科技有限公司 Method and device for generating page menu cache files
CN106557434A (en) * 2016-10-28 2017-04-05 武汉斗鱼网络科技有限公司 A kind of interface caching method and system
CN107229397A (en) * 2017-06-08 2017-10-03 惠州Tcl移动通信有限公司 A kind of method, system, terminal and storage device for improving terminal fluency
CN108989380A (en) * 2018-05-23 2018-12-11 西安万像电子科技有限公司 Image data transfer method, apparatus and system
CN109190071A (en) * 2018-08-02 2019-01-11 浙江中农在线电子商务有限公司 Mobile terminal caching method and device
CN109543124A (en) * 2018-10-19 2019-03-29 中国平安人寿保险股份有限公司 A kind of page loading method, storage medium and server
CN109902241A (en) * 2019-02-01 2019-06-18 珠海天燕科技有限公司 A kind of loading method of resource, device and its equipment
CN110110263A (en) * 2019-05-13 2019-08-09 北京三快在线科技有限公司 Webpage display process, device, terminal and storage medium
CN112083859A (en) * 2020-09-02 2020-12-15 北京金堤征信服务有限公司 Multi-page data aggregation processing method and device
CN112612552A (en) * 2020-12-31 2021-04-06 五八有限公司 Application program resource loading method and device, electronic equipment and readable storage medium
CN112770175A (en) * 2019-11-05 2021-05-07 爱上电视传媒(北京)有限公司 Application method of EPG page in IPTV system
CN114691452A (en) * 2022-03-24 2022-07-01 北京百度网讯科技有限公司 Memory monitoring method, device, equipment and storage medium

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1585347A (en) * 2004-05-21 2005-02-23 中国科学院计算技术研究所 Network agent buffer substitution by using access characteristics of network users
CN102479249A (en) * 2010-11-26 2012-05-30 中国科学院声学研究所 Method for eliminating cache data of memory of embedded browser
CN102821113A (en) * 2011-06-07 2012-12-12 阿里巴巴集团控股有限公司 Cache method and system
CN103729438A (en) * 2013-12-30 2014-04-16 优视科技有限公司 Webpage preloading method and device
CN104731980A (en) * 2015-04-17 2015-06-24 吉林大学 Method and device for management of cache pages

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106383878B (en) * 2016-09-12 2019-05-07 杭州迪普科技股份有限公司 A kind of generation method and device of page menus cache file
CN106383878A (en) * 2016-09-12 2017-02-08 杭州迪普科技有限公司 Method and device for generating page menu cache files
CN106557434A (en) * 2016-10-28 2017-04-05 武汉斗鱼网络科技有限公司 A kind of interface caching method and system
CN107229397A (en) * 2017-06-08 2017-10-03 惠州Tcl移动通信有限公司 A kind of method, system, terminal and storage device for improving terminal fluency
CN108989380A (en) * 2018-05-23 2018-12-11 西安万像电子科技有限公司 Image data transfer method, apparatus and system
CN108989380B (en) * 2018-05-23 2021-06-04 西安万像电子科技有限公司 Image data transmission method, device and system
CN109190071A (en) * 2018-08-02 2019-01-11 浙江中农在线电子商务有限公司 Mobile terminal caching method and device
CN109543124A (en) * 2018-10-19 2019-03-29 中国平安人寿保险股份有限公司 A kind of page loading method, storage medium and server
CN109543124B (en) * 2018-10-19 2023-07-25 中国平安人寿保险股份有限公司 Page loading method, storage medium and server
CN109902241B (en) * 2019-02-01 2020-12-25 珠海天燕科技有限公司 Resource loading method, device and equipment
CN109902241A (en) * 2019-02-01 2019-06-18 珠海天燕科技有限公司 A kind of loading method of resource, device and its equipment
CN110110263A (en) * 2019-05-13 2019-08-09 北京三快在线科技有限公司 Webpage display process, device, terminal and storage medium
CN110110263B (en) * 2019-05-13 2020-07-28 北京三快在线科技有限公司 Webpage display method, device, terminal and storage medium
CN112770175A (en) * 2019-11-05 2021-05-07 爱上电视传媒(北京)有限公司 Application method of EPG page in IPTV system
CN112083859A (en) * 2020-09-02 2020-12-15 北京金堤征信服务有限公司 Multi-page data aggregation processing method and device
CN112612552A (en) * 2020-12-31 2021-04-06 五八有限公司 Application program resource loading method and device, electronic equipment and readable storage medium
CN114691452A (en) * 2022-03-24 2022-07-01 北京百度网讯科技有限公司 Memory monitoring method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN105512251B (en) 2019-09-10

Similar Documents

Publication Publication Date Title
CN105512251A (en) Page cache method and device
US11586451B2 (en) Resource management with dynamic resource policies
CN106886570B (en) Page processing method and device
CN111159436B (en) Method, device and computing equipment for recommending multimedia content
JP2019514110A (en) Realizing Load Address Prediction Using Address Prediction Table Based on Load Path History in Processor Based System
US8719486B2 (en) Pinning content in nonvolatile memory
US9585049B2 (en) Method for multipath scheduling based on a lookup table
US10862992B2 (en) Resource cache management method and system and apparatus
JP2020504865A (en) Application data processing method, apparatus, and storage medium
US9798827B2 (en) Methods and devices for preloading webpages
CN102750174A (en) Method and device for loading file
CN104808952A (en) Data caching method and device
CN108701079A (en) The system and method that flash memory with adaptive prefetching reads cache
CN108345478B (en) Application processing method and device, storage medium and electronic equipment
CN106557436A (en) The memory compression function enabled method of terminal and device
WO2020220971A1 (en) File loading method and apparatus, electronic device, and storage medium
WO2014175912A2 (en) Dirty data management for hybrid drives
CN104866339A (en) Distributed persistent management method, system and device of FOTA data
CN112631504A (en) Method and device for realizing local cache by using off-heap memory
CN109685712A (en) Image buffer storage and application method and device, terminal
Ahn et al. A compressed file system manager for flash memory based consumer electronics devices
CN108681469B (en) Page caching method, device, equipment and storage medium based on Android system
CN102902735B (en) A kind of IPTV IPTV searches for caching method and system
CN113010551B (en) Resource caching method and device
JPWO2009025066A1 (en) Display control device, display data server, and display control system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant