CN106681990B - Cache data prefetching method in a mobile cloud storage environment - Google Patents

Cache data prefetching method in a mobile cloud storage environment

Info

Publication number
CN106681990B
CN106681990B (Application CN201510744409.9A)
Authority
CN
China
Prior art keywords
file
user
subsequent
prefetched
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510744409.9A
Other languages
Chinese (zh)
Other versions
CN106681990A (en)
Inventor
周可
王桦
廖正霜
范瑞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huazhong University of Science and Technology
Original Assignee
Huazhong University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huazhong University of Science and Technology
Priority to CN201510744409.9A
Publication of CN106681990A
Application granted
Publication of CN106681990B
Active legal status
Anticipated expiration legal status

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/957Browsing optimisation, e.g. caching or content distillation
    • G06F16/9574Browsing optimisation, e.g. caching or content distillation of access to content, e.g. by caching

Abstract

The invention discloses a method for prefetching cached data in a mobile cloud storage environment. When accessing data on a mobile terminal, users typically slide the screen to switch between items in sequence, so file accesses tend to follow the order of the file list. A data prefetching strategy based on list order can therefore improve the effectiveness of prefetching and, in turn, cache efficiency. Such a strategy first requires a record of the user's access history for each file, so it suffers from a cold-start problem in initial use. At the beginning, prefetching starts with a prefetch length of 1, and the length is then adjusted according to whether the prefetched files are actually accessed. Once a history access record exists, the probability of accessing the list-order successor of the current file is computed to decide whether it should be prefetched. The present invention trades a small amount of network traffic for a large reduction in waiting delay.

Description

Cache data prefetching method in a mobile cloud storage environment
Technical field
The invention belongs to the field of cloud storage, and more particularly relates to a method for prefetching cached data in a mobile cloud storage environment.
Background art
With the development of the mobile Internet, mobile applications have grown explosively and the mobile terminal has become the new application platform. Users place ever higher demands on the storage space of mobile terminals and on the online sharing of terminal resources, and under this demand, accessing cloud storage services from terminal devices has become a clear trend. However, compared with the wired broadband Internet, the mobile Internet is characterized by high latency, low bandwidth, and instability, so the experience of using a cloud storage service over the mobile Internet differs greatly from the experience over high-speed wired broadband. Caching techniques can therefore be used to improve the performance of cloud storage services in a mobile Internet environment.
Data prefetching is one of the important factors that influence cache efficiency. When a user accesses a cloud storage service through a mobile terminal, personal data are stored in the cloud storage system. With a cache, there is a high probability that requested data can be served locally; without one, the data must be fetched from the cloud storage system over the mobile Internet, which increases the waiting delay when the user accesses data. Cache prefetching predicts, from the user's past access pattern, which file data the user is likely to access next, and fetches those data into the local cache in advance. This not only improves the cache hit rate but also reduces the waiting delay. Rather than fetching data from storage only after a cache miss, prefetching anticipates the miss and brings the data into the cache beforehand. The accuracy of predicting which data will be accessed in the future directly determines the effectiveness of prefetching, and in turn the performance of the whole storage system. Existing prefetching algorithms often fetch many data items that are never accessed within a short time; this fails to achieve the purpose of prefetching and instead wastes the user's network bandwidth.
Summary of the invention
In view of the above defects or improvement needs of the prior art, the present invention provides a method for prefetching cached data in a mobile cloud storage environment. Its purpose is to solve the technical problem that existing prefetching algorithms fetch many data items that are never accessed within a short time, thereby wasting the user's network bandwidth.
To achieve the above object, according to one aspect of the present invention, a method for prefetching cached data in a mobile cloud storage environment is provided. The method is applied in a mobile terminal and comprises the following steps:
(1) Receive a user request, request a file list from the server according to the user request, and display the file list to the user;
(2) After the user selects a file from the file list, judge whether the selected file exists in the cache of the mobile terminal; if it does, proceed to step (3), otherwise proceed to step (4);
(3) Extract the selected file directly from the cache of the mobile terminal, then proceed to step (6);
(4) Send an HTTP request to the server, the HTTP request carrying the URL information corresponding to the selected file;
(5) Receive from the server the file corresponding to the URL information; this file is the file the user selected;
(6) Judge whether the selected file has a history access record; if not, proceed to step (7); if so, proceed to step (10);
(7) Set a counter n = 1;
(8) Prefetch the n subsequent files of the selected file;
(9) Judge whether the subsequent file is accessed by the user next; if so, increase the number n of prefetched subsequent files by 1 and repeat step (8) until all subsequent files have been prefetched or the user stops accessing the file; otherwise return to step (7);
(10) Compute the probability of accessing the subsequent files of this file to determine whether, and how many, subsequent files should be prefetched: when the product of the probabilities of accessing the subsequent files is greater than 1/2, the subsequent files are prefetched from the server to the local cache in advance.
Preferably, the HTTP request includes the address of the server, the unique identifier of the user, and the specific path of the file selected by the user.
Preferably, step (10) includes the following sub-steps:
(10-1) Set the prefetch queue length m to 0;
(10-2) Compute the probability p1 = F_AB / F_A that the user accesses the subsequent file B next after the currently selected file A, where F_AB denotes the total number of times in the history access record that the subsequent file B of A was accessed after file A, and F_A denotes the total number of times file A was accessed in the history access record;
(10-3) Judge whether p1 > 1/2; if so, there is a probability of more than 50% that the subsequent file B of A will be accessed after file A, so add B to the prefetch queue, increase the prefetch queue length m by 1, and go to step (10-4); otherwise return to step (10-1);
(10-4) Compute the probability p2 of accessing the subsequent file C after the subsequent file B, in the same way as step (10-2);
(10-5) Judge whether p1*p2 > 1/2; if so, there is a probability of more than 50% that the subsequent files B and C of A will be accessed in file list order after file A, so also add C to the prefetch queue, increase the prefetch length m by 1, and go to step (10-6); otherwise go to step (10-7);
(10-6) Repeat step (10-5) until p1*p2*...*p(m-1) > 1/2 and p1*p2*...*p(m-1)*pm <= 1/2; the prefetch queue length at this point is m - 1;
(10-7) Prefetch the files in the prefetch queue in order;
(10-8) Judge whether all the prefetched files are next accessed in order by the user; if so, the prefetch was correct and the process ends; otherwise the prefetch was wrong, so return to step (10-1).
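For illustration, the queue construction of sub-steps (10-1) through (10-6) can be sketched in Python. The history tables (`pair_count`, `file_count`), the successor map, and every other identifier below are hypothetical, introduced only for this sketch; the patent itself specifies no implementation:

```python
def build_prefetch_queue(current, successor_of, pair_count, file_count):
    """Sketch of sub-steps (10-1)-(10-6): extend the prefetch queue while the
    running probability product p1*p2*...*pm stays above 1/2."""
    queue = []        # (10-1): prefetch queue, length m = 0
    product = 1.0     # running product of successor probabilities
    a = current
    while True:
        b = successor_of.get(a)            # list-order successor of a
        if b is None or file_count.get(a, 0) == 0:
            break                          # no successor, or no history for a
        product *= pair_count.get((a, b), 0) / file_count[a]  # (10-2): F_AB / F_A
        if product <= 0.5:                 # (10-6): stop once product <= 1/2
            break
        queue.append(b)                    # (10-3)/(10-5): still above 1/2
        a = b
    return queue

# Hypothetical history: A accessed 10 times, followed by B 8 of those times;
# B accessed 8 times, followed by C 5 of those times.
successor_of = {"A": "B", "B": "C", "C": None}
file_count = {"A": 10, "B": 8}
pair_count = {("A", "B"): 8, ("B", "C"): 5}
print(build_prefetch_queue("A", successor_of, pair_count, file_count))  # ['B']
```

With these counts, P(AB) = 8/10 > 1/2 puts B in the queue, but P(AB)*P(BC) = 0.8*0.625 = 0.5 is not above 1/2, so C is left out, matching the stopping rule of sub-step (10-6).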
According to another aspect of the present invention, a system for prefetching cached data in a mobile cloud storage environment is provided. The system is arranged in a mobile terminal and comprises:
a first module for receiving a user request, requesting a file list from the server according to the user request, and displaying the file list to the user;
a second module for judging, after the user selects a file from the file list, whether the selected file exists in the cache of the mobile terminal; if it does, control passes to the third module, otherwise to the fourth module;
a third module for extracting the selected file directly from the cache of the mobile terminal and then passing control to the sixth module;
a fourth module for sending an HTTP request to the server, the HTTP request carrying the URL information corresponding to the selected file;
a fifth module for receiving from the server the file corresponding to the URL information, this file being the file the user selected;
a sixth module for judging whether the selected file has a history access record; if not, control passes to the seventh module, otherwise to the tenth module;
a seventh module for setting a counter n = 1;
an eighth module for prefetching the n subsequent files of the selected file;
a ninth module for judging whether the subsequent file is accessed by the user next; if so, the number n of prefetched subsequent files is increased by 1 and the eighth module is repeated until all subsequent files have been prefetched or the user stops accessing the file; otherwise control returns to the seventh module;
a tenth module for computing the probability of accessing the subsequent files of this file to determine whether, and how many, subsequent files should be prefetched, wherein when the product of the probabilities of accessing the subsequent files is greater than 1/2, the subsequent files are prefetched from the server to the local cache in advance.
In general, compared with the prior art, the above technical scheme conceived by the present invention achieves the following beneficial effects:
(1) High prefetch precision, with a guarantee that prefetched data will be accessed by the user within a short time: the list-based prefetching strategy defined in steps (7)-(9) and (10) effectively exploits the temporal and spatial locality of the user's accesses to the file list, ensuring a high hit rate and high prefetch precision;
(2) Lower access delay: since the present invention actively predicts the data the user may access in the future and fetches them into the local cache, the response time of user requests is effectively reduced and the access delay is shortened.
Brief description of the drawings
Fig. 1 is a flow chart of the cache data prefetching method in a mobile cloud storage environment according to the present invention.
Fig. 2 is a schematic diagram of the file menu list displayed by the cloud storage client on a mobile device.
Detailed description of the embodiments
In order to make the objectives, technical solutions, and advantages of the present invention clearer, the present invention is further described below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here serve only to illustrate the present invention and are not intended to limit it. In addition, the technical features involved in the various embodiments of the present invention described below can be combined with each other as long as they do not conflict.
Because the screen of a mobile device is small, data presentation is limited. The most common presentation is a menu list, each item of which represents one of the user's files. When the user selects an item in the menu list, this indicates a request to browse that file, and the file content then occupies the whole screen of the phone or tablet. To switch files, the user must exit the current browsing state, return to the previous file menu list, and select the next file to browse. For this reason, modern smart mobile devices provide a sliding gesture: the user can swipe up/down or left/right to switch to the logically adjacent file. Moreover, when storing personal file data, users usually organize files into categories for more convenient lookup and browsing, which makes switching between files by swiping the screen even more useful. Aiming at this characteristic pattern of browsing file data on mobile devices, the present invention provides an effective cached-data prefetching strategy.
The technical terms of the invention are first explained below:
Subsequent file: the file that comes next after a given file in the logical order of the list.
Prefetch length: the number of files fetched from the server in advance. For example, a prefetch length of 1 means that when file A is fetched, its subsequent file B is also fetched from the server in advance; a prefetch length of 2 means that when file A is fetched, its subsequent file B and B's subsequent file C are prefetched as well.
The invention proposes a method for prefetching cached data in a mobile cloud storage environment. It is used to acquire data more efficiently on mobile devices with this characteristic file access pattern, to improve cache efficiency, and thereby to improve the user experience.
According to the data access pattern on mobile devices, it can be assumed that file accesses on smart devices usually follow a certain logical order; that is, the user is likely to access files in the order in which they appear in the list. We can therefore prefetch files that are adjacent in list order, which we call sequential prefetching based on the list.
In this prefetching strategy, the user's history access record must be maintained; from it, the probability that the list-order successor of each file is accessed after that file is computed, and prefetching decisions are made according to this probability value.
Initially there is no history access record, but based on the data access pattern it is assumed that the user accesses files sequentially. The prefetch length (number of prefetched files) is first set to 1, and the subsequent file of the current file in list order is prefetched into the local cache. If the prefetched file is then accessed, the prefetch length is increased by 1 to 2, and the next two files in list order are prefetched from the server; otherwise prefetching continues with length 1. When prefetching with length 2, if the two prefetched files are accessed sequentially, the prefetch length is again increased by 1 and the three files following the current file in list order are prefetched from the server into the local cache; but if one of the prefetched files is not accessed in order, the prefetch length is reset to 1 after the out-of-order access, and prefetching continues. The above steps are repeated until the access ends.
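A minimal sketch of this cold-start policy, under the assumption that the caller reports after each round whether all prefetched files were accessed in list order (the class and method names are illustrative, not from the patent):

```python
class ColdStartPrefetcher:
    """Adaptive prefetch length with no history: start at 1, grow by 1 after a
    fully sequential round of hits, and reset to 1 after any out-of-order access."""

    def __init__(self):
        self.length = 1  # initial prefetch length

    def next_length(self, all_accessed_in_order):
        # Feedback from the last round decides the next prefetch length.
        if all_accessed_in_order:
            self.length += 1   # every prefetched file was used: prefetch one more
        else:
            self.length = 1    # a miss in the sequence: fall back to one file
        return self.length
```

For example, two fully sequential rounds grow the length 1 -> 2 -> 3, and one out-of-order round drops it back to 1.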
After the user has used the mobile client many times, a rich history access record is available. When files with records are accessed later, probabilities can be computed to decide whether their list-order successors should be prefetched. Let the currently accessed file be A, let B be the subsequent file of A in list order, let F_A be the total number of times file A has been accessed in the history access record, and let F_AB be the total number of times the subsequent file B was accessed after file A in the history access record. Then the probability P(AB) of accessing the subsequent file B after the current access to A is P(AB) = F_AB / F_A.
When P(AB) > 1/2, there is a probability of more than 50% that the subsequent file B of A will be accessed in list order after accessing file A, so file B can be prefetched. Similarly to the initial strategy, we first set the maximum prefetch length to 2: after deciding that B can be prefetched, we further check whether B's subsequent file C in list order can be prefetched together with B. If the currently accessed file is A, the probability of next accessing B and C in order is P(AB)*P(BC). When P(AB)*P(BC) > 1/2, there is a probability of more than 50% that the subsequent files B and C of A will be accessed in list order after accessing file A, so B and C are prefetched together; otherwise only file B is prefetched. The above computation is repeated, adding one more file to the prefetch set each time, until the probability product is less than or equal to 1/2; only then is the final prefetch queue determined.
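As a worked numeric illustration of this rule, with hypothetical history counts (not taken from the patent):

```python
# Hypothetical counts: A accessed 10 times, B followed A 8 times;
# B accessed 8 times, C followed B 5 times.
F_A, F_AB = 10, 8
F_B, F_BC = 8, 5

p_ab = F_AB / F_A        # P(AB) = 0.8  > 1/2, so B is prefetched
p_bc = F_BC / F_B        # P(BC) = 0.625
product = p_ab * p_bc    # P(AB)*P(BC) = 0.5 <= 1/2, so C is not prefetched with B
```

The product drops to 1/2 at the second step, so the final prefetch queue here contains only B.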
In the prefetching strategy, the setting of the prefetch length must also take the following aspects into account:
(1) If the prefetch length is fixed at 1, i.e. only one file is fetched from the server at a time, then the user must issue a request to the server to fetch prefetch data for every file accessed. This makes the number of server accesses excessive and consumes more network traffic. On the other hand, if the user switches between files quickly, for example when simply browsing pictures, the file being prefetched at the moment of switching may not yet be fully retrieved into the local cache when only one file is prefetched at a time, so access delay may still occur.
(2) If the prefetch length is too large, then, since it cannot be determined with certainty that the successors of the currently accessed file will be accessed next, the prefetched files may never be accessed; the more files are prefetched, the more user traffic is wasted. On the other hand, since cache space is limited, prefetching more data means evicting data already in the cache. Frequently accessed data may be replaced by prefetched data that turn out to be useless, which would instead reduce the performance of the system.
Taking the above into account, a proper prefetch length can be found by experiment during implementation, so that the number of server accesses and the access delay described in (1) are reduced while as few useless files as possible are prefetched, saving user traffic and improving the user experience.
The present invention uses caching technology on the mobile terminal to improve access to cloud storage services, and further improves cache efficiency by means of a data prefetching strategy based on list order.
As shown in Fig. 1, the method for prefetching cached data in a mobile cloud storage environment according to the present invention is applied in a mobile terminal and comprises the following steps:
(1) Receive a user request, request a file list from the server according to the user request, and display the file list to the user, as shown in Fig. 2;
(2) After the user selects a file from the file list, judge whether the selected file exists in the cache of the mobile terminal; if it does, proceed to step (3), otherwise proceed to step (4);
(3) Extract the selected file directly from the cache of the mobile terminal, then proceed to step (6);
(4) Send an HTTP request to the server, the HTTP request carrying the URL information corresponding to the selected file; specifically, the HTTP request includes the address of the server, the unique identifier of the user, and the specific path of the selected file;
(5) Receive from the server the file corresponding to the URL information; this file is the file the user selected;
(6) Judge whether the selected file has a history access record; if not, proceed to step (7); if so, proceed to step (10);
(7) Set a counter n = 1;
(8) Prefetch the n subsequent files of the selected file;
(9) Judge whether the subsequent file is accessed by the user next; if so, increase the number n of prefetched subsequent files by 1 and repeat step (8) until all subsequent files have been prefetched or the user stops accessing the file; otherwise return to step (7);
(10) Compute the probability of accessing the subsequent files of this file to determine whether, and how many, subsequent files should be prefetched: when the product of the probabilities of accessing the subsequent files is greater than 1/2, the subsequent files are prefetched from the server to the local cache in advance. This step specifically includes the following sub-steps:
(10-1) Set the prefetch queue length m to 0;
(10-2) Compute the probability p1 = F_AB / F_A that the user accesses the subsequent file B next after the currently selected file A, where F_AB denotes the total number of times in the history access record that the subsequent file B of A was accessed after file A, and F_A denotes the total number of times file A was accessed in the history access record;
(10-3) Judge whether p1 > 1/2; if so, there is a probability of more than 50% that the subsequent file B of A will be accessed after file A, so add B to the prefetch queue, increase the prefetch queue length m by 1, and go to step (10-4); otherwise return to step (10-1);
(10-4) Compute the probability p2 of accessing the subsequent file C after the subsequent file B, in the same way as step (10-2);
(10-5) Judge whether p1*p2 > 1/2; if so, there is a probability of more than 50% that the subsequent files B and C of A will be accessed in file list order after file A, so also add C to the prefetch queue, increase the prefetch length m by 1, and go to step (10-6); otherwise go to step (10-7);
(10-6) Repeat step (10-5) until p1*p2*...*p(m-1) > 1/2 and p1*p2*...*p(m-1)*pm <= 1/2; the prefetch queue length at this point is m - 1;
(10-7) Prefetch the files in the prefetch queue in order;
(10-8) Judge whether all the prefetched files are next accessed in order by the user; if so, the prefetch was correct and the process ends; otherwise the prefetch was wrong, so return to step (10-1).
With the above method of the present invention, which addresses the characteristics of users accessing cloud storage data through a client on a mobile terminal, this list-order-based data prefetching method can reduce access delay and effectively improve the efficiency of the whole caching system.
Those skilled in the art will readily understand that the above description covers only preferred embodiments of the present invention and is not intended to limit it; any modifications, equivalent substitutions, and improvements made within the spirit and principles of the present invention shall fall within the protection scope of the present invention.

Claims (4)

1. A method for prefetching cached data in a mobile cloud storage environment, applied in a mobile terminal, characterized in that the method comprises the following steps:
(1) receiving a user request, requesting a file list from the server according to the user request, and displaying the file list to the user;
(2) after the user selects a file from the file list, judging whether the selected file exists in the cache of the mobile terminal; if it does, proceeding to step (3), otherwise proceeding to step (4);
(3) extracting the selected file directly from the cache of the mobile terminal, then proceeding to step (6);
(4) sending an HTTP request to the server, the HTTP request carrying the URL information corresponding to the selected file;
(5) receiving from the server the file corresponding to the URL information, this file being the file the user selected;
(6) judging whether the selected file has a history access record; if not, proceeding to step (7); if so, proceeding to step (10);
(7) setting a counter n = 1;
(8) prefetching the n subsequent files of the selected file;
(9) judging whether the subsequent file is accessed by the user next; if so, increasing the number n of prefetched subsequent files by 1 and repeating step (8) until all subsequent files have been prefetched or the user stops accessing the file; otherwise returning to step (7);
(10) computing the probability of accessing the subsequent files of this file and, based on the probability, determining whether the subsequent files should be prefetched and how many files should be prefetched; specifically, if the product of the probabilities of accessing n subsequent files is greater than 1/2 and the product of the probabilities of accessing n+1 subsequent files is less than or equal to 1/2, prefetching the n subsequent files from the server to the local cache in advance.
2. The prefetching method according to claim 1, characterized in that the HTTP request includes the address of the server, the unique identifier of the user, and the specific path of the file selected by the user.
3. The prefetching method according to claim 1, characterized in that step (10) includes the following sub-steps:
(10-1) setting the prefetch queue length m to 0;
(10-2) computing the probability p1 = F_AB / F_A that the user accesses the subsequent file B next after the currently selected file A, where F_AB denotes the total number of times in the history access record that the subsequent file B of A was accessed after file A, and F_A denotes the total number of times file A was accessed in the history access record;
(10-3) judging whether p1 > 1/2; if so, there is a probability of more than 50% that the subsequent file B of A will be accessed after file A, so adding B to the prefetch queue, increasing the prefetch queue length m by 1, and going to step (10-4); otherwise returning to step (10-1);
(10-4) computing the probability p2 of accessing the subsequent file C after the subsequent file B, in the same way as step (10-2);
(10-5) judging whether p1*p2 > 1/2; if so, there is a probability of more than 50% that the subsequent files B and C of A will be accessed in file list order after file A, so also adding C to the prefetch queue, increasing the prefetch length m by 1, and going to step (10-6); otherwise going to step (10-7);
(10-6) repeating step (10-5) until p1*p2*...*p(m-1) > 1/2 and p1*p2*...*p(m-1)*pm <= 1/2, the prefetch queue length at this point being m - 1;
(10-7) prefetching the files in the prefetch queue in order;
(10-8) judging whether all the prefetched files are next accessed in order by the user; if so, the prefetch was correct and the process ends; otherwise the prefetch was wrong, so returning to step (10-1).
4. A system for prefetching cached data in a mobile cloud storage environment, arranged in a mobile terminal, characterized in that the system comprises:
a first module for receiving a user request, requesting a file list from the server according to the user request, and displaying the file list to the user;
a second module for judging, after the user selects a file from the file list, whether the selected file exists in the cache of the mobile terminal; if it does, control passes to the third module, otherwise to the fourth module;
a third module for extracting the selected file directly from the cache of the mobile terminal and then passing control to the sixth module;
a fourth module for sending an HTTP request to the server, the HTTP request carrying the URL information corresponding to the selected file;
a fifth module for receiving from the server the file corresponding to the URL information, this file being the file the user selected;
a sixth module for judging whether the selected file has a history access record; if not, control passes to the seventh module, otherwise to the tenth module;
a seventh module for setting a counter n = 1;
an eighth module for prefetching the n subsequent files of the selected file;
a ninth module for judging whether the subsequent file is accessed by the user next; if so, the number n of prefetched subsequent files is increased by 1 and the eighth module is repeated until all subsequent files have been prefetched or the user stops accessing the file; otherwise control returns to the seventh module;
a tenth module for computing the probability of accessing the subsequent files of this file and, based on the probability, determining whether the subsequent files should be prefetched and how many files should be prefetched; specifically, if the product of the probabilities of accessing n subsequent files is greater than 1/2 and the product of the probabilities of accessing n+1 subsequent files is less than or equal to 1/2, the n subsequent files are prefetched from the server to the local cache in advance.
CN201510744409.9A 2015-11-05 2015-11-05 Cache data prefetching method in a mobile cloud storage environment Active CN106681990B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510744409.9A CN106681990B (en) 2015-11-05 2015-11-05 Cache data prefetching method in a mobile cloud storage environment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510744409.9A CN106681990B (en) 2015-11-05 2015-11-05 Cache data prefetching method in a mobile cloud storage environment

Publications (2)

Publication Number Publication Date
CN106681990A CN106681990A (en) 2017-05-17
CN106681990B true CN106681990B (en) 2019-10-25

Family

ID=58857198

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510744409.9A Active CN106681990B (en) 2015-11-05 2015-11-05 Data cached forecasting method under a kind of mobile cloud storage environment

Country Status (1)

Country Link
CN (1) CN106681990B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107294990B (en) * 2017-07-04 2020-06-26 中国联合网络通信集团有限公司 Information encryption method and device
CN110018970B (en) * 2018-01-08 2023-07-21 腾讯科技(深圳)有限公司 Cache prefetching method, device, equipment and computer readable storage medium
CN108667916B (en) * 2018-04-24 2021-08-13 百度在线网络技术(北京)有限公司 Data access method and system for Web application
CN108763104B (en) * 2018-05-23 2022-04-08 北京小米移动软件有限公司 Method and device for pre-reading file page and storage medium
CN109698865A (en) * 2018-12-26 2019-04-30 苏州博纳讯动软件有限公司 A kind of cloud application caching method based on access prediction
CN110245094B (en) * 2019-06-18 2020-12-29 华中科技大学 Block-level cache prefetching optimization method and system based on deep learning
CN111818122B (en) * 2020-05-28 2022-03-01 北京航空航天大学 Flow fairness-based wide area network data prefetching method
CN114968076A (en) * 2021-02-25 2022-08-30 华为技术有限公司 Method, apparatus, medium, and program product for storage management

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101183908A (en) * 2007-12-24 2008-05-21 深圳市茁壮网络技术有限公司 Data prefetching method and communication system and related device
CN102737037A (en) * 2011-04-07 2012-10-17 北京搜狗科技发展有限公司 Webpage pre-reading method, device and browser
US8380680B2 (en) * 2010-06-23 2013-02-19 International Business Machines Corporation Piecemeal list prefetch
CN104580437A (en) * 2014-12-30 2015-04-29 创新科存储技术(深圳)有限公司 Cloud storage client and high-efficiency data access method thereof

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Data prefetching supporting real-time transaction processing in mobile environments; Li Guohui et al.; Chinese Journal of Computers; 2008-10-15 (Issue 10); pp. 1841-1845 *

Also Published As

Publication number Publication date
CN106681990A (en) 2017-05-17

Similar Documents

Publication Publication Date Title
CN106681990B (en) Data cached forecasting method under a kind of mobile cloud storage environment
US10182127B2 (en) Application-driven CDN pre-caching
CN106161569B (en) Recommendation, buffer replacing method and the equipment of Web content
CN105210352B (en) Intelligent content based on fingerprint prefetches
KR101728927B1 (en) Pre-fetching content items based on social distance
CN101741986B (en) Page cache method for mobile communication equipment terminal
CN106599239A (en) Webpage content data acquisition method and server
US20080222242A1 (en) Method and System for Improving User Experience While Browsing
US20160063577A1 (en) Handling of real-time advertisement with content prefetching
WO2008103639A1 (en) System and method for preloading content on the basis of user context
CN107197359B (en) Video file caching method and device
JP2007510224A (en) A method for determining the segment priority of multimedia content in proxy cache
JP2001265641A (en) System and method for intellectual fetch and delivery of web content
CN1234086C (en) System and method for high speed buffer storage file information
US9922006B1 (en) Conditional promotion through metadata-based priority hinting
CN106649313A (en) Method and equipment for processing cache data
US10341454B2 (en) Video and media content delivery network storage in elastic clouds
US9607322B1 (en) Conditional promotion in content delivery
US7571446B2 (en) Server, computer system, object management method, server control method, computer program
CN110020290A (en) Web page resources caching method, device, storage medium and electronic device
KR101132220B1 (en) Method, system and computer-readable recording medium for providing web page using cache
US9626344B1 (en) Conditional promotion through packet reordering
CN112887376B (en) feed stream de-duplication method, device, system, computer equipment and storage medium
Pons Web-application centric object prefetching
Hussain et al. Intelligent prefetching at a proxy server

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant