CN108810139A - A wireless caching method aided by Monte Carlo tree search - Google Patents

A wireless caching method aided by Monte Carlo tree search

Info

Publication number
CN108810139A
Authority
CN
China
Prior art keywords
user
node
file
tree
Monte Carlo
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810599991.8A
Other languages
Chinese (zh)
Other versions
CN108810139B (en)
Inventor
高鹏宇
杜洋
董彬虹
祝武勇
崔亚迪
陈特
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China
Priority to CN201810599991.8A
Publication of CN108810139A
Application granted
Publication of CN108810139B
Expired - Fee Related
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/50Network services
    • H04L67/56Provisioning of proxy services
    • H04L67/568Storing data temporarily at an intermediate stage, e.g. caching
    • H04L67/5681Pre-fetching or pre-delivering data based on network characteristics
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10Complex mathematical operations
    • G06F17/15Correlation function computation including computation of convolution operations

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Optimization (AREA)
  • Computational Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Mathematical Analysis (AREA)
  • Data Mining & Analysis (AREA)
  • Pure & Applied Mathematics (AREA)
  • Algebra (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Mobile Radio Communication Systems (AREA)

Abstract

The invention discloses a wireless caching method aided by Monte Carlo tree search, belonging to the field of mobile communications. It mainly concerns a method by which a base station in a wireless network pre-caches, during idle periods, the content that nearby users are likely to request. To solve the problems of existing caching schemes, the invention proposes a wireless caching method based on a contextual multi-armed bandit model aided by Monte Carlo tree search, which learns user preferences online. According to the contextual features of the users, the method learns online the current degree to which users favor each file, i.e., the popularity of the files. At the same time, thanks to the efficient data processing of Monte Carlo tree search, the method delivers good caching performance even as the number of video files keeps growing in practical communication scenarios. Furthermore, because the invention considers user context features and file features simultaneously, and clusters each of them, the cold-start problem can be effectively mitigated.

Description

A wireless caching method aided by Monte Carlo tree search
Technical field
The invention belongs to the field of mobile communications and mainly concerns methods by which a base station pre-caches, during idle periods of the wireless network, content that nearby users are likely to request. Specifically, the method is a wireless caching method based on a Monte-Carlo tree search aided contextual multi-armed bandit (MCTS-CMAB) model.
Background technology
In recent years, as mobile devices with multimedia capabilities (such as smartphones and tablet computers) have become widespread, new types of wireless services have emerged in large numbers, such as WeChat, online video, Taobao, and Weibo. Wireless mobile communication has thus expanded from its original purpose of voice calls into every aspect of entertainment, office work, and social life. At the same time, this has driven the rapid growth of data traffic in wireless networks.
The explosive growth of mobile data traffic places a huge burden on existing cellular networks; especially during peak communication periods, delays and interruptions occur easily, degrading the user experience. Meanwhile, studies show that mobile video traffic will account for a very large proportion of future mobile data traffic. Therefore, based on the characteristics of video content and the availability of cheap hard-disk storage, scholars have proposed a solution known as wireless caching. Its basic idea is to equip wireless access points with large-capacity memory and to pre-cache popular videos at the access points during off-peak periods (for example at night). In this way, when a user requests a video file, if the requested file is already in the cache, the access point can transmit the file to the user directly, keeping the traffic local. This not only greatly reduces the delay of data in the backhaul link and core network, but also reduces the load on the backhaul link and core network during peak periods. At the same time, by reducing the occupancy of backhaul link capacity, more network resources can be released for other services, thereby indirectly improving the throughput of the system.
To improve the probability that a user finds the video file of interest in a nearby terminal cache and that it is transmitted successfully, a good caching policy is particularly important, i.e., deciding which popular files should be pre-cached by the terminals. Among existing caching techniques, equal-probability random caching (Equal Probability Random Caching, EPRC) and the cut-off random caching strategy (Cut-off Random Caching, CTRC) are the two most common schemes. In equal-probability random caching, all files are randomly cached by users with the same probability. In the cut-off random caching strategy, files with lower request probability are cut off from the library to form a candidate sub-library for caching, and users randomly cache files from this sub-library; its cache hit rate is therefore superior to that of equal-probability random caching.
Nevertheless, neither of these two caching schemes can be used in practical systems, mainly for the following reasons. 1. The above caching methods assume that file popularity obeys a fixed distribution (usually a Zipf distribution), whereas in real communication the popularity of files changes continuously over time. More importantly, user preferences and file popularity are inseparable, yet the original caching schemes do not take this relationship into account. 2. The contextual features (context) of users, such as age and gender, are not considered. File popularity should be closely related to user preferences, and users with different features will necessarily have different preferences for files. 3. File features (content features), such as comedy or art-house films, are not considered. The number of files in the network grows day by day; analyzing each file individually with current caching methods does not match the scale of real data. 4. The cold-start problem (cold start): due to the lack of prior knowledge about files or users, existing caching methods cannot reach their own optimal performance within a short time.
Invention content
To solve the above problems, the present invention proposes a wireless caching method that uses a Monte Carlo tree search aided contextual multi-armed bandit model to learn user preferences online. According to the contextual features of the users, the method learns online the current degree to which users favor each file, i.e., the popularity of the files. At the same time, thanks to the efficient data processing of Monte Carlo tree search, the method achieves good caching performance even as the number of video files keeps growing in practical communication scenarios. Furthermore, because the invention considers user context features and file features simultaneously, and clusters each of them, the cold-start problem can be effectively mitigated.
To describe the content of the invention conveniently, the models used in the invention are introduced first, and the terms used in the invention are defined.
System model: within the radio coverage area, the base station (Base Station, BS) is the transceiver station that exchanges information with the terminals. The invention considers equipping the base station with a memory capable of caching a certain number of files, in which popular files are cached. Assume the file set is F = {f1, f2, ...} and that all files have the same size. Considering the big-data scenario of current networks, the file set keeps expanding over time, so the size of the file set |F| is assumed to be infinite. The capacity of the base station is described as the maximum number M of files from the file set that the base station can cache. Meanwhile, to stay close to realistic scenarios, the invention takes user mobility into account: the number of users served by the base station at the current time is denoted N(t), where t = 1, 2, ..., T is the time slot index and T denotes the end time, i.e., the total number of time slots. The objective of the invention is to optimize the set of cached files at each moment so that the users' requests for the cached files are maximized at each moment.
The Monte Carlo tree employed in the invention is a binary tree whose nodes are written as (a_i, h, n), where a_i is the user sub-feature-space type, i.e., the label of the user class; h is the depth of the node in the tree; and n is the label of the node among all nodes at depth h. Files are placed into the nodes of the Monte Carlo tree by clustering their file features, so that the file features within each node differ little from one another.
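Purely as an illustration of the structure described above, the following Python sketch shows one possible representation of a tree node (a_i, h, n) and one possible way of splitting the files of a node into two groups by clustering their feature vectors; the names (TreeNode, split_files_by_feature) and the simple 2-means clustering are assumptions made for the example and are not specified by the invention.

```python
# Illustrative sketch only: a possible in-memory representation of a Monte Carlo tree
# node (a_i, h, n) and a simple 2-means split of a node's files by their feature vectors.
from dataclasses import dataclass, field
from typing import Dict, List, Optional
import numpy as np

@dataclass
class TreeNode:
    user_type: int                 # a_i: label of the user sub-feature space
    depth: int                     # h: depth of the node in the binary tree
    label: int                     # n: label of the node among the nodes at depth h
    files: List[int] = field(default_factory=list)  # ids of the files clustered into this node
    avg_reward: float = 0.0        # running average caching reward of the node
    times_used: int = 0            # T_t(a_i, h, n): number of times the node has been used
    left: Optional["TreeNode"] = None    # child (a_i, h+1, 2n-1)
    right: Optional["TreeNode"] = None   # child (a_i, h+1, 2n)

    def is_leaf(self) -> bool:
        return self.left is None and self.right is None

def split_files_by_feature(files: List[int], features: Dict[int, np.ndarray], seed: int = 0):
    """Split a node's files (at least two) into two groups with similar features (2-means)."""
    rng = np.random.default_rng(seed)
    X = np.stack([features[f] for f in files]).astype(float)
    centers = X[rng.choice(len(files), size=2, replace=False)].copy()
    for _ in range(20):                                   # a few Lloyd iterations
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        assign = dist.argmin(axis=1)
        for k in range(2):
            if np.any(assign == k):
                centers[k] = X[assign == k].mean(axis=0)
    group0 = [f for f, a in zip(files, assign) if a == 0]
    group1 = [f for f, a in zip(files, assign) if a == 1]
    return group0, group1
```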
The technical solution of the invention is a wireless caching method aided by Monte Carlo tree search, comprising the following steps (a code sketch of the per-slot loop is given after step 11):
Step 1: according to the user context features, divide the feature space into m_T user sub-feature spaces;
Step 2: at t = 1, initialize m_T binary trees Γ, one binary tree for each sub-feature space, where Γ^{a_i} denotes the binary tree of user sub-feature space a_i, i = 1, 2, ..., m_T; meanwhile, initialize the reward values of node (a_i, 1, 1) and node (a_i, 1, 2); here (a_i, 0, 1) denotes the root node of the binary tree of user sub-feature space a_i, (a_i, 1, 1) denotes the 1st node of the 1st generation of this binary tree, and (a_i, 1, 2) denotes the 2nd node of the 1st generation of this binary tree;
Step 3: at time t, obtain the number N(t) of all users of this base station and extract the context feature of each of them, where the context feature of the j-th user is denoted x_j(t);
Step 4: according to the current context feature of each user, assign each user to the corresponding user sub-feature space;
Step 5: if the j-th user belongs to user sub-feature space a_i, perform an optimal path search on the tree Γ^{a_i} to obtain the end leaf node with the highest reward value for the j-th user, selecting a path at random when reward values are identical, and take all files in that leaf node as the recommended cache files of the j-th user at time t; repeat step 5 until all users of the base station at the current time have been traversed;
Step 6: among the recommended cache files of all users, select the M files with the highest occurrence frequency and put them into the cache file set C of the current time;
Step 7: count the number of requests of each user at time t for each file in the cache file set C, where the number of requests of the j-th user for file m of the cache file set C is denoted d_{j,m}, j = 1, 2, ..., N(t), m = 1, 2, ..., M;
Step 8: for the j-th user, on the binary tree Γ^{a_i} of its feature space a_i, perform a path backtracking and update the reward value of each node and the number of times each node has been used; repeat step 8 until all users have been traversed;
Step 9: for the tree Γ^{a_i} of each user sub-feature space a_i, judge whether a leaf node should be expanded, and if a leaf node needs to be expanded, grow the next generation of leaf nodes for that leaf node;
Step 10: repeat step 9 until the binary trees of all user sub-feature spaces have been traversed;
Step 11: set t = t + 1 and return to step 3.
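For illustration only, the per-slot procedure of steps 3 to 11 could be organized as in the following sketch. The helpers best_leaf, backtrack_update and maybe_expand are hypothetical stand-ins for the optimal path search, the backtracking update and the leaf expansion detailed later in the description, and observe_requests is an assumed callback returning the observed request counts for the cached files.

```python
# Illustrative sketch of one time slot (steps 3-11). best_leaf, backtrack_update and
# maybe_expand are hypothetical stand-ins for the procedures detailed later on.
from collections import Counter

def caching_slot(t, user_types, trees, M, observe_requests, features, eta):
    """user_types maps user j -> sub-feature space a_i (steps 3-4 already applied);
    observe_requests(t, cache_set) returns {file id: requests for that cached file at time t}."""
    # Step 5: optimal path search per user -> recommended cache files.
    recommended = {j: best_leaf(trees[a_i], t) for j, a_i in user_types.items()}

    # Step 6: the M most frequently recommended files form the cache set C of this slot.
    counts = Counter(f for leaf, _ in recommended.values() for f in leaf.files)
    cache_set = [f for f, _ in counts.most_common(M)]

    # Step 7: observe the request counts d_{j,m} for the cached files during this slot.
    demands = observe_requests(t, cache_set)

    # Step 8: backtracking update of rewards along each user's optimal path.
    for j, a_i in user_types.items():
        _, path = recommended[j]
        backtrack_update(path, cache_set, demands, t)

    # Steps 9-10: expand leaves of the trees used in this slot, if warranted.
    for a_i in set(user_types.values()):
        maybe_expand(trees[a_i], t, features, eta)

    return cache_set
```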
Further, in step 8 the reward value of each node is computed as follows: count the number of times that the files of this node which are cached by the base station are requested by users at time t, and take the sum of these counts as the caching reward r_t(a_i, h, n*) of the node at that moment; the reward of the node is then updated as the average of its per-slot caching rewards over the T_t(a_i, h, n*) slots in which it has been used, where T_t(a_i, h, n*) denotes the total number of times the tree node (a_i, h, n*) has been used up to time t, i.e., the total number of times up to time t that files of this node have been cached by the base station.
Further, the reward value of each node is U_t(a_i, h, n) = μ̂_t(a_i, h, n) + √(c·ln t / T_t(a_i, h, n)) + l_1·ρ^h, where μ̂_t(a_i, h, n) is the average reward of the node, T_t(a_i, h, n) is the number of times the node has been used, h is the depth of the node, and c, l_1 > 0, 0 < ρ < 1 are constants.
Further, the method in step 9 for judging whether a leaf node should be expanded is:
Step 1: compute the leaf expansion threshold η_h(t);
Step 2: if T_t(a_i, h, n) ≥ η_h(t) and (a_i, h, n) is a leaf node of the tree Γ^{a_i}, expand this leaf node; otherwise, do not expand it.
A wireless caching method aided by Monte Carlo tree search, the method comprising:
Step 1: classify all users connected to this base station according to their user context features (the types of files the users access);
Step 2: grow a binary tree for each user class according to its user context features, the binary tree being a classification index of all files served by this base station within a period of time, with finer classification of the files that users access more often;
Step 3: for each class of users, select an end node of its binary tree and take the files contained in that node as the recommended files, the selection criterion of the end node being that the click count of the files contained in the selected end node is higher than the click count of the files contained in the other end nodes;
Step 4: combine the file sets recommended for all classes of users and use the result as the cache files of this base station.
Further, the method in step 2 for growing a binary tree for each class of users is as follows (a code sketch of this growth procedure is given after step 2.4):
Step 2.1: take all files transmitted by this base station within a period of time as the root node of the binary tree, and use a clustering method to divide the files in the root node into two classes, which become the two child nodes;
Step 2.2: compare the click counts of this class of users on the files contained in the two child nodes, and select the child node with the larger click count as the growth node;
Step 2.3: use the clustering method to divide the files contained in the growth node selected in step 2.2 into two classes, which become the next generation of child nodes, and select the growth node again by the method of step 2.2;
Step 2.4: continue growing in the same way as step 2.3 until the number of clicks by this class of users on the files contained in some growth node falls below a certain threshold.
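As a sketch only, reusing the assumed TreeNode and split_files_by_feature helpers from the earlier example, the growth procedure of steps 2.1 to 2.4 could be written as follows; the click-count bookkeeping and the stopping threshold are illustrative assumptions.

```python
# Illustrative sketch of steps 2.1-2.4: grow a per-class binary tree by repeatedly
# clustering the files of the child node that this user class clicks most.
def grow_class_tree(user_type, all_files, features, clicks, threshold):
    """clicks[f] = number of clicks on file f by this class of users."""
    root = TreeNode(user_type=user_type, depth=0, label=1, files=list(all_files))
    node = root
    while len(node.files) > 1 and sum(clicks.get(f, 0) for f in node.files) >= threshold:
        # Steps 2.1 / 2.3: split the files of the current growth node into two clusters.
        g0, g1 = split_files_by_feature(node.files, features)
        if not g0 or not g1:                      # degenerate split: stop growing
            break
        node.left = TreeNode(user_type, node.depth + 1, 2 * node.label - 1, files=g0)
        node.right = TreeNode(user_type, node.depth + 1, 2 * node.label, files=g1)
        # Step 2.2: the child with the larger click count becomes the next growth node.
        c0 = sum(clicks.get(f, 0) for f in g0)
        c1 = sum(clicks.get(f, 0) for f in g1)
        node = node.left if c0 >= c1 else node.right
    return root
```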
Beneficial effects of the invention: first, by exploiting the contextual features of the users, the invention effectively solves the cold-start problem of previous caching methods; moreover, the Monte Carlo tree search method adopted in the invention can handle network-scale big data well and is better suited to the demands of practical communication environments.
Description of the drawings
Fig. 1 is a schematic diagram of the division of the user feature space;
Fig. 2 is a schematic diagram of the binary tree structure in the invention;
Fig. 3 is a schematic diagram of the division of the file features;
Fig. 4 is a schematic diagram of the optimal path search on the binary tree;
Fig. 5 is a schematic diagram of the backtracking update on the binary tree;
Fig. 6 is the flow chart of the wireless caching method of the invention.
Specific implementation mode
While accounting for the multi-armed bandit characteristics, the invention also exploits the relationship between a parent node and its child nodes in the binary tree. The reward upper bound B_t(a_i, h, n) of tree node (a_i, h, n) up to time t is therefore defined as follows: when node (a_i, h, n) is a leaf node, B_t(a_i, h, n) = U_t(a_i, h, n); when T_t(a_i, h, n) = 0, B_t(a_i, h, n) = E_max, where E_max denotes the maximum reward value at the current moment; in the remaining cases, B_t(a_i, h, n) = min{ U_t(a_i, h, n), max[ B_t(a_i, h+1, 2n−1), B_t(a_i, h+1, 2n) ] }.
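The recursive definition above can be illustrated with the following sketch. The depth-and-confidence form of U_t used here follows the reward value given for each node earlier in the description and is an assumed reconstruction; E_MAX plays the role of E_max.

```python
# Illustrative sketch of the reward value U and its upper bound B for a tree node.
import math

E_MAX = float("inf")   # maximum reward value at the current moment (infinity in the embodiment)

def U_value(node, t, c=1.0, l1=1.0, rho=0.5):
    """Average reward plus a confidence term plus a depth-dependent term (assumed form)."""
    if node.times_used == 0:
        return E_MAX
    return (node.avg_reward
            + math.sqrt(c * math.log(t) / node.times_used)
            + l1 * rho ** node.depth)

def B_value(node, t):
    """Reward upper bound B_t(a_i, h, n) following the recursion described above."""
    if node.times_used == 0:
        return E_MAX                       # node never used
    if node.is_leaf():
        return U_value(node, t)            # leaf node
    return min(U_value(node, t),
               max(B_value(node.left, t), B_value(node.right, t)))
```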
The steps of the optimal path search on the tree Γ^{a_i} in the invention are as follows (a code sketch of this search is given after step 4):
Step 1: initialize the optimal path Path = (a_i, 0, 1) and the starting point of the current optimal path (a_i, h, n) = (a_i, 0, 1).
Step 2: iterative judgment: if the starting point (a_i, h, n) of the current optimal path is not a leaf node, execute step 3; otherwise, execute step 4.
Step 3: if B_t(a_i, h+1, 2n) ≥ B_t(a_i, h+1, 2n−1), update the starting point of the current optimal path to (a_i, h, n) = (a_i, h+1, 2n), add the tree node (a_i, h+1, 2n) to the optimal path, i.e., Path = Path ∪ (a_i, h+1, 2n), and return to step 2; if B_t(a_i, h+1, 2n) < B_t(a_i, h+1, 2n−1), update the starting point of the current optimal path to (a_i, h, n) = (a_i, h+1, 2n−1), add the tree node (a_i, h+1, 2n−1) to the optimal path, i.e., Path = Path ∪ (a_i, h+1, 2n−1), and return to step 2.
Step 4: output the optimal path Path and the starting point (a_i, h, n) of the current optimal path; this starting point is the unique leaf node on the optimal path.
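A compact sketch of this descent is given below: starting from the root, it always moves to the child with the larger reward upper bound B (ties are broken toward one child here, whereas the description selects a path at random) and returns the unique leaf node on the optimal path together with the path itself.

```python
def best_leaf(root, t):
    """Optimal path search: descend from the root, always toward the child with larger B."""
    path = [root]                       # Path is initialized to the root (a_i, 0, 1)
    node = root
    while not node.is_leaf():
        b_left, b_right = B_value(node.left, t), B_value(node.right, t)
        node = node.right if b_right >= b_left else node.left
        path.append(node)
    return node, path                   # the unique leaf node on the optimal path, and the path
```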
To describe the optimal path search more clearly, Fig. 4 illustrates the process of performing the optimal path search on the binary tree of Fig. 2.
The steps of the backward update along the optimal path on the tree Γ^{a_i} in the invention are as follows (a code sketch of this update is given after step 7):
Step 1: in the tree Γ^{a_i}, find the optimal path Path and the unique leaf node (a_i, h_max, n) on the optimal path, where h_max is the maximum depth of the tree Γ^{a_i} at the current time. The iteration count is initialized to 1, and the iteration starting point is the leaf node (a_i, h_max, n). The maximum number of iterations is h_max.
Step 2: when the iteration count is k, the node to be updated is the node (a_i, h, n*) on the optimal path at depth h = h_max − k. Count the number of requests at time t for the files of this node that are cached, and take the sum of these counts as the caching reward r_t(a_i, h, n*) of the node at this moment, i.e., the sum of the request counts d_{j,m} over the cached files m that belong to node (a_i, h, n*).
Step 3: update the actual average reward of the node using the per-slot caching reward r_t(a_i, h, n*) and the number of times the node has been used.
Step 4: update the number of times the node has been used in the caching process: T_t(a_i, h, n*) = T_{t−1}(a_i, h, n*) + 1.
Step 5: according to Definition 5, update the caching reward U_t(a_i, h, n*) of the node.
Step 6: according to Definition 6, update the upper bound B_t(a_i, h, n*) of the caching reward of the node.
Step 7: set k = k + 1; if k > h_max, the iteration terminates and the backward update of the tree Γ^{a_i} ends; otherwise, execute step 2.
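The backward update along the optimal path can be sketched as follows. The per-slot caching reward of a node is taken as the sum of the observed request counts for the cached files belonging to the node, and the running-average update is an assumed form of the "actual average reward" update of step 3.

```python
def backtrack_update(path, cache_set, demands, t):
    """Backward update along the optimal path (leaf first, root last)."""
    cached = set(cache_set)
    for node in reversed(path):
        # Per-slot caching reward: requests at time t for this node's cached files.
        r = sum(demands.get(f, 0) for f in node.files if f in cached)
        # Update the usage counter and the running average of the reward.
        node.times_used += 1
        node.avg_reward += (r - node.avg_reward) / node.times_used
        # U and B are recomputed on demand from these fields (see U_value / B_value above).
```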
To describe the backtracking update more clearly, Fig. 5 illustrates the process of the backtracking update performed along the optimal path of Fig. 4.
The leaf expansion threshold η_h(t) of the invention is a function of the current time t and the node depth h. The steps of the leaf node expansion are as follows (a code sketch of this expansion is given after step 5):
Step 1: the maximum number of iterations is |Λ_{a(t)}|, i.e., the number of elements in the set Λ_{a(t)}. The iteration count is initialized to 1.
Step 2: when the iteration count is i, compute the expansion threshold η_h(t) of the tree Γ^{a_i}.
Step 3: if T_t(a_i, h, n) ≥ η_h(t) and (a_i, h, n) is a leaf node of the tree Γ^{a_i}, expand this leaf node, i.e., update the structure of the tree Γ^{a_i} by growing its two child nodes (a_i, h+1, 2n−1) and (a_i, h+1, 2n);
at the same time, set the rewards of node (a_i, h+1, 2n−1) and node (a_i, h+1, 2n) to their initial value E_max.
Step 4: update the iteration count, i = i + 1.
Step 5: if i > |Λ_{a(t)}|, the iteration terminates; otherwise, execute step 3.
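A sketch of the expansion step is given below. The threshold function eta(h, t) is a hypothetical placeholder for the expansion threshold η_h(t), whose exact expression is not reproduced here, and the files of an expanded leaf are again split by the assumed clustering helper; since B_value returns E_max for nodes that have never been used, the two new children automatically start with reward upper bound E_max.

```python
def maybe_expand(root, t, features, eta):
    """Expand every leaf whose usage count has reached the expansion threshold eta(h, t)."""
    stack = [root]
    while stack:
        node = stack.pop()
        if node.is_leaf():
            if node.times_used >= eta(node.depth, t) and len(node.files) > 1:
                g0, g1 = split_files_by_feature(node.files, features)
                if g0 and g1:
                    node.left = TreeNode(node.user_type, node.depth + 1,
                                         2 * node.label - 1, files=g0)
                    node.right = TreeNode(node.user_type, node.depth + 1,
                                          2 * node.label, files=g1)
                    # New children start unused, so their upper bound is E_max by definition.
        else:
            stack.extend([node.left, node.right])
```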
The technical solution of the invention is described in detail below with reference to a specific embodiment. This should not be interpreted as limiting the scope of the invention to the embodiment below; any technique implemented on the basis of the content of the invention falls within the scope of the invention.
The data used in the specific embodiment of the invention are introduced first. The data come from a database named MovieLens and consist of 1,000,209 ratings given by 6,040 users to 3,952 movies between 2000 and 2003. In the invention, each rating of a movie by a user is regarded as a cache request of that user for the movie.
Next, according to the practical situation, the parameters of the specific embodiment of the invention are initialized as follows:
The number of time slots T is set to 8760, with one hour between consecutive time slots. Only age and gender are considered as user context features, namely adult and teenager, male and female, so the user feature space Α_T is divided into m_T = 4 user sub-feature spaces. The movie features are partitioned into 10 feature classes by a latent-factor algorithm. The maximum capacity of the base station M is set to 200, i.e., at most 200 movies can be cached. The maximum caching reward of a tree node is E_max = ∞. Among the three constants, ρ is set to 0.5.
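Purely for illustration, the parameter settings of this embodiment might be collected as follows; the values of c and l_1 below are placeholders, since only ρ = 0.5 is fixed explicitly above.

```python
# Illustrative parameter setup for the MovieLens-based embodiment; c and l1 are placeholders,
# only rho = 0.5, T, m_T, M and E_max are fixed by the embodiment described above.
params = {
    "T": 8760,                 # number of hourly time slots
    "m_T": 4,                  # user sub-feature spaces: {adult, teenager} x {male, female}
    "n_file_features": 10,     # movie feature classes from the latent-factor partition
    "M": 200,                  # base-station cache capacity in files (movies)
    "E_max": float("inf"),     # maximum caching reward of a tree node
    "rho": 0.5,                # depth-decay constant
    "c": 1.0,                  # placeholder value for the confidence constant c
    "l1": 1.0,                 # placeholder value for the depth-term constant l_1
}
```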
Fig. 6 shows the flow chart of the method proposed by the invention, which comprises the following steps:
Step 1, division of the user context feature space: divide the user feature space Α_T into 4 user sub-feature spaces.
Step 2, initialization of the binary trees: at t = 1, initialize 4 binary trees Γ, where Γ^{a_i} denotes the binary tree of user feature space a_i, i = 1, 2, 3, 4.
Meanwhile, initialize the reward values of node (a_i, 1, 1) and node (a_i, 1, 2).
Step 3, at time t, first observe the number of users N(t) served by the base station, extract the context feature x(t) of each of them and vectorize it; the context feature of the j-th user is denoted x_j(t).
Step 4, according to the extracted user context features, each user is assigned to its own user type.
Step 5, if the j-th user belongs to type a_i, perform the optimal path search on the tree Γ^{a_i}. Repeat step 5 until all users served by the base station at the current time have been traversed.
Step 6, among the recommended cache files of all users, select the M files with the highest occurrence frequency and put them into the cache file set C of the current time, which can be expressed as C = {c_1(t), c_2(t), ..., c_M(t)}.
Step 7, count the number of requests of each user at time t for each file in the cache file set C, where the number of requests of the j-th user for file m of the cache file set C is denoted d_{j,m}, j = 1, 2, ..., N(t), m = 1, 2, ..., M.
Step 8, for the j-th user, on the tree Γ^{a_i} of its feature space a_i, the reward values of the nodes and the numbers of times they have been cached are updated by backtracking along the optimal path. Repeat step 8 until all users served by the base station at the current time have been traversed.
Step 9, from a(t) = (a_i(t)), i = 1, 2, ..., N(t), select the set Λ_{a(t)} of distinct user feature subspaces.
Step 10, for each feature subspace a_i in Λ_{a(t)}, judge whether a leaf node of the corresponding tree Γ^{a_i} should be expanded, until all trees over the feature subspaces in Λ_{a(t)} have been traversed.
Step 11, if t < 8760, set t = t + 1 and return to step 3; otherwise, exit the loop.

Claims (6)

1. A wireless caching method aided by Monte Carlo tree search, the method comprising:
Step 1: according to the user context features, dividing the feature space into m_T user sub-feature spaces;
Step 2: at t = 1, initializing m_T binary trees Γ, one binary tree for each sub-feature space, where Γ^{a_i} denotes the binary tree of user sub-feature space a_i, i = 1, 2, ..., m_T; meanwhile, initializing the reward values of node (a_i, 1, 1) and node (a_i, 1, 2); wherein (a_i, 0, 1) denotes the root node of the binary tree of user sub-feature space a_i, (a_i, 1, 1) denotes the 1st node of the 1st generation of this binary tree, and (a_i, 1, 2) denotes the 2nd node of the 1st generation of this binary tree;
Step 3: at time t, obtaining the number N(t) of all users of this base station and extracting the context feature of each of them, wherein the context feature of the j-th user is denoted x_j(t);
Step 4: according to the current context feature of each user, assigning each user to the corresponding user sub-feature space;
Step 5: if the j-th user belongs to user sub-feature space a_i, performing an optimal path search on the tree Γ^{a_i} to obtain the end leaf node with the highest reward value for the j-th user, selecting a path at random when reward values are identical, and taking all files in that leaf node as the recommended cache files of the j-th user at time t; repeating step 5 until all users of the base station at the current time have been traversed;
Step 6: among the recommended cache files of all users, selecting the M files with the highest occurrence frequency and putting them into the cache file set C of the current time;
Step 7: counting the number of requests of each user at time t for each file in the cache file set C, wherein the number of requests of the j-th user for file m of the cache file set C is denoted d_{j,m}, j = 1, 2, ..., N(t), m = 1, 2, ..., M;
Step 8: for the j-th user, on the binary tree Γ^{a_i} of its feature space a_i, performing a path backtracking and updating the reward value of each node and the number of times each node has been used; repeating step 8 until all users have been traversed;
Step 9: for the tree Γ^{a_i} of each user sub-feature space a_i, judging whether a leaf node should be expanded, and if a leaf node needs to be expanded, growing the next generation of leaf nodes for that leaf node;
Step 10: repeating step 9 until the binary trees of all user sub-feature spaces have been traversed;
Step 11: setting t = t + 1 and returning to step 3.
2. The wireless caching method aided by Monte Carlo tree search according to claim 1, characterized in that in step 8 the reward value of each node is computed as follows: counting the number of times that the files of the node which are cached by the base station are requested by users at time t, and taking the sum of these counts as the caching reward r_t(a_i, h, n*) of the node at that moment; the reward of the node is then updated as the average of its per-slot caching rewards over the T_t(a_i, h, n*) slots in which it has been used, wherein T_t(a_i, h, n*) denotes the total number of times the tree node (a_i, h, n*) has been used up to time t, i.e., the total number of times up to time t that files of this node have been cached by the base station.
3. The wireless caching method aided by Monte Carlo tree search according to claim 1, characterized in that the reward value of each node is U_t(a_i, h, n) = μ̂_t(a_i, h, n) + √(c·ln t / T_t(a_i, h, n)) + l_1·ρ^h, wherein μ̂_t(a_i, h, n) is the average reward of the node, T_t(a_i, h, n) is the number of times the node has been used, h is the depth of the node, and c, l_1 > 0, 0 < ρ < 1 are constants.
4. The wireless caching method aided by Monte Carlo tree search according to claim 1, characterized in that the method in step 9 for judging whether a leaf node should be expanded is:
Step 1: computing the leaf expansion threshold η_h(t);
Step 2: if T_t(a_i, h, n) ≥ η_h(t) and (a_i, h, n) is a leaf node of the tree Γ^{a_i}, expanding this leaf node; otherwise, not expanding it.
5. A wireless caching method aided by Monte Carlo tree search, the method comprising:
Step 1: classifying all users connected to this base station according to their user context features;
Step 2: growing a binary tree for each user class according to its user context features, the binary tree being a classification index of all files served by this base station within a period of time, with finer classification of the files that users access more often;
Step 3: for each class of users, selecting an end node of its binary tree and taking the files contained in that node as the recommended files, wherein the selection criterion of the end node is that the click count of the files contained in the selected end node is higher than the click count of the files contained in the other end nodes;
Step 4: combining the file sets recommended for all classes of users and using the result as the cache files of this base station.
6. The wireless caching method aided by Monte Carlo tree search according to claim 5, characterized in that the method in step 2 for growing a binary tree for each class of users is:
Step 2.1: taking all files transmitted by this base station within a period of time as the root node of the binary tree, and using a clustering method to divide the files in the root node into two classes, which become the two child nodes;
Step 2.2: comparing the click counts of this class of users on the files contained in the two child nodes, and selecting the child node with the larger click count as the growth node;
Step 2.3: using the clustering method to divide the files contained in the growth node selected in step 2.2 into two classes, which become the next generation of child nodes, and selecting the growth node again by the method of step 2.2;
Step 2.4: continuing to grow in the same way as step 2.3 until the number of clicks by this class of users on the files contained in some growth node falls below a certain threshold.
CN201810599991.8A 2018-06-12 2018-06-12 Monte Carlo tree search-assisted wireless caching method Expired - Fee Related CN108810139B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810599991.8A CN108810139B (en) 2018-06-12 2018-06-12 Monte Carlo tree search-assisted wireless caching method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810599991.8A CN108810139B (en) 2018-06-12 2018-06-12 Monte Carlo tree search-assisted wireless caching method

Publications (2)

Publication Number Publication Date
CN108810139A true CN108810139A (en) 2018-11-13
CN108810139B CN108810139B (en) 2021-02-02

Family

ID=64085465

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810599991.8A Expired - Fee Related CN108810139B (en) 2018-06-12 2018-06-12 Monte Carlo tree search-assisted wireless caching method

Country Status (1)

Country Link
CN (1) CN108810139B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109982389A (en) * 2019-03-05 2019-07-05 电子科技大学 A kind of wireless caching method based on multiple target multi-arm fruit machine on-line study
CN110247953A (en) * 2019-05-13 2019-09-17 电子科技大学 A kind of wireless caching method of the multiple target on-line study based on super pareto efficient allocation
CN110262879A (en) * 2019-05-17 2019-09-20 杭州电子科技大学 A kind of Monte Carlo tree searching method explored and utilized based on balance

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103208041A (en) * 2012-01-12 2013-07-17 国际商业机器公司 Method And System For Monte-carlo Planning Using Contextual Information
US9497243B1 (en) * 2014-09-30 2016-11-15 Amazon Technologies, Inc. Content delivery
CN107301215A (en) * 2017-06-09 2017-10-27 北京奇艺世纪科技有限公司 A kind of search result caching method and device, searching method and device

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103208041A (en) * 2012-01-12 2013-07-17 国际商业机器公司 Method And System For Monte-carlo Planning Using Contextual Information
US9497243B1 (en) * 2014-09-30 2016-11-15 Amazon Technologies, Inc. Content delivery
CN107301215A (en) * 2017-06-09 2017-10-27 北京奇艺世纪科技有限公司 A kind of search result caching method and device, searching method and device

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
KAIYANG GUO et al.: "Caching in Base Station with Recommendation via Q-Learning", 2017 IEEE Wireless Communications and Networking Conference (WCNC) *
SABRINA MÜLLER et al.: "Context-Aware Proactive Content Caching With Service Differentiation in Wireless Networks", IEEE Transactions on Wireless Communications *
戚凯强 et al.: "Impact of the dynamics of content popularity distribution on base-station caching performance" (内容流行分布动态性对基站端缓存性能的影响), Journal of Signal Processing (信号处理) *
胡喜: "Design and implementation of a D2D collaborative streaming media service system" (D2D协同化流媒体服务系统设计与实现), China Master's Theses Full-text Database (中国优秀硕士学位论文全文数据库(电子期刊)) *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109982389A (en) * 2019-03-05 2019-07-05 电子科技大学 A kind of wireless caching method based on multiple target multi-arm fruit machine on-line study
CN110247953A (en) * 2019-05-13 2019-09-17 电子科技大学 A kind of wireless caching method of the multiple target on-line study based on super pareto efficient allocation
CN110247953B (en) * 2019-05-13 2022-03-15 电子科技大学 Wireless caching method for multi-target online learning based on super pareto principle
CN110262879A (en) * 2019-05-17 2019-09-20 杭州电子科技大学 A kind of Monte Carlo tree searching method explored and utilized based on balance
CN110262879B (en) * 2019-05-17 2021-08-20 杭州电子科技大学 Monte Carlo tree searching method based on balanced exploration and utilization

Also Published As

Publication number Publication date
CN108810139B (en) 2021-02-02

Similar Documents

Publication Publication Date Title
CN105979274B (en) The distributed caching laying method of dynamic self-adapting video stream media
CN107404530B (en) Social network cooperation caching method and device based on user interest similarity
CN108810139A (en) A kind of wireless caching method based on Monte Carlo tree search auxiliary
CN106230953B (en) A kind of D2D communication means and device based on distributed storage
CN105812834B (en) Video recommendations server, recommended method and pre-cache method based on clustering information
CN108667653B (en) Cluster-based cache configuration method and device in ultra-dense network
CN107295619B (en) Base station dormancy method based on user connection matrix in edge cache network
CN113115340B (en) Popularity prediction-based cache optimization method in cellular network
CN108777853B (en) Network edge caching method and system based on D2D
CN104065733A (en) Vehicle-mounted data push and paid play system based on position service
CN107734482B (en) The content distribution method unloaded based on D2D and business
CN110418367A (en) A kind of 5G forward pass mixture of networks edge cache low time delay method
Yan et al. Distributed edge caching with content recommendation in fog-rans via deep reinforcement learning
CN108521640B (en) Content distribution method in cellular network
CN103781115A (en) Distributed base station cache replacement method based on transmission cost in cellular network
CN116321307A (en) Bidirectional cache placement method based on deep reinforcement learning in non-cellular network
CN105263100A (en) Content information transmission method and device
CN106790638B (en) Name data transmission method and system based on active cache in data network
CN112887943A (en) Cache resource allocation method and system based on centrality
CN110139125B (en) Video sharing method based on demand perception and resource caching in wireless mobile network
CN107484105A (en) A kind of file multi-to-multi distribution method based on social networks
CN105447188A (en) Knowledge learning based peer-to-peer social network document retrieval method
CN106708923B (en) A kind of local cache sharing files method based on mobile collective intelligence network
CN101494833B (en) Method, device and system for sending network message
CN114245422A (en) Edge active caching method based on intelligent sharing in cluster

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20210202