[Summary of the Invention]
To overcome the above technical problems, the present invention aims to provide a data caching method, apparatus, and system, which solve the problems of increased network response latency and higher network transmission costs for mobile operators caused by the explosive growth of wireless data traffic.
To solve the above technical problems, embodiments of the present invention provide the following technical solutions:
In a first aspect, an embodiment of the present invention provides a data caching method, comprising the following steps:
receiving data sent by a user equipment or a content server;
determining whether the data is already cached;
if the data is not cached, caching the data based on a cache benefit algorithm.
In some embodiments, caching the data based on the cache benefit algorithm comprises:
storing the already-cached data in the form of a queue based on the LRU algorithm;
obtaining the cache benefit value of each item of cached data, and obtaining the cache benefit value of the data, wherein each item of cached data corresponds to one cache benefit value;
determining whether the in-network cache information of the data is known;
if the in-network cache information of the data is known, finding the item of cached data with the lowest cache benefit value;
deleting the item with the lowest cache benefit value from the queue, and shifting each remaining item of cached data one position toward the tail of the queue so that the head of the queue becomes empty;
storing the data at the head of the queue.
In some embodiments, caching the data based on the cache benefit algorithm further comprises:
when the in-network cache information of the data is determined to be unknown, determining whether the in-network cache information of the data at the tail of the queue of cached data is known;
if the in-network cache information of the tail-of-queue data is known, finding the item of cached data with the lowest cache benefit value;
deleting the item with the lowest cache benefit value from the queue, and shifting each remaining item of cached data one position toward the tail of the queue so that the head of the queue becomes empty;
storing the data at the head of the queue.
In some embodiments, caching the data based on the cache benefit algorithm further comprises:
if the in-network cache information of the tail-of-queue data is unknown, deleting the tail-of-queue data, and shifting each remaining item of cached data one position toward the tail of the queue so that the head of the queue becomes empty;
storing the data at the head of the queue.
In a second aspect, an embodiment of the present invention further provides a data caching apparatus, comprising:
a receiving module for receiving data sent by a user equipment or a content server;
a judging module for determining whether the data is already cached;
a caching module for caching the data based on a cache benefit algorithm if the data is not cached.
In some embodiments, the caching module comprises:
a first processing unit for storing the already-cached data in the form of a queue based on the LRU algorithm;
a first acquiring unit for obtaining the cache benefit value of each item of cached data and obtaining the cache benefit value of the data, wherein each item of cached data corresponds to one cache benefit value;
a first judging unit for determining whether the in-network cache information of the data is known;
a first search unit for finding the item of cached data with the lowest cache benefit value if the in-network cache information of the data is known;
a first deletion unit for deleting the item with the lowest cache benefit value from the queue and shifting each remaining item of cached data one position toward the tail of the queue so that the head of the queue becomes empty;
a first storage unit for storing the data at the head of the queue.
In some embodiments, the caching module further comprises:
a second judging unit for determining, when the in-network cache information of the data is determined to be unknown, whether the in-network cache information of the data at the tail of the queue of cached data is known;
a second search unit for finding the item of cached data with the lowest cache benefit value if the in-network cache information of the tail-of-queue data is known;
a second deletion unit for deleting the item with the lowest cache benefit value from the queue and shifting each remaining item of cached data one position toward the tail of the queue so that the head of the queue becomes empty;
a second storage unit for storing the data at the head of the queue.
In some embodiments, the caching module further comprises:
a third deletion unit for deleting the tail-of-queue data if the in-network cache information of the tail-of-queue data is unknown, and shifting each remaining item of cached data one position toward the tail of the queue so that the head of the queue becomes empty;
a third storage unit for storing the data at the head of the queue.
In a third aspect, an embodiment of the present invention further provides a data caching system, comprising:
a content server, an in-network cache router, a core network, and a base station, wherein
the content server is configured to respond to user requests and send the requested data;
the in-network cache router is configured to receive and store the data sent by the content server, and to send the data to the core network;
the core network is configured to receive the data sent by the content server, and to send the data to the base station;
the base station is configured to receive data sent by a user equipment or the content server and to determine whether the data is already cached; if the data is not cached, the base station caches the data based on a cache benefit algorithm.
In some embodiments, caching the data based on the cache benefit algorithm by the base station comprises:
storing the already-cached data in the form of a queue based on the LRU algorithm;
obtaining the cache benefit value of each item of cached data, and obtaining the cache benefit value of the data, wherein each item of cached data corresponds to one cache benefit value;
determining whether the in-network cache information of the data is known;
if the in-network cache information of the data is known, finding the item of cached data with the lowest cache benefit value;
deleting the item with the lowest cache benefit value from the queue, and shifting each remaining item of cached data one position toward the tail of the queue so that the head of the queue becomes empty;
storing the data at the head of the queue.
In the embodiments of the present invention, the data received by the base station is cached using a cache benefit algorithm. The cache benefit algorithm selects the caching strategy for the data to be cached mainly by determining whether the base station knows the in-network cache information of that data. By combining in-network caching with caching at the base station, the caching performance of the base station is improved, end-user network response latency is reduced, and the network transmission cost of the mobile operator is lowered.
[Detailed Description of the Invention]
To make the objects, technical solutions, and advantages of the present invention clearer, the present invention is further described below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are intended only to explain the present invention and not to limit it.
In addition, the technical features involved in the embodiments of the invention described below may be combined with one another as long as they do not conflict.
In the embodiments of the present invention, LRU (Least Recently Used) is a page replacement algorithm for memory management, used in virtual page storage management. The algorithm is based on the following observation: pages used frequently by recent instructions are likely to be used frequently by the next few instructions; conversely, pages that have not been used by recent instructions are unlikely to be used for a long period of time to come. Therefore, on each swap, it suffices to evict the least recently used page from memory.
In the embodiments of the present invention, the base station (Evolved Node B, eNB) is an evolved Node B, the name of a Long Term Evolution (LTE) base station. An eNB integrates part of the functions of the Radio Network Controller (RNC), reducing the number of protocol layers involved in communication. In the embodiments of the present invention, the radio access network is formed by a plurality of such eNBs.
It should be noted that the embodiments of the present invention are studied based on an LTE network. The LTE network, as part of the System Architecture Evolution (SAE), is a flat, all-IP network; this flat, all-IP architecture enables the LTE network to implement in-network caching of message-class content. Although the embodiments of the present invention are based on an LTE network, the proposed scheme is applicable not only to LTE networks but also to more advanced mobile systems, such as future 5G networks.
Embodiment one of the present invention provides a data caching method, the data caching process being performed by a base station eNB. Referring to Fig. 1, the data caching method comprises the following steps:
S101: receiving data sent by a user equipment or a content server.
The data sent by the user equipment is the data requested by the user, and the data sent by the content server is the data responding to the user request, such as web page content.
S102: determining whether the data is already cached.
S103: if the data is not cached, caching the data based on a cache benefit algorithm.
In the embodiments of the present invention, caching the data based on the cache benefit algorithm comprises: storing the already-cached data in the form of a queue based on the LRU algorithm; obtaining the cache benefit value of each item of cached data and obtaining the cache benefit value of the data, wherein each item of cached data corresponds to one cache benefit value; determining whether the in-network cache information of the data is known; if the in-network cache information of the data is known, finding the item of cached data with the lowest cache benefit value; deleting the item with the lowest cache benefit value from the queue, and shifting each remaining item of cached data one position toward the tail of the queue so that the head of the queue becomes empty; and storing the data at the head of the queue.
Alternatively, when the in-network cache information of the data is determined to be unknown, it is determined whether the in-network cache information of the data at the tail of the queue of cached data is known; if the in-network cache information of the tail-of-queue data is known, the item of cached data with the lowest cache benefit value is found; the item with the lowest cache benefit value is deleted from the queue, each remaining item of cached data is shifted one position toward the tail of the queue so that the head of the queue becomes empty, and the data is stored at the head of the queue.
Alternatively, if the in-network cache information of the tail-of-queue data is unknown, the tail-of-queue data is deleted, each remaining item of cached data is shifted one position toward the tail of the queue so that the head of the queue becomes empty, and the data is stored at the head of the queue.
It should be noted that if the data is already cached, the base station eNB performs a standard LRU operation in response to the data request of the user equipment.
In the embodiments of the present invention, the data received by the base station is cached using a cache benefit algorithm. The cache benefit algorithm selects the caching strategy for the data to be cached mainly by determining whether the base station knows the in-network cache information of that data. By combining in-network caching with caching at the base station, the caching performance of the base station is improved, end-user network response latency is reduced, and the network transmission cost of the mobile operator is lowered.
Embodiment two of the present invention provides another data caching method, which describes the data caching process of embodiment one above in detail. The process is performed by a base station eNB. Referring to Fig. 1, the data caching method comprises the following steps:
S101: receiving data sent by a user equipment or a content server.
The data sent by the user equipment is the data requested by the user, and the data sent by the content server is the data responding to the user request, such as web page content.
S102: determining whether the data is already cached.
S103: if the data is not cached, caching the data based on a cache benefit algorithm.
Referring to Fig. 2, in the embodiments of the present invention, caching the data based on the cache benefit algorithm comprises:
S1031: storing the already-cached data in the form of a queue based on the LRU algorithm.
Storing the cached data in the form of a queue based on the LRU algorithm means keeping the cached data in a linked list. The general process of caching and accessing data in this list is as follows: when new data needs to be cached, it is inserted at the head of the list; when an item of cached data is accessed, it is moved to the head of the list; and when the list is full, the data at the tail of the list is discarded.
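The list behavior described above can be sketched as follows. This is a minimal illustrative sketch, not the claimed implementation; the class name LRUQueue and the capacity parameter are assumptions introduced only for the example:

```python
from collections import OrderedDict

class LRUQueue:
    """Sketch of the linked-list queue described above: new data is inserted
    at the head, accessed data moves to the head, and when the list is full
    the tail item is discarded."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.items = OrderedDict()  # first key = head of list, last key = tail

    def access(self, key):
        if key in self.items:
            # accessed data moves to the head of the list
            self.items.move_to_end(key, last=False)
            return self.items[key]
        return None  # not cached

    def insert(self, key, value):
        if key in self.items:
            self.items[key] = value
            self.items.move_to_end(key, last=False)
            return
        if len(self.items) >= self.capacity:
            # list is full: discard the data at the tail
            self.items.popitem(last=True)
        self.items[key] = value
        # new data is inserted at the head of the list
        self.items.move_to_end(key, last=False)
```

Under this sketch, filling a capacity-2 queue with a, b, c evicts a, and accessing b promotes it back to the head.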
S1032: obtaining the cache benefit value of each item of cached data, and obtaining the cache benefit value of the data, wherein each item of cached data corresponds to one cache benefit value.
In the embodiments of the present invention, each item of data (both cached content and content to be cached) is assigned a cache benefit value, which mainly accounts for two factors, bandwidth and delay. The cache benefit value is defined as: BVj(ci) = dj(ci)·pj(ci)/s(ci), where ci denotes an item of cached data or data to be cached; dj(ci) denotes the delay of transmitting ci, the transmission path being from the location of ci (such as a router, or the content server providing ci) to the eNB; pj(ci) denotes the normalized popularity value of ci at the eNB, which reflects how frequently the cached data is requested (in the embodiments of the present invention, the popularity distribution of content objects is captured according to a power-law, Mandelbrot-Zipf, distribution, from which the popularity of each content object is computed); and s(ci) denotes the cache space required by ci.
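The benefit value definition above can be illustrated numerically as follows. This is a sketch under stated assumptions: the Mandelbrot-Zipf parameter names alpha and q and their values are illustrative and are not specified in the original text:

```python
def mzipf_popularity(ranks, alpha=0.8, q=5.0):
    """Normalized Mandelbrot-Zipf popularity over a content catalog:
    p(rank) is proportional to 1 / (rank + q)**alpha."""
    weights = [1.0 / (r + q) ** alpha for r in ranks]
    total = sum(weights)
    return [w / total for w in weights]

def benefit_value(delay, popularity, size):
    """Cache benefit value BVj(ci) = dj(ci) * pj(ci) / s(ci):
    delay of fetching ci, times its normalized popularity at the eNB,
    divided by the cache space ci requires."""
    return delay * popularity / size
```

As expected of a power-law distribution, the most popular rank receives a larger share than the least popular one, and the popularities sum to 1.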
S1033: determining whether the in-network cache information of the data is known.
In the embodiments of the present invention, determining whether the in-network cache information of the data is known means determining whether the eNB can learn the storage status of the data in the in-network cache; the storage status is one of two cases, cached in the network or not cached in the network. Here, in-network caching refers to routers in an information-centric network that are capable of caching data content. In-network caches are dispersed throughout the whole network and possess massive storage space, which can improve the transmission efficiency of content in the network, reduce communication delay, and save network bandwidth. In-network caching can therefore effectively supplement the cache of the eNB near the user terminal.
S1034: if the in-network cache information of the data is known, finding the item of cached data with the lowest cache benefit value.
That is, it has been determined that the eNB can learn the storage status of the data in the in-network cache. In this case, the linked list of cached data stored in the eNB is traversed to find the item with the lowest cache benefit value. It should be noted that the more in-network cache information is available to the eNB, the better the caching performance of the eNB.
S1035: deleting the item with the lowest cache benefit value from the queue, and shifting each remaining item of cached data one position toward the tail of the queue so that the head of the queue becomes empty.
In the embodiments of the present invention, the queue is the linked list storing the data. After the item with the lowest cache benefit value is deleted, one position in the list becomes vacant; starting from this vacancy, each remaining item of cached data is shifted in order one position toward the tail of the list, filling the vacancy and leaving the head of the list empty.
S1036: storing the data at the head of the queue.
That is, the data is stored at the now-empty head of the list.
On the basis of the above steps S1031-S1033, when the in-network cache information of the data is determined to be unknown, referring to Fig. 3, caching the data based on the cache benefit algorithm further comprises:
S1034': when the in-network cache information of the data is determined to be unknown, determining whether the in-network cache information of the data at the tail of the queue of cached data is known.
That is, it has been determined that the eNB cannot learn the storage status of the data in the in-network cache; in this case, it is further determined whether the eNB can learn the storage status of the tail-of-queue data in the in-network cache.
S1035': if the in-network cache information of the tail-of-queue data is known, finding the item of cached data with the lowest cache benefit value.
That is, the eNB can learn the storage status of the tail-of-queue data in the in-network cache; in this case, the linked list of cached data stored in the eNB is traversed to find the item with the lowest cache benefit value.
S1036': deleting the item with the lowest cache benefit value from the queue, and shifting each remaining item of cached data one position toward the tail of the queue so that the head of the queue becomes empty.
In the embodiments of the present invention, the queue is the linked list storing the data. After the item with the lowest cache benefit value is deleted, one position in the list becomes vacant; starting from this vacancy, each remaining item of cached data is shifted in order one position toward the tail of the list, filling the vacancy and leaving the head of the list empty.
S1037': storing the data at the head of the queue.
That is, the data is stored at the now-empty head of the list.
On the basis of the above step S1034', if the in-network cache information of the tail-of-queue data is unknown, the tail-of-queue data is deleted, each remaining item of cached data is shifted one position toward the tail of the queue so that the head of the queue becomes empty, and the data is stored at the head of the queue.
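The three branches of steps S1033-S1037' can be sketched together as one replacement routine. This is an illustrative reading of the described flow, not the claimed implementation; the class and function names are assumptions, and the queue is modeled as a Python list with index 0 as the head:

```python
class CacheItem:
    def __init__(self, key, benefit, in_net_known):
        self.key = key
        self.benefit = benefit            # cache benefit value BVj(ci)
        self.in_net_known = in_net_known  # in-network cache info known to the eNB?

def insert_with_benefit(queue, new_item):
    """Cache-benefit replacement: if the in-network cache information of the
    new data (or, failing that, of the tail-of-queue item) is known, evict
    the cached item with the lowest benefit value; otherwise fall back to
    evicting the tail item, as in plain LRU. queue[0] is the head."""
    if new_item.in_net_known or queue[-1].in_net_known:
        victim = min(queue, key=lambda item: item.benefit)
        queue.remove(victim)
    else:
        queue.pop()  # in-network info unknown in both cases: discard the tail
    # the remaining items implicitly occupy positions toward the tail;
    # the new data is stored at the now-empty head of the queue
    queue.insert(0, new_item)
    return queue
```

For example, if the new item's in-network status is known, the lowest-benefit cached item is evicted regardless of its position; if neither the new item's nor the tail item's status is known, the tail is evicted.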
In the embodiments of the present invention, the data received by the base station is cached using a cache benefit algorithm. The cache benefit algorithm selects the caching strategy for the data to be cached mainly by determining whether the base station knows the in-network cache information of that data. By combining in-network caching with caching at the base station, the caching performance of the base station is improved, end-user network response latency is reduced, and the network transmission cost of the mobile operator is lowered.
Embodiment three of the present invention provides a data caching apparatus corresponding to the above method embodiments. In this data caching apparatus, the data caching process is performed by a base station eNB. Referring to Fig. 4, the data caching apparatus 4 comprises:
a receiving module 41 for receiving data sent by a user equipment or a content server;
a judging module 42 for determining whether the data is already cached;
a caching module 43 for caching the data based on a cache benefit algorithm if the data is not cached.
In the embodiments of the present invention, the receiving module sends the received data to the judging module, and the caching module provides the specific data caching method according to the judgment result of the judging module.
It should be noted that since the data caching apparatus of this embodiment and the data caching method of embodiment one are based on the same inventive concept, the relevant technical content of the method embodiment applies equally to the apparatus embodiment and is not described in detail again here.
In the embodiments of the present invention, the data received by the base station is cached using a cache benefit algorithm. The cache benefit algorithm selects the caching strategy for the data to be cached mainly by determining whether the base station knows the in-network cache information of that data. By combining in-network caching with caching at the base station, the caching performance of the base station is improved, end-user network response latency is reduced, and the network transmission cost of the mobile operator is lowered.
Embodiment four of the present invention provides another data caching apparatus, which describes the data caching apparatus of embodiment three above in detail. In this data caching apparatus, the data caching process is performed by a base station eNB. Referring to Fig. 5, the data caching apparatus 5 comprises:
a receiving module 51 for receiving data sent by a user equipment or a content server;
a judging module 52 for determining whether the data is already cached;
a caching module 53 for caching the data based on a cache benefit algorithm if the data is not cached.
The caching module 53 comprises:
a first processing unit 531 for storing the already-cached data in the form of a queue based on the LRU algorithm;
a first acquiring unit 532 for obtaining the cache benefit value of each item of cached data and obtaining the cache benefit value of the data, wherein each item of cached data corresponds to one cache benefit value;
a first judging unit 533 for determining whether the in-network cache information of the data is known;
a first search unit 533a for finding the item of cached data with the lowest cache benefit value if the in-network cache information of the data is known;
a first deletion unit 533a1 for deleting the item with the lowest cache benefit value from the queue and shifting each remaining item of cached data one position toward the tail of the queue so that the head of the queue becomes empty;
a first storage unit 533a2 for storing the data at the head of the queue;
a second judging unit 533b for determining, when the in-network cache information of the data is determined to be unknown, whether the in-network cache information of the data at the tail of the queue of cached data is known;
a second search unit 533b1 for finding the item of cached data with the lowest cache benefit value if the in-network cache information of the tail-of-queue data is known;
a second deletion unit 533b2 for deleting the item with the lowest cache benefit value from the queue and shifting each remaining item of cached data one position toward the tail of the queue so that the head of the queue becomes empty;
a second storage unit 533b3 for storing the data at the head of the queue;
a third deletion unit 533b1' for deleting the tail-of-queue data if the in-network cache information of the tail-of-queue data is unknown, and shifting each remaining item of cached data one position toward the tail of the queue so that the head of the queue becomes empty;
a third storage unit 533b2' for storing the data at the head of the queue.
In the embodiments of the present invention, the receiving module sends the received data to the judging module, and the caching module provides the specific data caching method according to the judgment result of the judging module.
It should be noted that since the data caching apparatus of this embodiment and the data caching method of embodiment two are based on the same inventive concept, the relevant technical content of the method embodiment applies equally to the apparatus embodiment and is not described in detail again here.
In the embodiments of the present invention, the data received by the base station is cached using a cache benefit algorithm. The cache benefit algorithm selects the caching strategy for the data to be cached mainly by determining whether the base station knows the in-network cache information of that data. By combining in-network caching with caching at the base station, the caching performance of the base station is improved, end-user network response latency is reduced, and the network transmission cost of the mobile operator is lowered.
Embodiment five of the present invention provides a data caching system 60. Referring to Fig. 6, the system 60 comprises: a content server 601, an in-network cache router 602, a core network 603, and a base station 604.
The content server is configured to respond to user requests and send the requested data. The in-network cache router is configured to receive and store the data sent by the content server and to send the data to the core network; the in-network cache router comprises at least one router, on the basis of which content caching is performed. The core network is configured to receive the data sent by the content server and to send the data to the base station. The base station is configured to receive data sent by a user equipment or the content server and to determine whether the data is already cached; if the data is not cached, the base station caches the data based on a cache benefit algorithm.
In the embodiments of the present invention, the user equipment, the base station, and the core network together form the LTE network. The user equipment is the same as the user equipment used in the Universal Mobile Telecommunications System (UMTS) and the Global System for Mobile Communication (GSM): a mobile equipment (Mobile Equipment, ME) with network transmission capability. The base station provides the wireless communication function between the ME and the core network; the base station is an evolved eNB, and each eNB controls the user equipment in one or more cells. The core network includes a Serving Gateway (SGW) and a Public Data Network (PDN) Gateway (PGW). The PGW plays the same role in LTE as the Gateway GPRS Support Node (GGSN) and the Serving GPRS Support Node (SGSN) of the General Packet Radio Service (GPRS) technology do in UMTS and GSM.
In the embodiments of the present invention, the user equipment may connect to an eNB of the LTE network; the LTE network connects through the core network to the Internet (including the in-network cache router) and obtains data from the content server via the Internet. When the user equipment requests data from the network, the data passes in order through the content server, the Internet, and the LTE network, finally reaching the user equipment via the eNB. If the data is already cached by the eNB, the user equipment obtains the data directly from the eNB.
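The request path just described can be sketched as a simple lookup. This is an illustrative sketch only; the names serve_request, enb_cache, and fetch_from_network are assumptions introduced for the example:

```python
def serve_request(enb_cache, content_key, fetch_from_network):
    """If the eNB already caches the requested content, serve it locally;
    otherwise fetch it via the core network and Internet from the content
    server, and cache it at the eNB for subsequent requests."""
    if content_key in enb_cache:            # cache hit at the eNB
        return enb_cache[content_key]
    data = fetch_from_network(content_key)  # content server -> Internet -> LTE
    enb_cache[content_key] = data           # store at the eNB on the way back
    return data
```

A second request for the same content is then served directly from the eNB, without traversing the Internet again.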
In the embodiments of the present invention, caching the data based on the cache benefit algorithm by the base station comprises: storing the already-cached data in the form of a queue based on the LRU algorithm; obtaining the cache benefit value of each item of cached data and obtaining the cache benefit value of the data, wherein each item of cached data corresponds to one cache benefit value; determining whether the in-network cache information of the data is known; if the in-network cache information of the data is known, finding the item of cached data with the lowest cache benefit value; deleting the item with the lowest cache benefit value from the queue, and shifting each remaining item of cached data one position toward the tail of the queue so that the head of the queue becomes empty; and storing the data at the head of the queue.
In another embodiment, caching the data based on the cache benefit algorithm further comprises: when the in-network cache information of the data is determined to be unknown, determining whether the in-network cache information of the tail-of-queue data is known; if the in-network cache information of the tail-of-queue data is known, finding the item of cached data with the lowest cache benefit value; deleting the item with the lowest cache benefit value from the queue, and shifting each remaining item of cached data one position toward the tail of the queue so that the head of the queue becomes empty; and storing the data at the head of the queue.
In another embodiment, caching the data based on the cache benefit algorithm further comprises: if the in-network cache information of the tail-of-queue data is unknown, deleting the tail-of-queue data, shifting each remaining item of cached data one position toward the tail of the queue so that the head of the queue becomes empty, and storing the data at the head of the queue.
The embodiments of the present invention provide a data caching system in which the eNB can use in-network cache routing information to optimize its local cache, thereby improving the caching performance of the eNB, reducing end-user network response latency, and lowering the network transmission cost of the mobile operator.
The foregoing describes only preferred embodiments of the present invention and is not intended to limit the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall be included within the scope of protection of the present invention.