CN106201924A - Data caching method, apparatus and system - Google Patents


Info

Publication number
CN106201924A
CN106201924A (application CN201610575448.5A)
Authority
CN
China
Prior art keywords
data
queue
caching
cached
cache
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610575448.5A
Other languages
Chinese (zh)
Inventor
明中行
杨术
潘岱
成敏
陈仕科
Original Assignee
Shenzhen Ou Demeng Science And Technology Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Ou Demeng Science And Technology Ltd
Priority claimed from CN201610575448.5A
Publication of CN106201924A
Legal status: Pending


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 12/00: Accessing, addressing or allocating within memory systems or architectures
    • G06F 12/02: Addressing or allocation; Relocation
    • G06F 12/08: Addressing or allocation; Relocation in hierarchically structured memory systems, e.g. virtual memory systems
    • G06F 12/12: Replacement control
    • G06F 12/121: Replacement control using replacement algorithms
    • G06F 12/123: Replacement control using replacement algorithms with age lists, e.g. queue, most recently used [MRU] list or least recently used [LRU] list
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2212/00: Indexing scheme relating to accessing, addressing or allocation within memory systems or architectures
    • G06F 2212/10: Providing a specific technical effect
    • G06F 2212/1016: Performance improvement
    • G06F 2212/1024: Latency reduction
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2212/00: Indexing scheme relating to accessing, addressing or allocation within memory systems or architectures
    • G06F 2212/15: Use in a specific computing environment
    • G06F 2212/154: Networked environment

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mobile Radio Communication Systems (AREA)

Abstract

The present invention relates to the field of wireless communication technology, and in particular to a data caching method, apparatus and system. The data caching method includes: receiving data sent by a user equipment or a content server; determining whether the data is already cached; and, if the data is not cached, caching the data based on a caching benefit algorithm. By combining in-network caching with caching at the base station, the present invention improves the caching performance of the base station, reduces the network response delay experienced by end users, and lowers the network transmission cost of mobile operators.

Description

Data caching method, apparatus and system
[Technical Field]
The present invention relates to the field of wireless communication technology, and in particular to a data caching method, apparatus and system.
[Background]
With the rapid development of the mobile Internet, mobile data traffic is growing explosively. Forecasts suggest that wireless mobile data traffic will grow 40-fold in the coming years. This surge places immense pressure on mobile data network infrastructure and significantly increases the costs of mobile operators. In addition, the surge causes network congestion, so that end users receive messages with delay, which in turn severely degrades the user experience.
At present, some approaches address the above problems by adding infrastructure, for example by reducing the cell size of the cellular network to increase the spatial reuse of communication resources. Some approaches control the scale of traffic through commercial means; for example, some operators in the United States have begun to shift from flat-rate or time-based pricing to usage-based pricing strategies. Some approaches cache Web content in a distributed manner at locations close to the user side, and others rely on AP-based cooperative caching. In addition, Web content may be cached at an evolved Node B (eNodeB).
In the course of realizing the present invention, the inventors found that the prior art has at least the following problems: adding infrastructure raises costs; controlling the scale of traffic through commercial means does not fundamentally solve the shortage of mobile data network resources and may severely degrade the end-user experience; distributed caching of Web content can reduce the traffic consumption of mobile devices, but because the battery power and bandwidth of mobile devices are scarce resources, this approach faces great difficulty in motivating participants to share their limited power and bandwidth; and although caching Web content on an eNodeB is not constrained by device power and bandwidth, a standalone eNodeB cache has only a limited ability to relieve network congestion.
In view of this, overcoming the above-mentioned defects of the prior art is a problem demanding a prompt solution in the art.
[Summary of the Invention]
In order to overcome the above technical problems, an object of the present invention is to provide a data caching method, apparatus and system that solve the problems of increased network response delay and increased network transmission cost for mobile operators brought about by the explosive growth of wireless data network traffic.
To solve the above technical problems, embodiments of the present invention provide the following technical solutions:
In a first aspect, an embodiment of the present invention provides a data caching method, which comprises the following steps:
receiving data sent by a user equipment or a content server;
determining whether the data is already cached;
if the data is not cached, caching the data based on a caching benefit algorithm.
In some embodiments, caching the data based on the caching benefit algorithm includes:
storing the cached data in the form of a queue based on the least recently used (LRU) algorithm;
obtaining the caching benefit value of each item of cached data and the caching benefit value of the data, wherein each item of cached data corresponds to one caching benefit value;
determining whether the in-network cache information of the data is known;
if the in-network cache information of the data is known, finding the item of cached data with the lowest caching benefit value;
deleting the item with the lowest caching benefit value from the queue and moving each remaining item one position toward the tail of the queue, so that the head of the queue becomes empty;
storing the data at the head of the queue.
In some embodiments, caching the data based on the caching benefit algorithm further includes:
when it is determined that the in-network cache information of the data is unknown, determining whether the in-network cache information of the data at the tail of the queue is known;
if the in-network cache information of the tail-of-queue data is known, finding the item of cached data with the lowest caching benefit value;
deleting the item with the lowest caching benefit value from the queue and moving each remaining item one position toward the tail of the queue, so that the head of the queue becomes empty;
storing the data at the head of the queue.
In some embodiments, caching the data based on the caching benefit algorithm further includes:
if the in-network cache information of the tail-of-queue data is unknown, deleting the tail-of-queue data and moving each remaining item one position toward the tail of the queue, so that the head of the queue becomes empty;
storing the data at the head of the queue.
In a second aspect, an embodiment of the present invention further provides a data caching apparatus, including:
a receiving module, configured to receive data sent by a user equipment or a content server;
a judging module, configured to determine whether the data is already cached;
a caching module, configured to cache the data based on a caching benefit algorithm if the data is not cached.
In some embodiments, the caching module includes:
a first processing unit, configured to store the cached data in the form of a queue based on the LRU algorithm;
a first acquiring unit, configured to obtain the caching benefit value of each item of cached data and the caching benefit value of the data, wherein each item of cached data corresponds to one caching benefit value;
a first judging unit, configured to determine whether the in-network cache information of the data is known;
a first searching unit, configured to find the item of cached data with the lowest caching benefit value if the in-network cache information of the data is known;
a first deleting unit, configured to delete the item with the lowest caching benefit value from the queue and move each remaining item one position toward the tail of the queue, so that the head of the queue becomes empty;
a first storage unit, configured to store the data at the head of the queue.
In some embodiments, the caching module further includes:
a second judging unit, configured to determine, when the in-network cache information of the data is unknown, whether the in-network cache information of the data at the tail of the queue is known;
a second searching unit, configured to find the item of cached data with the lowest caching benefit value if the in-network cache information of the tail-of-queue data is known;
a second deleting unit, configured to delete the item with the lowest caching benefit value from the queue and move each remaining item one position toward the tail of the queue, so that the head of the queue becomes empty;
a second storage unit, configured to store the data at the head of the queue.
In some embodiments, the caching module further includes:
a third deleting unit, configured to delete the tail-of-queue data if the in-network cache information of the tail-of-queue data is unknown, and move each remaining item one position toward the tail of the queue, so that the head of the queue becomes empty;
a third storage unit, configured to store the data at the head of the queue.
In a third aspect, an embodiment of the present invention further provides a data caching system, including:
a content server, an in-network cache router, a core network and a base station, wherein
the content server is configured to respond to a user request and send the requested data;
the in-network cache router is configured to receive and store the data sent by the content server, and to forward the data to the core network;
the core network is configured to receive the data sent by the content server and to forward the data to the base station;
the base station is configured to receive data sent by a user equipment or the content server, determine whether the data is already cached, and, if the data is not cached, cache the data based on a caching benefit algorithm.
In some embodiments, the base station caching the data based on the caching benefit algorithm includes:
storing the cached data in the form of a queue based on the LRU algorithm;
obtaining the caching benefit value of each item of cached data and the caching benefit value of the data, wherein each item of cached data corresponds to one caching benefit value;
determining whether the in-network cache information of the data is known;
if the in-network cache information of the data is known, finding the item of cached data with the lowest caching benefit value;
deleting the item with the lowest caching benefit value from the queue and moving each remaining item one position toward the tail of the queue, so that the head of the queue becomes empty;
storing the data at the head of the queue.
In embodiments of the present invention, the data received by the base station is cached by means of a caching benefit algorithm. The algorithm determines whether the base station knows the in-network cache information of the data to be cached, and the base station selects the caching strategy for that data accordingly. By combining in-network caching with caching at the base station, the caching performance of the base station is improved, the network response delay experienced by end users is reduced, and the network transmission cost of mobile operators is lowered.
[Brief Description of the Drawings]
Fig. 1 is a flowchart of a data caching method provided by an embodiment of the present invention;
Fig. 2 is a flowchart of a method for caching the data based on a caching benefit algorithm provided by an embodiment of the present invention;
Fig. 3 is a flowchart of another method for caching the data based on a caching benefit algorithm provided by an embodiment of the present invention;
Fig. 4 is a structural block diagram of a data caching apparatus provided by an embodiment of the present invention;
Fig. 5 is a structural block diagram of another data caching apparatus provided by an embodiment of the present invention;
Fig. 6 is a schematic structural diagram of a data caching system provided by an embodiment of the present invention.
[Detailed Description]
In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention is further described below in conjunction with the drawings and embodiments. It should be understood that the specific embodiments described herein are intended only to explain the present invention and not to limit it.
In addition, the technical features involved in the embodiments of the invention described below may be combined with one another as long as they do not conflict.
In embodiments of the present invention, least recently used (LRU) is a page replacement algorithm for memory management, used for the storage management of virtual pages. The algorithm is based on the following fact: pages used frequently in the last few instructions are likely to be used frequently in the next few, and conversely, pages that have not been used for a long time are unlikely to be used in the near future. Therefore, on each swap it suffices to evict the least recently used page from memory.
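The LRU policy described above can be sketched in a few lines of Python. This is an illustrative sketch, not part of the patent; the class and method names are our own, and `collections.OrderedDict` stands in for the linked list, with the head of the queue at the front.

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU cache: most recently used at the front, evicted from the back."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.items = OrderedDict()  # key -> value, front = most recently used

    def get(self, key):
        if key not in self.items:
            return None
        self.items.move_to_end(key, last=False)  # touch: move to the front
        return self.items[key]

    def put(self, key, value):
        if key in self.items:
            self.items[key] = value
        else:
            if len(self.items) >= self.capacity:
                self.items.popitem(last=True)  # evict least recently used (back)
            self.items[key] = value
        self.items.move_to_end(key, last=False)  # new/updated data takes the front

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")            # "a" becomes most recent
cache.put("c", 3)         # evicts "b", the least recently used
print(list(cache.items))  # ['c', 'a']
```

The same structure (insert at the head, touch moves to the head, evict from the tail) underlies the queue used by the caching benefit algorithm in the embodiments below.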
In embodiments of the present invention, the base station (evolved Node B, eNB) is the base station of Long Term Evolution (LTE) technology. It integrates part of the functions of the Radio Network Controller (RNC), which reduces the number of protocol layers involved in communication. In the embodiments of the present invention, the radio access network is formed by a plurality of such eNBs.
It should be noted that the embodiments of the present invention are studied on the basis of an LTE network. The LTE network, as part of the System Architecture Evolution (SAE), is a flat, all-IP network; this flat, all-IP architecture enables the LTE network to realize in-network caching of message-level content. Although the embodiments of the present invention are based on an LTE network, the proposed scheme is not limited to LTE networks and may also be applied to more advanced mobile systems, such as future 5G networks.
Embodiment one of the present invention provides a data caching method. The data caching process is performed by a base station eNB. Referring to Fig. 1, the data caching method comprises the following steps:
S101: receiving data sent by a user equipment or a content server.
The data sent by the user equipment is the data requested by the user, and the data sent by the content server is the data responding to the user request, such as Web page content.
S102: determining whether the data is already cached.
S103: if the data is not cached, caching the data based on a caching benefit algorithm.
In embodiments of the present invention, caching the data based on the caching benefit algorithm includes: storing the cached data in the form of a queue based on the LRU algorithm; obtaining the caching benefit value of each item of cached data and the caching benefit value of the data, wherein each item of cached data corresponds to one caching benefit value; determining whether the in-network cache information of the data is known; if the in-network cache information of the data is known, finding the item of cached data with the lowest caching benefit value; deleting the item with the lowest caching benefit value from the queue and moving each remaining item one position toward the tail of the queue, so that the head of the queue becomes empty; and storing the data at the head of the queue.
Optionally, when it is determined that the in-network cache information of the data is unknown, whether the in-network cache information of the data at the tail of the queue is known is determined; if the in-network cache information of the tail-of-queue data is known, the item of cached data with the lowest caching benefit value is found; the item with the lowest caching benefit value is deleted from the queue and each remaining item is moved one position toward the tail of the queue, so that the head of the queue becomes empty; and the data is stored at the head of the queue.
Optionally, if the in-network cache information of the tail-of-queue data is unknown, the tail-of-queue data is deleted and each remaining item is moved one position toward the tail of the queue, so that the head of the queue becomes empty; the data is then stored at the head of the queue.
It should be noted that if the data is already cached, the base station eNB performs a standard LRU operation in response to the data request of the user equipment.
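The top-level flow of steps S101-S103, including the LRU touch on a cache hit, can be sketched as follows. This is an illustrative simplification, not the patent's definitive procedure: the function and parameter names are our own, and the two "unknown" sub-branches of the benefit algorithm are collapsed into a single `in_net_known` flag for brevity.

```python
def on_receive(queue, capacity, item, benefit, in_net_known):
    """Sketch of S101-S103. `queue` holds (item, benefit) pairs, head at index 0.
    `benefit` and `in_net_known` stand in for the benefit algorithm's inputs."""
    ids = [entry[0] for entry in queue]
    if item in ids:                          # S102: already cached -> standard LRU touch
        entry = queue.pop(ids.index(item))
        queue.insert(0, entry)
        return queue
    # S103: not cached -> benefit-based insertion
    if len(queue) >= capacity:
        if in_net_known:                     # evict the lowest-benefit entry
            queue.remove(min(queue, key=lambda e: e[1]))
        else:                                # fall back to evicting the tail
            queue.pop()
    queue.insert(0, (item, benefit))         # new data enters at the head
    return queue

q = []
q = on_receive(q, 2, "a", 3.0, True)
q = on_receive(q, 2, "b", 1.0, True)
q = on_receive(q, 2, "c", 2.0, True)         # evicts "b" (lowest benefit)
print([e[0] for e in q])                     # ['c', 'a']
```

Embodiment two below refines the miss path into the full three-branch procedure.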
In embodiments of the present invention, the data received by the base station is cached by means of a caching benefit algorithm. The algorithm determines whether the base station knows the in-network cache information of the data to be cached, and the base station selects the caching strategy for that data accordingly. By combining in-network caching with caching at the base station, the caching performance of the base station is improved, the network response delay experienced by end users is reduced, and the network transmission cost of mobile operators is lowered.
Embodiment two of the present invention provides another data caching method, which describes the data caching process of embodiment one in detail. The data caching process is performed by a base station eNB. Referring to Fig. 1, the data caching method comprises the following steps:
S101: receiving data sent by a user equipment or a content server.
The data sent by the user equipment is the data requested by the user, and the data sent by the content server is the data responding to the user request, such as Web page content.
S102: determining whether the data is already cached.
S103: if the data is not cached, caching the data based on a caching benefit algorithm.
Referring to Fig. 2, in embodiments of the present invention, caching the data based on the caching benefit algorithm includes:
S1031: storing the cached data in the form of a queue based on the LRU algorithm.
Storing the cached data in the form of a queue based on the LRU algorithm means keeping the cached data in a linked list. The general process of caching and accessing data in this list is as follows: when new data needs to be cached, it is inserted at the head of the list; when an item of cached data is accessed, that item is moved to the head of the list; and when the list is full, the data at the tail of the list is discarded.
S1032: obtaining the caching benefit value of each item of cached data and the caching benefit value of the data, wherein each item of cached data corresponds to one caching benefit value.
In embodiments of the present invention, each item of data (including cached content and content to be cached) is assigned a caching benefit value, which is considered mainly from the two angles of bandwidth and delay. The caching benefit value is defined as BV_j(c_i) = d_j(c_i) p_j(c_i) / s(c_i), where c_i denotes the cached data or the data to be cached; d_j(c_i) denotes the delay of transmitting c_i, the transmission path running from the location of c_i (for example, a router or the content server providing c_i) to the eNB; and p_j(c_i) denotes the normalized popularity value of c_i at the eNB, which reflects how frequently the cached data is requested. In embodiments of the present invention, the popularity distribution of content objects is captured according to a power-law (Mandelbrot-Zipf) distribution, from which the popularity of each content object is calculated. s(c_i) denotes the cache space required by c_i.
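The benefit value and the Mandelbrot-Zipf popularity can be computed as in the following sketch. The skew and plateau parameters (`alpha`, `q`) and all function names are illustrative assumptions, not values given in the patent.

```python
def mzipf_popularity(n, alpha=0.8, q=5.0):
    """Normalized Mandelbrot-Zipf popularity for content ranks 1..n.
    p(rank) is proportional to 1 / (rank + q)^alpha; alpha and q are assumed."""
    weights = [1.0 / (rank + q) ** alpha for rank in range(1, n + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def benefit_value(delay, popularity, size):
    """Caching benefit BV_j(c_i) = d_j(c_i) * p_j(c_i) / s(c_i)."""
    return delay * popularity / size

pops = mzipf_popularity(3)
# A popular, distant, small object yields a higher benefit than a rare, near, large one.
print(benefit_value(delay=40.0, popularity=pops[0], size=2.0) >
      benefit_value(delay=5.0, popularity=pops[2], size=8.0))  # True
```

The intuition matches the definition: caching pays off most for content that is expensive to fetch (large d), frequently requested (large p), and cheap to store (small s).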
S1033: determining whether the in-network cache information of the data is known.
In embodiments of the present invention, determining whether the in-network cache information of the data is known means determining whether the eNB knows the storage status of the data in the in-network cache; this storage status is either cached in the network or not cached in the network. Here, the in-network cache consists of routers with content-caching capability in an information-centric network. The in-network cache is distributed throughout the network and possesses massive storage space, which can improve the transmission efficiency of content in the network, reduce communication delay and save network bandwidth. The in-network cache can therefore serve as an effective complement to the cache of the eNB near the user terminal.
S1034: if the in-network cache information of the data is known, finding the item of cached data with the lowest caching benefit value.
That is, it is determined that the eNB can learn the storage status of the data in the in-network cache. In this case, the linked list of cached data stored in the eNB is traversed to find the item with the lowest caching benefit value. It should be noted that the more in-network cache information is available to the eNB, the better the caching performance of the eNB.
S1035: deleting the item with the lowest caching benefit value from the queue and moving each remaining item one position toward the tail of the queue, so that the head of the queue becomes empty.
In embodiments of the present invention, the queue is the linked list storing the data. After the item with the lowest caching benefit value is deleted, the list has one vacant position; starting from this vacancy, the remaining items toward the head of the list each move one position toward the tail in order, filling the vacancy and finally leaving the head of the list empty.
S1036: storing the data at the head of the queue.
Storing the data at the head of the queue means storing the data in the now-empty head position of the list.
On the basis of the above steps S1031-S1033, when it is determined that the in-network cache information of the data is unknown, referring to Fig. 3, caching the data based on the caching benefit algorithm further includes:
S1034': when it is determined that the in-network cache information of the data is unknown, determining whether the in-network cache information of the data at the tail of the queue is known.
That is, it is determined that the eNB cannot learn the storage status of the data in the in-network cache. In this case, it is further determined whether the eNB can learn the storage status of the tail-of-queue data in the in-network cache.
S1035': if the in-network cache information of the tail-of-queue data is known, finding the item of cached data with the lowest caching benefit value.
That is, it is determined that the eNB can learn the storage status of the tail-of-queue data in the in-network cache. In this case, the linked list of cached data stored in the eNB is traversed to find the item with the lowest caching benefit value.
S1036': deleting the item with the lowest caching benefit value from the queue and moving each remaining item one position toward the tail of the queue, so that the head of the queue becomes empty.
In embodiments of the present invention, the queue is the linked list storing the data. After the item with the lowest caching benefit value is deleted, the list has one vacant position; starting from this vacancy, the remaining items toward the head of the list each move one position toward the tail in order, filling the vacancy and finally leaving the head of the list empty.
S1037': storing the data at the head of the queue.
Storing the data at the head of the queue means storing the data in the now-empty head position of the list.
On the basis of step S1034', if the in-network cache information of the tail-of-queue data is unknown, the tail-of-queue data is deleted and each remaining item is moved one position toward the tail of the queue, so that the head of the queue becomes empty; the data is then stored at the head of the queue.
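The three insertion branches walked through above (S1034/S1035, S1034'/S1035', and the unknown-tail fallback) can be condensed into one function. This is an illustrative sketch under our own naming; the patent specifies the steps on a linked list, which a Python list of (content_id, benefit) pairs approximates here, head at index 0.

```python
def benefit_cache_insert(queue, capacity, new_item, known_in_net, tail_known_in_net):
    """Insert `new_item` = (content_id, benefit) into `queue` per steps S1031-S1037'."""
    if len(queue) < capacity:
        queue.insert(0, new_item)            # room available: insert at the head
        return queue
    if known_in_net or tail_known_in_net:
        # S1034/S1035': in-network info known -> evict the lowest-benefit entry
        victim = min(queue, key=lambda entry: entry[1])
        queue.remove(victim)
    else:
        # even the tail's in-network info is unknown -> simply drop the tail
        queue.pop()
    # the remaining entries implicitly shift toward the tail; new data takes the head
    queue.insert(0, new_item)
    return queue

q = [("a", 0.9), ("b", 0.2), ("c", 0.5)]
q = benefit_cache_insert(q, 3, ("d", 0.7), known_in_net=True, tail_known_in_net=False)
print([cid for cid, _ in q])  # ['d', 'a', 'c']  ("b" had the lowest benefit)
```

When no in-network information is available at all, the procedure degenerates to plain LRU tail eviction, which matches the patent's claim that in-network knowledge is what improves on standalone eNB caching.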
In embodiments of the present invention, the data received by the base station is cached by means of a caching benefit algorithm. The algorithm determines whether the base station knows the in-network cache information of the data to be cached, and the base station selects the caching strategy for that data accordingly. By combining in-network caching with caching at the base station, the caching performance of the base station is improved, the network response delay experienced by end users is reduced, and the network transmission cost of mobile operators is lowered.
Embodiment three of the present invention provides a data caching apparatus corresponding to the above method embodiments. In this apparatus, the data caching process is performed by a base station eNB. Referring to Fig. 4, the data caching apparatus 4 includes:
a receiving module 41, configured to receive data sent by a user equipment or a content server;
a judging module 42, configured to determine whether the data is already cached;
a caching module 43, configured to cache the data based on a caching benefit algorithm if the data is not cached.
In embodiments of the present invention, the receiving module sends the received data to the judging module, and the caching module provides the concrete data caching method according to the judgment result of the judging module.
It should be noted that, since the data caching apparatus of this embodiment and the data caching method of embodiment one are based on the same inventive concept, the relevant technical content of the method embodiment is equally applicable to the apparatus embodiment and is not described in detail again.
In embodiments of the present invention, the data received by the base station is cached by means of a caching benefit algorithm. The algorithm determines whether the base station knows the in-network cache information of the data to be cached, and the base station selects the caching strategy for that data accordingly. By combining in-network caching with caching at the base station, the caching performance of the base station is improved, the network response delay experienced by end users is reduced, and the network transmission cost of mobile operators is lowered.
The embodiment of the present invention four provides another kind of data buffer storage device, for filling the data buffer storage of above-described embodiment three Putting and be specifically described, in this data buffer storage device, the process of data buffer storage is performed by base station eNB, refer to Fig. 5, and these data are delayed Cryopreservation device 5 includes:
a receiver module 51, configured to receive data sent by a user equipment or a content server;
a judge module 52, configured to judge whether the data are already cached; and
a cache module 53, configured to cache the data based on a cache-benefit algorithm if the data are not cached.
The cache module 53 includes:
a first processing unit 531, configured to store the cached data in the form of a queue based on a least recently used (LRU) algorithm;
a first acquiring unit 532, configured to obtain the cache benefit value of each item of cached data and the cache benefit value of the received data, wherein each item of cached data corresponds to one cache benefit value;
a first judging unit 533, configured to judge whether the in-network cache information of the data is known;
a first searching unit 533a, configured to, if the in-network cache information of the data is known, search the cached data for the item with the minimum cache benefit value; a first deleting unit 533a1, configured to delete the item with the minimum cache benefit value from the queue, the remaining cached data each moving one position toward the tail of the queue so that the head of the queue becomes empty; and a first storage unit 533a2, configured to store the data at the head of the queue;
a second judging unit 533b, configured to, when the in-network cache information of the data is determined to be unknown, judge whether the in-network cache information of the tail-of-queue item of the cached data is known; a second searching unit 533b1, configured to, if the in-network cache information of the tail-of-queue item is known, search the cached data for the item with the minimum cache benefit value; a second deleting unit 533b2, configured to delete the item with the minimum cache benefit value from the queue, the remaining cached data each moving one position toward the tail of the queue so that the head of the queue becomes empty; and a second storage unit 533b3, configured to store the data at the head of the queue; and
a third deleting unit 533b1', configured to, if the in-network cache information of the tail-of-queue item of the cached data is unknown, delete the tail-of-queue item, the remaining cached data each moving one position toward the tail of the queue so that the head of the queue becomes empty; and a third storage unit 533b2', configured to store the data at the head of the queue.
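The routing among the three groups of deleting units can be sketched as follows. This is a minimal illustration under our own assumptions; the function name and return strings are placeholders, not terms from the patent:

```python
def select_eviction_rule(new_innet_known: bool, tail_innet_known: bool) -> str:
    """Choose which eviction rule the cache module applies to make room.

    new_innet_known:  whether the in-network cache information of the
                      newly received data is known to the base station.
    tail_innet_known: whether it is known for the tail-of-queue item.
    """
    if new_innet_known:
        # first searching/deleting units: evict the minimum-benefit item
        return "evict-min-benefit"
    if tail_innet_known:
        # second searching/deleting units: still evict by minimum benefit
        return "evict-min-benefit"
    # third deleting unit: fall back to plain LRU and evict the tail item
    return "evict-tail"
```

In other words, plain LRU tail eviction is used only when in-network cache information is unknown both for the new data and for the tail-of-queue item.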
In this embodiment of the present invention, the receiver module sends the received data to the judge module, and the cache module applies the specific data caching method according to the judgment result of the judge module.
It should be noted that, since the data caching apparatus of this embodiment and the data caching method of embodiment two are based on the same inventive concept, the relevant technical content of the method embodiment is equally applicable to the apparatus embodiment and is not described in detail again here.
In this embodiment of the present invention, the data received by the base station are cached mainly by means of a cache-benefit algorithm. The cache-benefit algorithm mainly judges whether the in-network cache information of the data to be cached is known to the base station, and the base station selects the caching manner for the data to be cached accordingly. By combining in-network caching with caching at the base station, the caching performance of the base station is improved, the network response delay experienced by end users is reduced, and the network transmission cost of the mobile operator is lowered.
Embodiment five of the present invention provides a data caching system 60. Referring to Fig. 6, the system 60 includes: a content server 601, an in-network cache router 602, a core network 603 and a base station 604.
The content server is configured to respond to and send the data requested by the user. The in-network cache router is configured to receive the data sent by the content server, store the data, and send the data to the core network, wherein the in-network cache router includes at least one router, and content caching is performed based on the router(s). The core network is configured to receive the data sent by the content server and send the data to the base station. The base station is configured to receive the data sent by the user equipment or the content server, judge whether the data are already cached, and, if the data are not cached, cache the data based on the cache-benefit algorithm.
In this embodiment of the present invention, the user equipment, the base station and the core network together form an LTE network. The user equipment is the same as the user equipment used in the Universal Mobile Telecommunications System (UMTS) and the Global System for Mobile Communication (GSM); it is a mobile equipment (ME) with a network transmission function. The base station provides the radio communication function between the ME and the core network; the base station is an evolved NodeB (eNB), and the eNB controls the user equipment in one or more cells. The core network includes a Serving Gateway (SGW), a Public Data Network (PDN) and a PDN Gateway (PGW). The PGW plays the same role as the Gateway GPRS Support Node (GGSN) and the Serving GPRS Support Node (SGSN) used in UMTS and GSM with General Packet Radio Service (GPRS) technology.
In this embodiment of the present invention, the user equipment may be connected to the eNB of the LTE network; the LTE network is connected through the core network to the Internet (which includes the in-network cache router), and obtains data from the content server through the Internet. When the user equipment requests data from the network, the data pass in order through the content server, the Internet and the LTE network, and finally reach the user equipment via the eNB. If the data have already been cached by the eNB, the user equipment obtains the data directly from the eNB.
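The request path just described can be sketched as follows. This is a simplified illustration; `enb_cache`, `serve_request` and `fetch_from_server` are placeholder names, not from the patent:

```python
def serve_request(enb_cache: dict, key: str, fetch_from_server):
    """Serve a user-equipment request: a cache hit at the eNB is answered
    locally; otherwise the request traverses the core network and the
    Internet to the content server, and the reply is relayed via the eNB."""
    if key in enb_cache:
        return enb_cache[key], "eNB cache"
    # miss: content server -> Internet -> LTE network -> eNB -> UE
    data = fetch_from_server(key)
    return data, "content server"
```

For example, a request for content already held by the eNB is answered without any traffic crossing the core network, which is the source of the cost and latency savings claimed above.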
In this embodiment of the present invention, caching the data by the base station based on the cache-benefit algorithm includes: storing the cached data in the form of a queue based on the least recently used algorithm; obtaining the cache benefit value of each item of cached data and the cache benefit value of the received data, wherein each item of cached data corresponds to one cache benefit value; judging whether the in-network cache information of the data is known; if the in-network cache information of the data is known, searching the cached data for the item with the minimum cache benefit value; deleting the item with the minimum cache benefit value from the queue, the remaining cached data each moving one position toward the tail of the queue so that the head of the queue becomes empty; and storing the data at the head of the queue.
In another embodiment, caching the data based on the cache-benefit algorithm further includes: when the in-network cache information of the data is determined to be unknown, judging whether the in-network cache information of the tail-of-queue item of the cached data is known; if the in-network cache information of the tail-of-queue item is known, searching the cached data for the item with the minimum cache benefit value; deleting the item with the minimum cache benefit value from the queue, the remaining cached data each moving one position toward the tail of the queue so that the head of the queue becomes empty; and storing the data at the head of the queue.
In another embodiment, caching the data based on the cache-benefit algorithm further includes: if the in-network cache information of the tail-of-queue item of the cached data is unknown, deleting the tail-of-queue item, the remaining cached data each moving one position toward the tail of the queue so that the head of the queue becomes empty; and storing the data at the head of the queue.
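Taken together, the three cases above amount to the following replacement procedure. This is a minimal sketch under our own assumptions (a fixed capacity, items held as `(key, benefit, innet_known)` tuples with the queue head at index 0); the patent does not prescribe a concrete data layout:

```python
class BenefitCache:
    """LRU-ordered queue whose eviction rule depends on whether
    in-network cache information is known (a sketch, not the patent's
    literal implementation)."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.queue = []  # (key, benefit, innet_known); head at index 0

    def insert(self, key, benefit, innet_known):
        if any(k == key for k, _, _ in self.queue):
            return  # already cached: nothing to do
        if len(self.queue) >= self.capacity:
            if innet_known or self.queue[-1][2]:
                # in-network info known for the new data, or for the
                # tail-of-queue item: evict the minimum-benefit item
                victim = min(range(len(self.queue)),
                             key=lambda i: self.queue[i][1])
                del self.queue[victim]
            else:
                # info unknown in both cases: plain LRU, evict the tail
                self.queue.pop()
        # remaining items shift one position toward the tail and the
        # new data is stored at the (now empty) head of the queue
        self.queue.insert(0, (key, benefit, innet_known))
```

For example, with capacity 2 and items of benefit 5 and 1 cached, inserting a third item whose in-network information is known evicts the benefit-1 item (the minimum-benefit item) rather than the LRU tail.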
An embodiment of the present invention provides a data caching system in which the eNB can optimize its local cache by using in-network caching information, thereby improving the caching performance of the eNB, reducing the network response delay experienced by end users, and lowering the network transmission cost of the mobile operator.
The above are merely preferred embodiments of the present invention and are not intended to limit the present invention. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present invention shall be included within the protection scope of the present invention.

Claims (10)

1. A data caching method, characterized in that the method includes:
receiving data sent by a user equipment or a content server;
judging whether the data are already cached; and
if the data are not cached, caching the data based on a cache-benefit algorithm.
2. The method according to claim 1, characterized in that caching the data based on the cache-benefit algorithm includes:
storing the cached data in the form of a queue based on a least recently used (LRU) algorithm;
obtaining the cache benefit value of each item of cached data and obtaining the cache benefit value of the data, wherein each item of cached data corresponds to one cache benefit value;
judging whether in-network cache information of the data is known;
if the in-network cache information of the data is known, searching the cached data for the item with the minimum cache benefit value;
deleting the item with the minimum cache benefit value from the queue, the remaining cached data each moving one position toward the tail of the queue so that the head of the queue becomes empty; and
storing the data at the head of the queue.
3. The method according to claim 2, characterized in that the method further includes:
when the in-network cache information of the data is determined to be unknown, judging whether in-network cache information of the tail-of-queue item of the cached data is known;
if the in-network cache information of the tail-of-queue item of the cached data is known, searching the cached data for the item with the minimum cache benefit value;
deleting the item with the minimum cache benefit value from the queue, the remaining cached data each moving one position toward the tail of the queue so that the head of the queue becomes empty; and
storing the data at the head of the queue.
4. The method according to claim 3, characterized in that the method further includes:
if the in-network cache information of the tail-of-queue item of the cached data is unknown, deleting the tail-of-queue item, the remaining cached data each moving one position toward the tail of the queue so that the head of the queue becomes empty; and
storing the data at the head of the queue.
5. A data caching apparatus, characterized in that the apparatus includes:
a receiver module, configured to receive data sent by a user equipment or a content server;
a judge module, configured to judge whether the data are already cached; and
a cache module, configured to cache the data based on a cache-benefit algorithm if the data are not cached.
6. The apparatus according to claim 5, characterized in that the cache module includes:
a first processing unit, configured to store the cached data in the form of a queue based on a least recently used (LRU) algorithm;
a first acquiring unit, configured to obtain the cache benefit value of each item of cached data and the cache benefit value of the data, wherein each item of cached data corresponds to one cache benefit value;
a first judging unit, configured to judge whether in-network cache information of the data is known;
a first searching unit, configured to, if the in-network cache information of the data is known, search the cached data for the item with the minimum cache benefit value;
a first deleting unit, configured to delete the item with the minimum cache benefit value from the queue, the remaining cached data each moving one position toward the tail of the queue so that the head of the queue becomes empty; and
a first storage unit, configured to store the data at the head of the queue.
7. The apparatus according to claim 6, characterized in that the cache module further includes:
a second judging unit, configured to, when the in-network cache information of the data is determined to be unknown, judge whether in-network cache information of the tail-of-queue item of the cached data is known;
a second searching unit, configured to, if the in-network cache information of the tail-of-queue item of the cached data is known, search the cached data for the item with the minimum cache benefit value;
a second deleting unit, configured to delete the item with the minimum cache benefit value from the queue, the remaining cached data each moving one position toward the tail of the queue so that the head of the queue becomes empty; and
a second storage unit, configured to store the data at the head of the queue.
8. The apparatus according to claim 7, characterized in that the cache module further includes:
a third deleting unit, configured to, if the in-network cache information of the tail-of-queue item of the cached data is unknown, delete the tail-of-queue item, the remaining cached data each moving one position toward the tail of the queue so that the head of the queue becomes empty; and
a third storage unit, configured to store the data at the head of the queue.
9. A data caching system, characterized in that the system includes: a content server, an in-network cache router, a core network and a base station, wherein
the content server is configured to respond to and send the data requested by a user;
the in-network cache router is configured to receive the data sent by the content server, store the data, and send the data to the core network;
the core network is configured to receive the data sent by the content server and send the data to the base station; and
the base station is configured to receive data sent by a user equipment or the content server, judge whether the data are already cached, and, if the data are not cached, cache the data based on a cache-benefit algorithm.
10. The system according to claim 9, characterized in that caching the data by the base station based on the cache-benefit algorithm includes:
storing the cached data in the form of a queue based on a least recently used (LRU) algorithm;
obtaining the cache benefit value of each item of cached data and obtaining the cache benefit value of the data, wherein each item of cached data corresponds to one cache benefit value;
judging whether in-network cache information of the data is known;
if the in-network cache information of the data is known, searching the cached data for the item with the minimum cache benefit value;
deleting the item with the minimum cache benefit value from the queue, the remaining cached data each moving one position toward the tail of the queue so that the head of the queue becomes empty; and
storing the data at the head of the queue.
CN201610575448.5A 2016-07-19 2016-07-19 A kind of data cache method, Apparatus and system Pending CN106201924A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610575448.5A CN106201924A (en) 2016-07-19 2016-07-19 A kind of data cache method, Apparatus and system

Publications (1)

Publication Number Publication Date
CN106201924A true CN106201924A (en) 2016-12-07

Family

ID=57491098

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610575448.5A Pending CN106201924A (en) 2016-07-19 2016-07-19 A kind of data cache method, Apparatus and system

Country Status (1)

Country Link
CN (1) CN106201924A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115442439A (en) * 2022-08-31 2022-12-06 云知声智能科技股份有限公司 Distributed cache cluster management method, system, terminal and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1387643A (en) * 1999-09-01 2002-12-25 奈克斯特威电信公司 Distributed cache for wireless communication system
CN103458467A (en) * 2012-06-05 2013-12-18 华为技术有限公司 Caching system, device and method applied to network
CN103514106A (en) * 2012-06-20 2014-01-15 北京神州泰岳软件股份有限公司 Method for caching data
CN103781115A (en) * 2014-01-25 2014-05-07 浙江大学 Distributed base station cache replacement method based on transmission cost in cellular network
CN104125607A (en) * 2013-04-23 2014-10-29 中兴通讯股份有限公司 User plane congestion processing method and device, and service gateway

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHONGXING MING, ET AL.: "InCan: In-network cache assisted eNodeB caching mechanism in 4G LTE networks", 《ELSEVIER》 *


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20190313

Address after: 518000, 1st Floor 107, Phase I Complex Building of Tsinghua Information Port, North District of Xili Street High-tech Zone, Nanshan District, Shenzhen City, Guangdong Province

Applicant after: Ming Zhongxing

Address before: 518052 High-tech Industrial Park, Nanshan District, Shenzhen City, Guangdong Province (North District) Tsinghua Information Port Complex Building 201

Applicant before: Shenzhen Ou Demeng Science and Technology Ltd.

RJ01 Rejection of invention patent application after publication

Application publication date: 20161207