CN107889160A - A kind of cell network edge part caching method for considering user's time delay - Google Patents
- Publication number
- CN107889160A (application CN201711132758.0A)
- Authority
- CN
- China
- Prior art keywords
- base station
- cache
- cached
- small cell
- user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W28/00—Network traffic management; Network resource management
- H04W28/02—Traffic management, e.g. flow control or congestion control
- H04W28/0278—Traffic management, e.g. flow control or congestion control using buffer status reports
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/50—Network services
- H04L67/56—Provisioning of proxy services
- H04L67/568—Storing data temporarily at an intermediate stage, e.g. caching
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/433—Content storage operation, e.g. storage operation in response to a pause request, caching operations
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W28/00—Network traffic management; Network resource management
- H04W28/02—Traffic management, e.g. flow control or congestion control
- H04W28/06—Optimizing the usage of the radio link, e.g. header compression, information sizing, discarding information
Abstract
The invention discloses a small cell network edge part caching method considering user time delay, belonging to the field of wireless communication technology. First, the alternative set of cached video files is collected and rearranged by popularity, and the corresponding popularity values and video file sizes are obtained. Then the maximum number K of video files that can be completely cached is calculated. Iteration is performed with the maximum inter-class variance method according to the maximum number K: video files smaller than or equal to the cache threshold S_c are completely cached, and video files larger than the cache threshold S_c are partially cached. When a user initiates a content request to the small cell base station, if the requested video file is completely cached, the user obtains the complete content from the small cell base station and the process ends; if it is partially cached or not cached, the content is requested from the content server in the core network through the backhaul link. The invention guarantees the access delay of users, improves the cache hit rate, raises the utilization of the cache space of the small base station, and effectively relieves the bandwidth pressure of the backhaul link.
Description
Technical Field
The invention relates to the technical field of wireless communication, in particular to a small cell network edge part caching method considering user time delay.
Background
According to a Cisco white paper forecast, in 2019 mobile data traffic accounted for about 75% of global data traffic, most of it carried by video; with the worldwide spread of intelligent terminals and the development of self-media, network video traffic tends to grow exponentially. Today's macro cellular network capacity can no longer be expanded to meet the ever-increasing traffic demand.
To solve this problem, Small Cell networks were introduced: under the coverage of a macro cellular network, dense deployment of small cell base stations greatly increases system capacity and relieves the traffic demand on the network. However, the weak small cell backhaul link weakens this advantage, so edge caching has become a research hotspot in recent years, since it can obtain a high return for the price of cheap storage resources. Edge caching lets a user obtain the desired network resource nearby, without passing through the weak backhaul link: on the one hand the user's request delay is greatly reduced, and on the other hand the transmission pressure on the backhaul link is relieved. Therefore, deploying a caching mechanism in the small cell base station, i.e. edge caching in the small cell system, can greatly improve the service performance of the small cell system.
Since storage capacity is still limited, each small cell base station node cannot cache all network videos, but only a very small part of them. According to the 80/20 rule, 20% of the video contents in the network carry 80% of the network traffic, so this 20% is the main caching object; but even so, the 20% of contents cannot all be cached, and researchers have therefore begun to study better cache configuration schemes to maximize caching efficiency.
Existing edge caching schemes are divided into two categories according to the caching nodes: centralized caching and distributed caching. The greatest advantage of centralized caching is its simplicity: all nodes cache the same content, but the caching efficiency is low because of the limited cache capacity. Distributed caching, in contrast, exploits the fact that a user is connected to several nodes and deploys different contents at different nodes within a certain range, thereby obtaining higher caching efficiency. The caching mode of each file can further be divided into complete caching and partial caching: complete caching stores each file in full at a node, while partial caching stores only a part of each file at the node.
In current research on wireless network edge caching, most scholars design edge caching schemes under the assumption that all files are of equal size, and most adopt the complete caching mode, that is, every cached file is stored in full. Researchers have also studied partial caching, in which only a part of each file is cached at the edge and the rest is obtained from the network center; the benefit of partial caching is clearly higher than that of complete caching. However, this work is still based on the assumption that all files are of equal size and therefore does not take into account the fact that files differ in size. In practice, the difference in file sizes affects the effectiveness of edge caching, whether complete or partial.
In summary, in the prior art, on the one hand the existing edge caching research does not consider the situation of unequal file sizes, and on the other hand the existing partial caching schemes do not provide the optimal caching percentage, so the benefit of edge caching does not reach its optimum. In an edge cache design, the cache hit rate depends not only on the popularity of each file but also on its size, and the caching benefit is related not only to the cache hit rate but also to the load placed on the backhaul link. One of the most important factors in the user experience is latency, so the design of an edge caching scheme cannot be decoupled from this factor.
Disclosure of Invention
Aiming at the problems in the prior art, and in order to maximize the caching benefit within the limited edge caching space of a small cell network while accounting for the influence of differing file sizes on the caching benefit, the invention provides a small cell network edge part caching method considering user time delay.
The method comprises the following specific steps:
step one, establishing a small cell network scene comprising a core network, a macro base station, a small cell base station and users covered around the small cell base station;
each small cell base station is used to deploy a content cache file.
Step two, setting the maximum capacity M of a cache region of a certain small cell base station and a cache video file alternative set of the small cell base station;
randomizing the sizes of all cached video files to obtain an alternative set {f_1, f_2, ..., f_n};
Step three, rearranging the alternative set of cached video files of the small base station in descending order of popularity as {f_h1, f_h2, ..., f_hn}, with corresponding popularity values {p_h1, p_h2, ..., p_hn} and corresponding video file sizes {S_h1, S_h2, ..., S_hn};
Step four, for the base station, on the premise of not exceeding the maximum capacity M of the cache region, calculating the maximum number K of video files which can be completely cached;
The total size of all completely cached video files satisfies S_h1 + S_h2 + ... + S_hK ≤ M.
Step five, under the condition of ensuring the time delay of the user, obtaining the optimal cache threshold S_c of the base station for each video file after the iteration times of the maximum inter-class variance method reach the maximum number K;
The method comprises the following specific steps:
firstly, aiming at a video requested to be played by a user, calculating conditions which need to be met when the user plays the video without initial delay or pause delay;
the specific conditions are as follows:
R_1 is the download rate from the small base station to the terminal equipment; M_p is the minimum amount of content that must be buffered on the terminal equipment before the video can start playing, and it satisfies M_p < min{S_hi} and S_c > M_p; S_c is the cache threshold for each video file; t_0 is the maximum initial delay acceptable to the user; R_2 is the download rate from the core network to the small base station; V is the playback rate of the video.
Then, calculating the cache threshold S_c according to the condition of satisfying the time delay, and obtaining a number of candidate results for the cache threshold S_c;
The constraint is as follows:
Finally, iterating over the cache threshold S_c results K times with the maximum inter-class variance method, and taking the cache threshold S_c corresponding to the maximum inter-class variance value as the optimal result;
When partially cached, the number of cached files satisfying S_hi > S_c is m, the number satisfying S_hi < S_c is n, and m + n = K;
Step six, according to the optimal cache threshold S_c, caching all the cached video files sorted by popularity: video files smaller than or equal to the cache threshold S_c are completely cached, while for video files larger than the cache threshold S_c the front portion of length S_c is selected and cached;
The number of completely cached video files is K.
Step seven, the user sends a content request to the small cell base station;
step eight, the small cell base station judges whether the request content is cached and is completely cached, if so, the user directly obtains the complete content from the small cell base station and ends; otherwise, entering the ninth step;
step nine, the small cell base station judges whether the request content is partially cached, if so, the step ten is carried out; otherwise, the requested content is not cached, and the user requests the content from the content server in the core network through the backhaul link.
Step ten, the user acquires the cached part of the content from the small cell base station, and simultaneously acquires the rest content from the content provider through the wireless backhaul link.
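As an illustrative sketch (not part of the claimed method), the request-handling decision of steps seven to ten can be written as follows, assuming a hypothetical placement list in which placement[i] is the cached length of file i:

```python
def serve_request(file_id, placement, sizes):
    """Decide how a requested file is served (steps seven to ten)."""
    cached_len = placement[file_id]
    if cached_len >= sizes[file_id]:
        # Step eight: fully cached, served entirely by the small cell base station.
        return "complete content from the small cell base station"
    if cached_len > 0:
        # Step ten: cached prefix served locally, remainder over the wireless backhaul.
        return "cached portion from the small cell plus remainder over the backhaul"
    # Step nine: not cached, fetched from the content server in the core network.
    return "entire content from the core network content server via the backhaul link"
```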
The invention has the advantages that:
1) The invention discloses a small cell network edge part caching method considering user time delay, which can ensure the access time delay of users, improve the cache hit rate and improve the utilization rate of the cache space of a small base station.
2) The small cell network edge part caching method considering the user time delay can effectively relieve the bandwidth pressure of a backhaul link.
Drawings
Figure 1 is a schematic diagram of a small cell network scenario established by the present invention;
fig. 2 is a flowchart of a small cell network edge portion caching method considering user delay according to the present invention;
FIG. 3 is a diagram illustrating caching of a cached video file according to an optimal caching threshold in accordance with the present invention;
FIG. 4 is a schematic diagram of a full cache and a partial cache of video files in forward, reverse, and random distribution;
FIG. 5 is a graph comparing the caching threshold of video files under three distributions with the number of cached files and the request hit rate, respectively;
FIG. 6 is a graph showing the effect of the buffer capacity and the average delay of the video files in ascending distribution when the complete buffer is compared with the optimal threshold buffer;
FIG. 7 is a graph showing the effect of the average delay on the buffer capacity of a video file in descending order distribution when the complete buffer is compared with the optimal threshold buffer;
fig. 8 is a graph of the effect of the buffer capacity and the average delay of randomly distributed video files in comparison between the full buffer and the optimal threshold buffer.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples.
Under the small cell network system model, the invention abandons the traditional practice of treating all files to be cached as equal in size: the file sizes are randomized, the alternative set of cache files of a small cell base station is determined according to the popularity of the video files, and both the differing file sizes and the user's delay sensitivity to video files have an important influence on the caching benefit. Simulation finds that the random distribution of file sizes causes a large difference in the cache space occupied by files of different popularity; in order to maximize the caching benefit and obtain the greatest benefit with the smallest cache space, the maximum inter-class variance method from the image-processing field is introduced to solve for the optimal cache threshold of each small cell base station. A new edge caching mode is adopted, namely, large files are partially cached according to the optimal cache threshold and small files are completely cached. Simulation verifies that the content partial caching scheme based on the optimal threshold achieves a higher caching benefit than complete caching.
As shown in fig. 2, the specific steps are as follows:
step one, establishing a small cell network scene comprising a core network, a macro base station, a small cell base station and users covered around the small cell base station;
the simulation scenario is established as shown in fig. 1, and each small cell base station has an edge cache capability to deploy a content cache file.
Step two, setting the maximum capacity M of a cache region of a certain small cell base station and a cache video file alternative set of the small cell base station;
randomizing the sizes of all cached video files at the network center to obtain the alternative set F = {f_1, f_2, ..., f_n}; M < F.
Step three, rearranging the alternative set of cached video files of the small base station in descending order of popularity as {f_h1, f_h2, ..., f_hn}, with corresponding popularity values {p_h1, p_h2, ..., p_hn} and corresponding video file sizes {S_h1, S_h2, ..., S_hn};
The video popularity distribution of the base station's users obeys the 80/20 rule: 10% of hot videos account for 80% of the video playback volume, while 90% of the video contents account for only 20% of the playback volume, so the 10% of hot contents are the object of edge base station caching. However, M is still smaller than the total size of this 10% of contents (M < 10%·F). The prior art concludes that distributed caching is better than centralized caching, so, to achieve better network efficiency, distributed base station caching within a macro cell range is the more effective way.
However, in order to focus on analyzing the influence of the size of each video file on the cache hit rate and to simplify the model, a centralized caching scheme is adopted here, and the alternative set of cached video files {f_1, f_2, ..., f_j, ..., f_n} is assumed to satisfy:
where S_j is the size of cached video file f_j.
There must exist 0 < λ < 1 such that
The file popularity within the alternative set of cached video files {f_1, f_2, ..., f_j, ..., f_n} accounts for more than 80%, and the set is rearranged in descending order of popularity as {f_h1, f_h2, ..., f_hn}, with corresponding popularity values {p_h1, p_h2, ..., p_hn} and corresponding video file sizes {S_h1, S_h2, ..., S_hn};
Let x_hj ∈ {0,1} indicate whether file f_hj is cached in the small cell base station cache; the cached files must then satisfy the following condition:
The cache hit rate at this time is expressed as:
It can be seen that the cache hit rate is higher when the cached video files are more popular and when more of them are cached.
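The capacity condition and the hit-rate expression above are images in the original and are not reproduced here; the sketch below assumes their usual forms, namely that the cached files must fit within M and that the hit rate is the popularity mass of the cached files over the popularity mass of the whole alternative set:

```python
def capacity_ok(sizes, cached, M):
    """Assumed placement condition: the cached files must fit in the cache capacity M."""
    return sum(s for s, x in zip(sizes, cached) if x) <= M

def cache_hit_rate(popularity, cached):
    """popularity: p_h1..p_hn (descending); cached: x_h1..x_hn, each 0 or 1."""
    total = sum(popularity)
    hit = sum(p for p, x in zip(popularity, cached) if x)
    return hit / total
```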
Step four, aiming at the base station, on the premise of not exceeding the maximum capacity M of the cache region, calculating the maximum number K of video files which can be completely cached;
In order to achieve a higher cache hit rate, the invention completely caches the smaller, more popular video files, and for the larger, less popular video files caches only a portion of the beginning of each file, with the remaining parts obtained from the core network; caching whole files together with file portions in this way can ensure a better user experience. However, the method must be realized on the premise of guaranteeing the quality of the user experience:
The total size of all completely cached video files satisfies S_h1 + S_h2 + ... + S_hK ≤ M; the maximum value K corresponds to a popularity value p_hK, which satisfies p_hK > p_hn;
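A minimal sketch of step four, under the assumption that K is simply the longest popularity-ordered prefix of the alternative set whose cumulative size still fits in the cache capacity M:

```python
def max_fully_cacheable(sizes_by_popularity, M):
    """Largest K such that S_h1 + ... + S_hK <= M."""
    used = 0.0
    K = 0
    for size in sizes_by_popularity:   # sizes already sorted by popularity: S_h1, S_h2, ...
        if used + size > M:
            break
        used += size
        K += 1
    return K
```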
Step five, under the condition of ensuring the time delay of the user, obtaining the optimal cache threshold S_c of the base station for each video file after the iteration times of the maximum inter-class variance method reach the maximum number K;
To increase the cache hit rate, it is necessary to balance the request hit rate and the byte hit rate, since a high request hit rate may come with a reduced byte hit rate. The setting of the cache threshold S_c is therefore crucial, and the problem of request delay also needs to be considered.
The delay is divided into two parts: the initial delay before the requested video starts playing, and the pause delay caused by network congestion and similar factors during playback.
The method comprises the following specific steps:
Firstly, for a video that the user requests to play, calculating the condition that needs to be met for the user to play the video without initial delay or pause delay;
When the front portion of a hot video file is cached in the small cell base station, the initial delay is mainly determined by the size of the portion of the file cached in the base station. Let the download rate from the small base station to the terminal equipment be R_1 and the download rate from the core network to the small base station be R_2. Suppose the video can start playing once content of size M_p has been buffered on the terminal equipment, where M_p < min{S_hi} and S_c > M_p. Let the maximum initial delay accepted by the user be t_0. The condition to be satisfied is:
For a file completely cached in the base station, since the playback rate of the video is assumed to be less than the download rate R_1, a video file completely cached in the base station will not incur a pause delay during the user's playback.
For a file of size S_hi (S_hi > S_c), of which only the front portion of length S_c is cached in the small cell, the following condition should be satisfied so that the user plays the content within the initial delay and without pause delay, where V is the playback rate of the video:
Then, calculating the cache threshold S_c according to the condition of satisfying the time delay, and obtaining a number of candidate results for the cache threshold S_c;
To ensure the quality of experience for the user, the constraint is as follows:
Finally, K iterations are carried out with the maximum between-class variance method, and the cache threshold S_c corresponding to the maximum between-class variance value is obtained as the optimal result;
When partially cached, the number of cached files satisfying S_hi > S_c is m, the number satisfying S_hi < S_c is n, and m + n = K;
The average of the K file sizes is selected as the initial cache threshold S_c for the loop iteration, and the S_c value that maximizes the above expression is calculated; then, from the obtained optimal cache threshold and the cache capacity of the small base station, the number of files N_2 that can be cached under the optimal cache threshold S_c is further calculated.
Step six, according to the optimal cache threshold S_c, designing the content partial caching method of the small cell base station;
Under the condition of limited cache capacity, completely caching a large video file occupies a large amount of storage space, which is very unfavorable to the utilization efficiency of the storage space. For these oversized video files, complete caching is not necessary: on the one hand, most users adopt a viewing mode of playing after offline caching, in which case the requirement on real-time transmission delay is lower; on the other hand, the playback rate of the video is lower than the downlink transmission rate of the small cell base station, so caching only a part of the video in the edge network is enough to guarantee the user's delay requirement. Moreover, for smaller video files the user's real-time requirement is relatively high. Therefore, a cache threshold is introduced for partial caching: when the size of a file is smaller than the cache threshold, the file is cached completely; when the size of a file is larger than the cache threshold, only the front portion of the file, of size equal to the threshold, is cached, as shown in fig. 3, where the shaded part is the cached part. Setting the cache threshold helps to cache more of the smaller and more popular video files, thereby obtaining a better caching benefit.
The method specifically comprises the following steps:
Caching all the cached video files in the alternative set in descending order of popularity: when the size of a video file is less than or equal to the cache threshold S_c, the video file is cached completely; otherwise, the front portion of the video file, of length equal to the optimal cache threshold S_c, is cached; the number of video files that can be completely cached is K.
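A minimal sketch of this placement rule; how files are treated once the remaining capacity no longer fits them is not spelled out in the text, so skipping such files is an assumption here:

```python
def place_contents(sizes_by_popularity, s_c, M):
    """Return the cached length of each file (0.0 means not cached)."""
    placement = []
    used = 0.0
    for size in sizes_by_popularity:          # walk the alternative set by popularity
        want = size if size <= s_c else s_c   # full file, or a prefix of length S_c
        if used + want > M:
            placement.append(0.0)             # assumed: skip files that no longer fit
            continue
        placement.append(want)
        used += want
    return placement
```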
Step seven, a user initiates a content request to the small cell base station, and the request process is assumed to follow a Poisson distribution;
step eight, the small cell base station judges whether the request content is cached and is completely cached, if so, the user directly obtains the complete content from the small cell base station and ends; otherwise, entering the ninth step;
Each small cell base station stores the local hotspot videos in its local small cell cache. When a small cell user requests content, if the small cell base station has cached the corresponding content, it directly serves the user; if it does not have the corresponding content, the small cell base station obtains the content from the content server at the network center in the core network, over the backhaul link via the macro base station.
Step nine, the small cell base station judges whether the request content is partially cached, if so, the step ten is carried out; otherwise, the request content is not cached, and the user requests the content from the content server in the core network through the backhaul link.
Step ten, the user acquires the cached part of the content from the small cell base station, and simultaneously acquires the rest content from the content provider through the wireless backhaul link.
Finally, the user's number of cache hits R, the total number of requests A, and the cache hit rate are counted.
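A sketch of this accounting, assuming requests are drawn independently in proportion to file popularity (the Poisson assumption in step seven governs arrival times rather than which file is requested) and that only completely cached files count towards the hit count R:

```python
import random

def simulate_requests(popularity, placement, sizes, A=10000, seed=0):
    """Count full-cache hits R over A requests; returns (R / A, partial-hit ratio)."""
    rng = random.Random(seed)
    file_ids = list(range(len(popularity)))
    R = 0
    partial = 0
    for _ in range(A):
        i = rng.choices(file_ids, weights=popularity, k=1)[0]
        if placement[i] >= sizes[i]:
            R += 1              # completely cached: served by the small cell alone
        elif placement[i] > 0:
            partial += 1        # prefix cached: remainder fetched over the backhaul
    return R / A, partial / A
```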
Different video file size distributions have an obvious influence on the benefit of edge caching. The invention distinguishes video files by size through the cache threshold, and introduces the maximum inter-class variance method into edge caching to solve for the optimal cache threshold for partial caching on the premise of guaranteeing the user's delay tolerance; by jointly adjusting the content caching strategy and the user's access network selection scheme, the corresponding minimum system energy consumption is calculated and the optimal caching benefit is obtained.
In order to prove that different file sizes have a large influence on the caching benefit, MATLAB simulations are carried out for different file size distributions once the popularity distribution of the alternative set of cached video files is determined, as shown in FIG. 4. The popularity of the video files is assumed to be uniformly distributed; the distribution in which more popular files are larger is the reverse-order distribution, and the distribution in which more popular files are smaller is the positive-order distribution. The meaning of the cache threshold is that each file caches only its front-segment content, no larger than the threshold. In the figure, the left side represents the state under complete caching, and the right side represents the partial caching state with a fixed threshold of 500 under the same file size distribution as the left side; the abscissa represents the files arranged by popularity, and the ordinate is the number of bytes of each file. It can be seen from the figure that the distribution of file sizes has a significant impact on caching efficiency, and that the impact on complete caching is greater than the impact on partial caching. Moreover, the effect of partial caching is more stable than that of complete caching, so adopting the partial caching mode is more meaningful, and caching performance cannot ignore the factor of file size.
The simulation scenario of the hot video content partial placement scheme is set as a common simulation network configuration of a small cell network: the cache capacity M is fixed at 20000 M, there are 2000 cache candidate files, the size of each file is a random value between 50 M and 1500 M, and the popularity follows a discrete 80/20 distribution; with the cache threshold set to 500 M, 1000 files of different sizes are randomly generated as the alternative set for caching.
The content placement scheme simulation parameters are shown in table 1:
TABLE 1
Parameter | Value
Small base station downlink rate | 1.024 (M/s)
Backhaul link downlink rate | 2.2 (M/s)
Playback rate | 0.72 (M/s)
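Tying the sketches above together, the simulation setup can be approximated as follows. The Zipf-style exponent is an assumption (the text only states a discrete 80/20 popularity law), and the helper functions are the hypothetical ones sketched earlier in this description:

```python
import random

def build_candidate_set(n_files=2000, size_range=(50.0, 1500.0), zipf_s=0.8, seed=0):
    """Candidate files with uniformly random sizes (in M) and rank-based popularity."""
    rng = random.Random(seed)
    sizes = [rng.uniform(*size_range) for _ in range(n_files)]            # 50 M .. 1500 M
    popularity = [1.0 / (rank + 1) ** zipf_s for rank in range(n_files)]  # assumed skewed 80/20-style law
    return sizes, popularity          # index 0 = most popular file

sizes, popularity = build_candidate_set()
M = 20000.0                                        # cache capacity of the small cell base station
K = max_fully_cacheable(sizes, M)
S_c = otsu_cache_threshold(sizes[:K])
placement = place_contents(sizes, S_c, M)
hit_rate, partial_rate = simulate_requests(popularity, placement, sizes)
print(K, round(S_c, 1), round(hit_rate, 3), round(partial_rate, 3))
```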
As shown in fig. 5, the lower the cache threshold, the higher the request hit rate of the cache, but the request hit rate is not the only factor determining the network performance, and it is also necessary to consider the byte hit rate, the user delay, and the backhaul link occupation. When the cache threshold value is too low, although the request hit rate is high, the byte hit rate is reduced, so that not only is the cache benefit reduced, but also the bandwidth crisis of the backhaul link cannot be effectively alleviated. The size of the caching threshold has a significant impact on the caching efficiency.
The average delay under different small base station cache capacities is shown in fig. 6, fig. 7 and fig. 8: the average delay for the user to obtain a video resource decreases as the cache capacity increases. The comparisons in the figures show that, with the partial caching method based on the optimal cache threshold, the average delay over all user accesses is lower than the average delay produced by the complete caching method; therefore the partial caching method based on the optimal cache threshold not only guarantees the delay requirement of user access, but also improves the cache hit rate.
Claims (3)
1. A small cell network edge part caching method considering user time delay is characterized by comprising the following specific steps:
step one, establishing a small cell network scene comprising a core network, a macro base station, a small cell base station and users covered around the small cell base station;
step two, setting the maximum capacity M of a cache region of a certain small cell base station and a cache video file alternative set of the small cell base station;
all buffer memoryRandomizing the size of the video file to obtain an alternative set of { f 1 ,f 2 ,....f n };
Step three, rearranging the alternative set of cached video files of the small base station in descending order of popularity as {f_h1, f_h2, ..., f_hn}, with corresponding popularity values {p_h1, p_h2, ..., p_hn} and corresponding video file sizes {S_h1, S_h2, ..., S_hn};
Step four, aiming at the base station, on the premise of not exceeding the maximum capacity M of the cache region, calculating the maximum number K of video files which can be completely cached;
step five, under the condition of ensuring the time delay of the user, obtaining the optimal cache threshold S of the base station for each video file after the iteration times of the maximum inter-class variance method reach the maximum number K c ;
The method comprises the following specific steps:
firstly, aiming at a video requested to be played by a user, calculating a condition which needs to be met when the user plays the video without initial delay or pause delay;
the specific conditions are as follows:
R_1 is the download rate from the small base station to the terminal equipment; M_p is the minimum amount of content that must be buffered on the terminal equipment before the video can start playing, and it satisfies M_p < min{S_hi} and S_c > M_p; S_c is the cache threshold for each video file; t_0 is the maximum initial delay acceptable to the user; R_2 is the download rate from the core network to the small base station; V is the playback rate of the video;
Then, calculating the cache threshold S_c according to the condition of meeting the time delay, and obtaining a number of candidate results for the cache threshold S_c;
The constraint is as follows:
Finally, iterating over the cache threshold S_c results K times with the maximum inter-class variance method, and taking the cache threshold S_c corresponding to the maximum inter-class variance value as the optimal result;
When partially cached, the number of cached files satisfying S_hi > S_c is m, the number satisfying S_hi < S_c is n, and m + n = K;
Step six, according to the optimal cache threshold S_c, caching all the cached video files sorted by popularity: video files smaller than or equal to the cache threshold S_c are completely cached, while for video files larger than the cache threshold S_c the front portion of length S_c is cached;
K video files are completely cached;
step seven, the user sends a content request to the small cell base station;
step eight, the small cell base station judges whether the request content is cached and is completely cached, if so, the user directly obtains the complete content from the small cell base station and ends; otherwise, entering the ninth step;
step nine, the small cell base station judges whether the request content is partially cached, if so, the step ten is carried out; otherwise, the request content is not cached, and the user requests the content from a content server in the core network through a backhaul link;
step ten, the user acquires the cached part of the content from the small cell base station, and simultaneously acquires the rest content from the content provider through the wireless backhaul link.
2. The small cell network edge part caching method considering user time delay as claimed in claim 1, characterized in that, in step four, the total size of all the completely cached video files satisfies S_h1 + S_h2 + ... + S_hK ≤ M.
3. The small cell network edge part caching method considering the user delay as claimed in claim 1, wherein in the fifth step, the delay is divided into two parts, one is an initial delay before the requested video playing, and the other is a pause delay caused by network congestion and the like in the video playing process.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711132758.0A CN107889160B (en) | 2017-11-15 | 2017-11-15 | Small cell network edge part caching method considering user time delay |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711132758.0A CN107889160B (en) | 2017-11-15 | 2017-11-15 | Small cell network edge part caching method considering user time delay |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107889160A true CN107889160A (en) | 2018-04-06 |
CN107889160B CN107889160B (en) | 2020-03-17 |
Family
ID=61777474
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711132758.0A Expired - Fee Related CN107889160B (en) | 2017-11-15 | 2017-11-15 | Small cell network edge part caching method considering user time delay |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107889160B (en) |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109218747A (en) * | 2018-09-21 | 2019-01-15 | 北京邮电大学 | Video traffic classification caching method in super-intensive heterogeneous network based on user mobility |
CN109889578A (en) * | 2019-01-23 | 2019-06-14 | 中南大学 | A kind of transmission method and system of cloud-side collaboration processing |
CN110049507A (en) * | 2019-05-05 | 2019-07-23 | 南京工程学院 | Optimal buffer resource distribution method in wireless content distribution network based on martingale theory |
CN110213627A (en) * | 2019-04-23 | 2019-09-06 | 武汉理工大学 | Flow medium buffer distributor and its working method based on multiple cell user mobility |
CN110417847A (en) * | 2019-01-09 | 2019-11-05 | 北京邮电大学 | The method and device of Communication Network for UAVS user access and content caching |
CN110536179A (en) * | 2019-06-28 | 2019-12-03 | 三星电子(中国)研发中心 | A kind of content distribution system and method |
CN111465057A (en) * | 2020-03-30 | 2020-07-28 | 北京邮电大学 | Edge caching method and device based on reinforcement learning and electronic equipment |
CN111752905A (en) * | 2020-07-01 | 2020-10-09 | 浪潮云信息技术股份公司 | Large file distributed cache system based on object storage |
CN112839082A (en) * | 2020-12-31 | 2021-05-25 | 西安电子科技大学 | Heterogeneous edge cache allocation method, system, medium and application |
CN112954383A (en) * | 2021-03-02 | 2021-06-11 | 山东省计算中心(国家超级计算济南中心) | VOD method, VOD proxy server, base station and storage medium |
CN113301145A (en) * | 2020-05-21 | 2021-08-24 | 北京航空航天大学 | Mobile edge cache placement method adopting request rate and dynamic property of information source issued content |
CN113497976A (en) * | 2020-03-19 | 2021-10-12 | 华为技术有限公司 | Multimedia data downloading method and electronic equipment |
CN114630183A (en) * | 2022-03-17 | 2022-06-14 | 东南大学 | Edge device caching method and evaluation method based on scalable coding |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2001020910A1 (en) * | 1999-09-14 | 2001-03-22 | Streaming21, Inc. | Method and apparatus for streaming data |
CN1610403A (en) * | 2004-11-16 | 2005-04-27 | 南京大学 | Method for realizing video requesting system based on protocol buffer storage |
CN101488967A (en) * | 2009-01-14 | 2009-07-22 | 深圳市同洲电子股份有限公司 | Video transmission method, embedded monitoring terminal and monitoring platform server |
CN103312776A (en) * | 2013-05-08 | 2013-09-18 | 青岛海信传媒网络技术有限公司 | Method and device for caching contents of videos by edge node server |
US20130263194A1 (en) * | 2010-12-03 | 2013-10-03 | Huawei Technologies Co., Ltd. | Cooperative caching method and apparatus |
CN103974097A (en) * | 2014-05-22 | 2014-08-06 | 南京大学镇江高新技术研究院 | Personalized user-generated video prefetching method and system based on popularity and social networks |
CN104641655A (en) * | 2013-04-07 | 2015-05-20 | 华为技术有限公司 | Terminal cache method, terminal and server |
CN104662528A (en) * | 2012-08-07 | 2015-05-27 | 谷歌公司 | Media content receiving device and distribution of media content utilizing social networks and social circles |
CN104967861A (en) * | 2015-05-27 | 2015-10-07 | 上海美琦浦悦通讯科技有限公司 | CDN video buffer system and method |
US20160142793A1 (en) * | 2004-11-12 | 2016-05-19 | Michael D. Abrams | Live Concert/Event Video System and Method |
CN106686399A (en) * | 2016-12-22 | 2017-05-17 | 陕西尚品信息科技有限公司 | Intra-network video buffering method based on combined buffering architecture |
-
2017
- 2017-11-15 CN CN201711132758.0A patent/CN107889160B/en not_active Expired - Fee Related
Patent Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2001020910A1 (en) * | 1999-09-14 | 2001-03-22 | Streaming21, Inc. | Method and apparatus for streaming data |
US20160142793A1 (en) * | 2004-11-12 | 2016-05-19 | Michael D. Abrams | Live Concert/Event Video System and Method |
CN1610403A (en) * | 2004-11-16 | 2005-04-27 | 南京大学 | Method for realizing video requesting system based on protocol buffer storage |
CN101488967A (en) * | 2009-01-14 | 2009-07-22 | 深圳市同洲电子股份有限公司 | Video transmission method, embedded monitoring terminal and monitoring platform server |
US20130263194A1 (en) * | 2010-12-03 | 2013-10-03 | Huawei Technologies Co., Ltd. | Cooperative caching method and apparatus |
CN104662528A (en) * | 2012-08-07 | 2015-05-27 | 谷歌公司 | Media content receiving device and distribution of media content utilizing social networks and social circles |
CN104641655A (en) * | 2013-04-07 | 2015-05-20 | 华为技术有限公司 | Terminal cache method, terminal and server |
CN103312776A (en) * | 2013-05-08 | 2013-09-18 | 青岛海信传媒网络技术有限公司 | Method and device for caching contents of videos by edge node server |
CN103974097A (en) * | 2014-05-22 | 2014-08-06 | 南京大学镇江高新技术研究院 | Personalized user-generated video prefetching method and system based on popularity and social networks |
CN104967861A (en) * | 2015-05-27 | 2015-10-07 | 上海美琦浦悦通讯科技有限公司 | CDN video buffer system and method |
CN106686399A (en) * | 2016-12-22 | 2017-05-17 | 陕西尚品信息科技有限公司 | Intra-network video buffering method based on combined buffering architecture |
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109218747A (en) * | 2018-09-21 | 2019-01-15 | 北京邮电大学 | Video traffic classification caching method in super-intensive heterogeneous network based on user mobility |
CN110417847A (en) * | 2019-01-09 | 2019-11-05 | 北京邮电大学 | The method and device of Communication Network for UAVS user access and content caching |
CN110417847B (en) * | 2019-01-09 | 2020-09-01 | 北京邮电大学 | Method and device for user access and content caching in unmanned aerial vehicle communication network |
CN109889578A (en) * | 2019-01-23 | 2019-06-14 | 中南大学 | A kind of transmission method and system of cloud-side collaboration processing |
CN110213627B (en) * | 2019-04-23 | 2020-09-01 | 武汉理工大学 | Streaming media cache allocation method based on multi-cell user mobility |
CN110213627A (en) * | 2019-04-23 | 2019-09-06 | 武汉理工大学 | Flow medium buffer distributor and its working method based on multiple cell user mobility |
CN110049507A (en) * | 2019-05-05 | 2019-07-23 | 南京工程学院 | Optimal buffer resource distribution method in wireless content distribution network based on martingale theory |
CN110536179A (en) * | 2019-06-28 | 2019-12-03 | 三星电子(中国)研发中心 | A kind of content distribution system and method |
CN113497976A (en) * | 2020-03-19 | 2021-10-12 | 华为技术有限公司 | Multimedia data downloading method and electronic equipment |
CN111465057B (en) * | 2020-03-30 | 2021-06-04 | 北京邮电大学 | Edge caching method and device based on reinforcement learning and electronic equipment |
CN111465057A (en) * | 2020-03-30 | 2020-07-28 | 北京邮电大学 | Edge caching method and device based on reinforcement learning and electronic equipment |
CN113301145A (en) * | 2020-05-21 | 2021-08-24 | 北京航空航天大学 | Mobile edge cache placement method adopting request rate and dynamic property of information source issued content |
CN111752905A (en) * | 2020-07-01 | 2020-10-09 | 浪潮云信息技术股份公司 | Large file distributed cache system based on object storage |
CN111752905B (en) * | 2020-07-01 | 2024-04-09 | 浪潮云信息技术股份公司 | Large file distributed cache system based on object storage |
CN112839082A (en) * | 2020-12-31 | 2021-05-25 | 西安电子科技大学 | Heterogeneous edge cache allocation method, system, medium and application |
CN112839082B (en) * | 2020-12-31 | 2023-04-07 | 西安电子科技大学 | Heterogeneous edge cache allocation method, system, medium and application |
CN112954383A (en) * | 2021-03-02 | 2021-06-11 | 山东省计算中心(国家超级计算济南中心) | VOD method, VOD proxy server, base station and storage medium |
CN114630183A (en) * | 2022-03-17 | 2022-06-14 | 东南大学 | Edge device caching method and evaluation method based on scalable coding |
CN114630183B (en) * | 2022-03-17 | 2024-03-26 | 东南大学 | Edge equipment caching method and evaluation method based on scalable coding |
Also Published As
Publication number | Publication date |
---|---|
CN107889160B (en) | 2020-03-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107889160B (en) | Small cell network edge part caching method considering user time delay | |
CN107911711B (en) | Edge cache replacement improvement method considering partitions | |
CN106851731B (en) | A kind of D2D cache allocation method maximizing unloading probability | |
CN108093435B (en) | Cellular downlink network energy efficiency optimization system and method based on cached popular content | |
CN108834080B (en) | Distributed cache and user association method based on multicast technology in heterogeneous network | |
CN108737507B (en) | D2D wireless caching method | |
CN108600998B (en) | Cache optimization decision method for ultra-density cellular and D2D heterogeneous converged network | |
CN109600774B (en) | WiFi (Wireless Fidelity) unloading method based on alliance game in LTE (Long term evolution) network | |
CN110996293B (en) | Network deployment and resource allocation method and system for unmanned aerial vehicle | |
CN106998353B (en) | Optimal caching configuration method for files in content-centric networking | |
CN106686655A (en) | Heterogeneous network joint user correlation and content cache method | |
CN110290507B (en) | Caching strategy and spectrum allocation method of D2D communication auxiliary edge caching system | |
CN112218337A (en) | Cache strategy decision method in mobile edge calculation | |
CN109194763A (en) | Caching method based on small base station self-organizing cooperative in a kind of super-intensive network | |
CN110418367B (en) | 5G forwarding network hybrid edge cache low-delay method | |
CN108156596B (en) | Method for supporting D2D-cellular heterogeneous network combined user association and content caching | |
CN109451517B (en) | Cache placement optimization method based on mobile edge cache network | |
Chen et al. | Hit ratio driven mobile edge caching scheme for video on demand services | |
CN110913239B (en) | Video cache updating method for refined mobile edge calculation | |
CN109348454A (en) | A kind of D2D Cache Communication content sharing method | |
CN108566636A (en) | D2D random cache layout method oriented to different user preferences | |
CN108521640A (en) | A kind of content distribution method in cellular network | |
CN109495865A (en) | A kind of adaptive cache content laying method and system based on D2D auxiliary | |
CN111479312A (en) | Heterogeneous cellular network content caching and base station dormancy combined optimization method | |
CN111556531A (en) | Cooperative cache optimization method in micro-cellular wireless network |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20200317 |