CN105262833B - Cross-layer caching method and node for a content-centric network - Google Patents

Cross-layer caching method and node for a content-centric network

Info

Publication number
CN105262833B
CN105262833B CN201510729060.1A CN201510729060A CN105262833B CN 105262833 B CN105262833 B CN 105262833B CN 201510729060 A CN201510729060 A CN 201510729060A CN 105262833 B CN105262833 B CN 105262833B
Authority
CN
China
Prior art keywords
node
content
caching
interest
interest packet
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201510729060.1A
Other languages
Chinese (zh)
Other versions
CN105262833A (en)
Inventor
张天魁
肖霖
武丽霞
许晓耕
杨鼎成
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanchang University
Beijing University of Posts and Telecommunications
Original Assignee
Nanchang University
Beijing University of Posts and Telecommunications
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanchang University and Beijing University of Posts and Telecommunications
Priority to CN201510729060.1A
Publication of CN105262833A
Application granted
Publication of CN105262833B
Legal status: Expired - Fee Related
Anticipated expiration


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00: Network arrangements or protocols for supporting network services or applications
    • H04L 67/50: Network services
    • H04L 67/60: Scheduling or organising the servicing of application requests, e.g. requests for application data transmissions using the analysis and optimisation of the required network resources
    • H04L 67/63: Routing a service request depending on the request content or context
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00: Network arrangements or protocols for supporting network services or applications
    • H04L 67/50: Network services
    • H04L 67/56: Provisioning of proxy services
    • H04L 67/568: Storing data temporarily at an intermediate stage, e.g. caching

Abstract

The application provides a cross-layer caching method for a content-centric network and a corresponding node. The method comprises the following steps. Step S1: a content requester issues an Interest packet carrying the name of the requested content according to the user's demand. Step S2: a node receives the Interest packet and judges whether it is a content provider or a cache-hit node that can supply the requested content; if not, step S3 is executed. Step S3: the pending interest table (PIT) is searched to judge whether it contains the name of the requested content; if so, step S4 is executed, otherwise step S5. Step S4: if the name of the requested content is already in the PIT, the Interest packet has previously reached this node; the node port on which the Interest packet arrived is added to the PIT entry for that content name, the Interest packet is discarded, and the method ends.

Description

Cross-layer caching method and node for a content-centric network
Technical field
The application relates to network caching methods, and in particular to a cross-layer caching method for a content-centric network and a node implementing it.
Background technology
Mobile traffic is growing rapidly, and most of it comes from video streams. In large-scale networks, content sharing and transmission directly affect transmission performance and thus the user experience. Distributing these large volumes of data at higher speed and with less network congestion has become a major challenge. Although the traditional IP-based end-to-end communication model has achieved some success in content distribution, it has long faced serious problems in mobility, flexibility and congestion. A novel network architecture, CCN (Content Centric Networking), has therefore been proposed to realize fast content distribution. A content-centric network acquires and caches information through in-network caching, allowing a user to obtain content from an intermediate node on the transmission path between the requesting user and the content server. This greatly reduces the delay with which users obtain content, relieves congestion on communication links, and lowers the load on servers.
In-network caching is one of the important research directions of CCN. In "Cache 'less for more' in information-centric networks", W. K. Chai proposed selecting cache nodes according to the betweenness centrality values of nodes. Betweenness centrality is a measure of node importance: the larger a node's betweenness centrality, the more shortest paths in the network pass through it, and when content is forwarded along shortest paths, the more likely the content is to pass through that node. Unlike an IP network, CCN lets more content be cached at nodes with large betweenness centrality, so that when a user requests content it can be obtained directly from a caching intermediate node on the transmission path instead of from the server. This substantially reduces the network load on the server, shortens the content acquisition delay, and improves content-sharing efficiency. Each router caches content in its local cache space; when the cache space is full, the router node replaces old content with new content according to a given replacement policy. Different nodes in the network may hold identical copies.
However, what content to cache, where, and when are the central questions of cache design, so designing an effective caching policy for a specific application scenario is crucial to caching performance. Because network parameters and cache states change constantly in a dynamic network, cross-layer cache design for nodes is still at an early stage. In a social-network scenario, cache space is limited; improving the utilization of cache space, lowering the overhead of content replacement, reducing the delay with which users obtain content, and raising the content hit rate have therefore become important research topics.
That is, existing caching methods have the following drawbacks:
First, the along-path caching policies based on betweenness centrality were proposed for wired network scenarios, and there has been little research on flattened, distributed scenarios. A social-network scenario is a distributed, ad-hoc arrangement of users, so research on complex networks that are closer to real networks has more practical significance.
Secondly, existing along-path caching policies decide whether a node or user caches content only from network-layer factors. Although important nodes can greatly reduce the content acquisition delay, they hold many cached copies and their cache space must be replaced and updated constantly; the heavy load on important nodes tends to shorten their lifetime. Which node should cache which content directly affects network caching performance.
Thirdly, cache placement is decided mainly from the parameter indices of a single network layer, and a single-layer metric cannot effectively improve network transmission efficiency. When multiple scenarios and multiple parameters are compared in the network, a caching design suited to social networks cannot be derived from the multi-parameter relationships among the nodes on the transmission path.
Summary of the invention
In view of this, the application provides a cross-layer caching method for a content-centric network and a corresponding node. A flattened complex network is chosen as the application scenario, so that the method is closer to the actual scenario of a social network and the research has more practical meaning. While considering the importance of a node at the network layer, the invention additionally considers the replacement frequency of the node at the physical layer and, starting from the characteristics of social networks, the user's preference for content, which is used as one criterion for evaluating a node. Finally, the invention uses a grey relational analysis method to combine the multiple factors of the multiple nodes on the content transmission path, and proposes a cross-layer caching policy that integrates the physical layer, the network layer and the application layer.
The application provides a cross-layer caching method for a content-centric network, comprising the following steps:
Step S1: a content requester issues an Interest packet carrying the name of the requested content according to the user's demand;
Step S2: a node receives the Interest packet and judges whether it is a content provider or a cache-hit node that can supply the requested content; if not, step S3 is executed;
Step S3: the pending interest table is searched to judge whether it contains the name of the requested content; if so, step S4 is executed, otherwise step S5;
Step S4: if the name of the requested content is in the pending interest table, the Interest packet has previously reached this node; the node port on which the Interest packet arrived is added to the entry for that content name in the pending interest table, the Interest packet is discarded, and the method ends;
Step S5: if the pending interest table has no such entry, the node's triple is appended to the Interest packet and the Interest packet with the appended triple is forwarded to other nodes; every node that receives the Interest packet continues with step S2 until a content provider or cache-hit node that can supply the requested content is found.
The application also provides a node of a content-centric network, comprising:
a transceiver, which receives Interest packets and Data packets and forwards Interest packets and Data packets;
a judging device, which judges whether the node is a content provider or a cache-hit node that can supply the requested content and, if not, triggers the lookup device;
a lookup device, which queries the pending interest table and judges whether it contains the name of the requested content; if the name of the requested content is in the pending interest table, the Interest packet has previously reached this node, so the node on which the Interest packet arrived is added to the entry for that content name in the pending interest table and the Interest packet is discarded; if the pending interest table has no such entry, the node's triple is appended to the Interest packet and the Interest packet with the appended triple is handed to the transceiver for forwarding.
The above technical solution yields the following technical effects:
(1) The invention addresses node cache design in a social-network scenario and, based on the topological properties of social networks, proposes a self-organizing, flattened complex-network CCN cache design, so that CCN caching provides better quality of service for social networks;
(2) The invention treats the network-layer importance of a node as one of the caching factors; important nodes carry a more important role in the network, which has a decisive effect on caching performance;
(3) Based on the multiple caching factors along the content transmission path, the invention applies a grey relational analysis method, which is well suited to dynamic networks, to analyse the physical layer, network layer and application layer of the network jointly and determine the cache locations and the caching policy, yielding a more effective caching policy that greatly reduces server load and shortens the content acquisition delay.
Description of the drawings
In order to describe the technical solutions in the embodiments of the application or in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the application; those of ordinary skill in the art can obtain other drawings from them.
Fig. 1 is the layered network-scenario diagram of the node of the application;
Fig. 2 is the structural diagram of the application;
Fig. 3 is the structure of the Interest packet;
Fig. 4 is the structure of the Data packet;
Fig. 5 is the flow of the triple and the caching probability vector;
Fig. 6 is the flow chart of the cross-layer caching method and its device;
Fig. 7 is the structural diagram of the node of the application.
Detailed description of the embodiments
In order to enable those skilled in the art to better understand the technical solutions in the application, the technical solutions in the embodiments of the application are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of the application, not all of them. All other embodiments obtained by those of ordinary skill in the art based on the embodiments in the application shall fall within the protection scope of the application.
The implementation of the application is further illustrated below with reference to the drawings.
The application provides a cross-layer caching method for a content-centric network oriented to social networks. As shown in Fig. 1, in a social-network scenario a node involves three layers: the physical layer, the network layer and the application layer. The three-layer composition of a node is described in detail below:
1. Physical layer
The physical equipment of the node belongs to the physical layer and defines the node's physical-layer attributes, such as the storage space of the device, the geographical coordinates of the node, and the probability-density characteristics of the node.
2. Network layer
The importance of the node's position in the network, i.e. whether it lies at the core or at the edge, defines the node's network-layer attributes, such as the degree, betweenness centrality, closeness, mutual information and eigenvector centrality.
3. Application layer
The application layer covers the contacts between nodes and the users' preferences for content, the social characteristics that make a social-network node unique, such as the request frequency of the node, the node's preference for content, content popularity and content priority.
For in-network caching, the factors of each layer that influence caching should be considered so that node caching becomes more efficient. The invention examines the relationship between nodes and content and, in keeping with CCN's focus on the content itself rather than its location, takes the user's preference for social content into account and selects the caching policy according to the caching characteristics of the node.
The specific steps of the method are shown in Fig. 2 and Fig. 6 and include:
Step S1: a content requester issues an Interest packet carrying the name of the requested content according to the user's demand;
A triple (interest degree, node betweenness, node replacement rate) is added to the content field of the original Interest packet. The interest degree is a caching-influence parameter defined at the application layer, the node betweenness is a caching-influence parameter defined at the network layer, and the node replacement rate is a caching-influence parameter defined at the physical layer. The invention combines the three-layer structure of a node in the social-network scenario presented above, jointly considers the caching factors of the three layers, and thus defines the triple.
The generation of the triple is described below:
1. Interest degree
Since a user's preference reflects how interested the user is in content, it indirectly reflects the probability that the user will request that content, so the user's interest degree in content is computed from the user's preference. A user's preference for content is closely related to the content type. Assume that the set of all content topics in the network is the content topic set. A specific content c_i has Q topics; the attribute function of content c_i under topic w_k is Pro(c_i, w_k). Each user also has their own preference for each topic; user u_j's preference for topic w_k is expressed by the preference function pref(u_j, w_k). The user's interest degree in the content is then expressed as follows:
where Pro(c_i, w_k) is 1 when content c_i includes attribute w_k, and 0 otherwise.
The preference function of the user is assumed to be expressed by mutual information, where p(X(w_k) | V_j) is the probability of selecting content on topic w_k in the user's history information V_j, and p(X(w_k)) is the probability of content on topic w_k in the whole network.
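The display formulas referenced above do not survive in this text. A plausible reconstruction, consistent with the definitions of Pro(c_i, w_k) and pref(u_j, w_k) but offered only as an assumption rather than the wording of the original filing, is:

```latex
% Interest degree of user u_j in content c_i (assumed reconstruction)
\mathrm{Int}(u_j, c_i) = \sum_{k=1}^{Q} \mathrm{Pro}(c_i, w_k)\,\mathrm{pref}(u_j, w_k)

% Preference expressed via mutual information between the user's history V_j
% and topic w_k (assumed pointwise form)
\mathrm{pref}(u_j, w_k) = p\bigl(X(w_k)\mid V_j\bigr)\,
  \log\frac{p\bigl(X(w_k)\mid V_j\bigr)}{p\bigl(X(w_k)\bigr)}
```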
2. Node betweenness
Different nodes in a network have different characteristics and importance; node betweenness describes how important a node is in the network. The larger the betweenness, the more important the node, meaning the node can relay information to more nodes. Betweenness is defined by the following formula.
where δ_{s,t} denotes the total number of shortest paths from node s to node t, and δ_{s,t}(u_j) denotes the number of those shortest paths that pass through node u_j.
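The formula itself is missing from this text; the standard betweenness-centrality expression that matches the definitions of δ_{s,t} and δ_{s,t}(u_j) above is:

```latex
B(u_j) = \sum_{s \neq u_j \neq t} \frac{\delta_{s,t}(u_j)}{\delta_{s,t}}
```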
3. Node replacement rate
Important nodes cache content with a higher probability, so their cache space fills up quickly; newly arriving content then forces the node to keep replacing cached content, and frequent replacement brings caching overhead. The caching performance of a node changes constantly with its cache state, so the real-time cache replacement rate of a node is also an important indicator of caching performance. The node replacement rate is defined as follows:
where C(u_j) denotes the cache capacity of user u_j's device, and S_j(c_i) denotes the amount of class-i content replaced by user u_j per unit time.
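The formula image is absent here as well; one reading consistent with the definitions of C(u_j) and S_j(c_i), given as an assumption, is the replaced volume per unit time normalized by cache capacity:

```latex
R(u_j) = \frac{\sum_{i} S_j(c_i)}{C(u_j)}
```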
Because the Interest packet must carry the node's triple information, the invention modifies the packet format of the content-centric network: a triple field is added to the data format of the existing Interest packet, as shown in Fig. 3. During content retrieval, every time the Interest packet passes through a node, that node's triple data is appended to the triple field of the Interest packet.
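A minimal sketch of such a modified Interest packet in Python; the field and type names are chosen for illustration, since the patent only specifies that one triple is appended per traversed node:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# (interest degree, node betweenness, node replacement rate) of one traversed node
Triple = Tuple[float, float, float]

@dataclass
class InterestPacket:
    content_name: str
    triples: List[Triple] = field(default_factory=list)  # one entry per hop

    def append_triple(self, node_triple: Triple) -> None:
        """Called at every hop: append this node's cross-layer triple (Fig. 3)."""
        self.triples.append(node_triple)
```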
Step S2: the node receives the Interest packet and judges whether it is a content provider or a cache-hit node that can supply the requested content; if so, step S6 is executed; if not, step S3.
By searching the node's content store (CS), the node judges whether the requested content is present. If the content store holds the requested content, the node is a content provider or cache-hit node that can supply it; the Interest packet is discarded and step S6 is executed; otherwise step S3 is executed.
Step S3: if the content store does not hold the requested content, the pending interest table (PIT) is searched to judge whether it contains the name of the requested content; if so, step S4 is executed, otherwise step S5.
The PIT stores the transmission-path information of Interest packets; by searching the PIT, the node can learn whether an Interest packet for this content has already passed through it. If the name of the requested content is found in the PIT, step S4 is executed; otherwise step S5.
Step S4: if the name of the requested content is in the PIT, the Interest packet has previously reached this node; the node port on which the Interest packet arrived is added to the PIT entry for that content name, the Interest packet is discarded, and the method ends.
Step S5: if the PIT has no such entry, the node's triple is appended to the Interest packet and the Interest packet with the appended triple is forwarded to other nodes; every node that receives the Interest packet continues with step S2 until a content provider or cache-hit node that can supply the requested content is found.
Specifically, the node checks its forwarding information base (FIB), which stores the other nodes to which the content request can be forwarded. If forwarding nodes are found in the FIB, the node appends its own triple information to the Interest packet, forwards the Interest packet to all nodes recorded for that content in the FIB, and records the forwarding in the PIT. The nodes that receive the Interest packet continue with step S2 until a content provider or cache-hit node that can supply the requested content is found, as the code sketch below illustrates.
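The following Python sketch condenses steps S2-S5 into a single per-node handler. The dict-based content store, PIT and FIB and the helper names (handle_hit, node.triple, node.forward) are assumptions made for illustration, not structures named in the patent:

```python
def process_interest(node, interest, in_port):
    """Steps S2-S5: handle an incoming Interest packet at one node."""
    name = interest.content_name

    # S2: content store (CS) hit -> this node can answer; step S6 follows
    if name in node.content_store:
        return handle_hit(node, interest)       # builds the Data packet with the caching probability vector

    # S3/S4: the PIT already has an entry for this name -> record the port, drop the Interest
    if name in node.pit:
        node.pit[name].add(in_port)
        return None

    # S5: new request -> create a PIT entry, append this node's triple, forward via the FIB
    node.pit[name] = {in_port}
    interest.append_triple(node.triple())       # (interest degree, betweenness, replacement rate)
    for out_port in node.fib.get(name, []):     # exact-name lookup; a real FIB would use prefix matching
        node.forward(interest, out_port)
    return None
```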
Step S6: after the Interest packet reaches the content provider or hit node, that node extracts the information from the Interest packet, computes the caching probability of each node from the triples using the grey relational analysis method, assembles the caching probabilities of all nodes on the transmission path into a caching probability vector, and adds the caching probability vector to the Data packet; the structure of the Data packet is shown in Fig. 4.
The grey relational analysis method used here first converts every dimension of all sequences into normalized comparison sequences, and then selects a reference sequence from these sequences. Next, the grey relational coefficients are computed from the reference sequence and the comparison sequences. Finally, the grey relational degree of each comparison sequence is computed from the grey relational coefficients, and the cache decision is made according to these grey relational degrees. The method thus comprises four stages: the first stage generates the comparison sequences, the second stage selects the reference sequence, the third stage computes the grey relational coefficients, and the fourth stage computes the grey relational degree of each comparison sequence.
It specifically includes the following sub-steps:
Step P1: extract the triples and generate the comparison sequences
The node triple consists of the caching-influence parameters of the application layer, the network layer and the physical layer. Each layer's caching-influence parameter is defined as one dimensional attribute of the node, so each node is described by a multi-dimensional attribute vector. Because the reference sequence and the comparison sequences lie in different ranges and the favourable direction of each dimension differs, every dimensional attribute must be normalized. The attribute vector of the i-th candidate sequence is X_i' = (x_i'(1), x_i'(2), x_i'(3)), where x_i'(m) denotes the value of the i-th node in the m-th dimension. The triples of the nodes collected along the path form the matrix A'.
Because the value ranges and meanings of the attributes differ, each attribute influences the cache decision differently; the attributes of every dimension are therefore normalized.
Formula (6) is used when a larger value of the i-th attribute yields a larger caching gain, and formula (7) when a smaller value of the i-th attribute yields better caching performance. The normalized triples of the nodes along the path are given by formula (8).
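Formulas (6)-(8) are not reproduced in this text. Standard max-min normalizations that fit the description, given purely as an assumed reconstruction, are:

```latex
% (6) benefit attribute: a larger raw value is better
x_i(m) = \frac{x_i'(m) - \min_i x_i'(m)}{\max_i x_i'(m) - \min_i x_i'(m)}

% (7) cost attribute: a smaller raw value is better
x_i(m) = \frac{\max_i x_i'(m) - x_i'(m)}{\max_i x_i'(m) - \min_i x_i'(m)}

% (8) the normalized triples along the path then form the matrix A = [x_i(m)]
```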
Step P2: define the reference sequence
The reference sequence should be the most ideal case among all comparison sequences. Since the comparison sequences have already been normalized into the range [0, 1], the closer x_i(m) is to 1, the closer the node is to the optimum in the m-th dimension, i.e. the larger the probability that the node caches the content with respect to that attribute. A node whose every dimension is close to the optimum is therefore the best place to cache. The reference sequence is accordingly set to the ideal value of every dimension, i.e. X_0 = (x_0(1), x_0(2), ..., x_0(m))^T = (1, 1, ..., 1).
Step P3: compute the grey relational coefficients
The grey relational coefficient characterizes how similar a candidate sequence is to the reference sequence. The larger the grey relational coefficient between x_i(m) and x_0(m), the closer they are, and the larger the probability with which the node should cache the content. The grey relational coefficient is defined as follows:
where μ is the resolution (distinguishing) coefficient.
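The defining formula is missing here; the standard grey relational coefficient of Deng's grey relational analysis, which matches the surrounding description and the role of the resolution coefficient μ, is given below as an assumed reconstruction:

```latex
\xi_i(m) =
  \frac{\min_i \min_m \lvert x_0(m) - x_i(m)\rvert
        + \mu \max_i \max_m \lvert x_0(m) - x_i(m)\rvert}
       {\lvert x_0(m) - x_i(m)\rvert
        + \mu \max_i \max_m \lvert x_0(m) - x_i(m)\rvert},
  \qquad \mu \in (0, 1]
```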
Step P4: compute the grey relational degree
The grey relational degree of each node is the weighted sum of the grey relational coefficients of its dimensions, i.e.
where p(X_i, X_0) denotes the relational degree between the comparison sequence (node X_i) and the reference sequence (the ideal cache node), and a_j denotes the probability weight of the j-th dimension, the weights summing to 1. The grey relational degree essentially characterizes the similarity between this node and the ideal cache node. Since the ideal cache node represents the ideal optimum, the larger the grey relational degree of a candidate cache node, the closer it is to the ideal cache node, i.e. the larger the probability that this node caches the content.
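The weighted-sum formula is again missing; it presumably reads p(X_i, X_0) = Σ_j a_j ξ_i(j) with Σ_j a_j = 1. The Python sketch below implements the whole P1-P4 pipeline under that assumption; the benefit/cost split of the three dimensions, the equal weights and the value μ = 0.5 are illustrative choices, not values fixed by the patent:

```python
import numpy as np

def caching_probabilities(triples, cost_dims=(2,), weights=(1/3, 1/3, 1/3), mu=0.5):
    """Steps P1-P4: turn the triples collected along the path into per-node
    caching probabilities via grey relational analysis.

    triples: list of (interest_degree, betweenness, replacement_rate), one per node.
    cost_dims: dimensions where a smaller raw value is better (assumed: replacement rate).
    """
    A = np.asarray(triples, dtype=float)                # raw comparison sequences (P1)

    # P1: normalize every dimension into [0, 1]
    X = np.empty_like(A)
    for m in range(A.shape[1]):
        col = A[:, m]
        span = (col.max() - col.min()) or 1.0           # avoid division by zero
        if m in cost_dims:
            X[:, m] = (col.max() - col) / span          # smaller raw value -> closer to 1
        else:
            X[:, m] = (col - col.min()) / span          # larger raw value -> closer to 1

    x0 = np.ones(A.shape[1])                            # P2: ideal reference sequence (1, 1, ..., 1)

    diff = np.abs(x0 - X)                               # P3: grey relational coefficients
    xi = (diff.min() + mu * diff.max()) / (diff + mu * diff.max())

    degree = xi @ np.asarray(weights)                   # P4: weighted grey relational degree per node

    # The relational degree is used as the node's caching probability and is
    # carried back to the requester in the Data packet's probability vector.
    return degree
```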
Having described the grey relational analysis method, the cross-layer caching method of the application is now continued with reference to Fig. 2:
Step S7: the node checks the PIT and judges whether the requested content corresponds to a single port; if it corresponds to only one port, step S8 is executed, otherwise step S9.
Step S8: while the Data packet travels along the reverse path, if the content recorded in this node's PIT corresponds to only one port, the node reads from the Data packet the probability with which it should cache the content, caches the content with that probability, and forwards the Data packet to the corresponding port recorded in the PIT.
Step S9: if the content recorded in the PIT corresponds to multiple ports, i.e. the content was requested through multiple nodes, this node deletes the caching-probability-vector field of the Data packet, adds a caching-probability-vector field for each of the different ports in its PIT, and forwards the Data packet through the ports recorded in the PIT until the Data packet reaches the content requester.
Fig. 5 shows how the Interest packet and the Data packet are generated and transmitted, including:
1. The requester of the content generates the triple (x_1(1), x_1(2), x_1(3))^T and adds it to the Interest packet;
2. While the Interest packet travels in search of the content, the triples (x_i(1), x_i(2), x_i(3))^T of the routing nodes are appended one after another; all generated triples are added to the Interest packet;
3. After the content provider or a cache node holding the content is found, the caching probability vector is computed from the triples and the Interest packet is discarded; the caching probability vector is added to the Data packet, and the Data packet is transmitted along the reverse path towards the content requester;
4. During reverse-path transmission, if the content corresponds to only one port in this node's PIT record, the node reads from the Data packet the probability with which it should cache the content and caches the content with that probability; if the content corresponds to multiple ports in the PIT record, the caching-probability-vector field of the Data packet is replaced by the caching probability vectors of the respective PIT entries, and a Data packet carrying the corresponding caching probability vector is forwarded to each port, until the Data packet reaches the content requester. A code sketch of this reverse-path handling is given below.
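A minimal Python sketch of item 4 (steps S7-S9), under the same assumed node structures as the earlier sketch. Addressing the probability vector by node identifier and the probability_vector_for helper that re-splits it per port are illustrative simplifications of what the patent describes:

```python
import random

def process_data(node, data, content_name):
    """Steps S7-S9: handle a Data packet travelling back along the reverse path."""
    ports = node.pit.pop(content_name, set())
    if not ports:
        return                                           # no pending request at this node

    # S7/S8: probability assigned to this node in the vector computed at the hit node
    p_cache = data.cache_prob.get(node.node_id, 0.0)
    if random.random() < p_cache:                        # probabilistic caching decision
        node.content_store[content_name] = data.payload

    if len(ports) == 1:                                   # S8: a single downstream port
        node.forward(data, next(iter(ports)))
    else:                                                 # S9: rebuild the vector per downstream port
        for port in ports:
            copy = data.clone()
            copy.cache_prob = node.probability_vector_for(port)   # assumed helper
            node.forward(copy, port)
```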
The cross-layer caching method of the application has been described above with reference to Figs. 1-6; the construction of the node of the application is now described with reference to Fig. 7. As shown in Fig. 7, the node of the application includes:
Transceiver 701: receives Interest packets and Data packets, and forwards Interest packets and Data packets.
The Interest packet carries the triple (interest degree, node betweenness, node replacement rate), where the interest degree is a caching-influence parameter defined at the application layer, the node betweenness is a caching-influence parameter defined at the network layer, and the node replacement rate is a caching-influence parameter defined at the physical layer. The invention combines the three-layer structure of a node in the social-network scenario presented above, jointly considers the caching factors of the three layers, and thus defines the triple.
The generation of the triple is described below:
1. Interest degree
Since a user's preference reflects how interested the user is in content, it indirectly reflects the probability that the user will request that content, so the user's interest degree in content is computed from the user's preference. A user's preference for content is closely related to the content type. Assume that the set of all content topics in the network is the content topic set. A specific content c_i has Q topics; the attribute function of content c_i under topic w_k is Pro(c_i, w_k). Each user also has their own preference for each topic; user u_j's preference for topic w_k is expressed by the preference function pref(u_j, w_k). The user's interest degree in the content is then expressed as follows:
where Pro(c_i, w_k) is 1 when content c_i includes attribute w_k, and 0 otherwise.
The preference function of the user is assumed to be expressed by mutual information, where p(X(w_k) | V_j) is the probability of selecting content on topic w_k in the user's history information V_j, and p(X(w_k)) is the probability of content on topic w_k in the whole network.
2. Node betweenness
Different nodes in a network have different characteristics and importance; node betweenness describes how important a node is in the network. The larger the betweenness, the more important the node, meaning the node can relay information to more nodes. Betweenness is defined by the following formula.
where δ_{s,t} denotes the total number of shortest paths from node s to node t, and δ_{s,t}(u_j) denotes the number of those shortest paths that pass through node u_j.
3. Node replacement rate
Important nodes cache content with a higher probability, so their cache space fills up quickly; newly arriving content then forces the node to keep replacing cached content, and frequent replacement brings caching overhead. The caching performance of a node changes constantly with its cache state, so the real-time cache replacement rate of a node is also an important indicator of caching performance. The node replacement rate is defined as follows:
where C(u_j) denotes the cache capacity of user u_j's device, and S_j(c_i) denotes the amount of class-i content replaced by user u_j per unit time.
Because the Interest packet must carry the node's triple information, the invention modifies the packet format of the content-centric network: a triple field is added to the data format of the existing Interest packet, as shown in Fig. 3. During content retrieval, every time the Interest packet passes through a node, that node's triple data is appended to the triple field of the Interest packet.
As shown in Fig. 7, the node further includes judging device 702, which judges whether the node is a content provider or cache-hit node that can supply the requested content; if so, extraction device 705 is triggered, otherwise lookup device 704 is triggered.
Judging device 702 receives the packet and, by searching the node's content store (CS) 703, judges whether the requested content is present in content store 703. If content store 703 holds the requested content, the node is a content provider or cache-hit node that can supply it, and extraction device 705 is triggered; otherwise lookup device 704 is triggered.
Lookup device 704: queries the pending interest table (PIT) and judges whether it contains the name of the requested content. The PIT stores the transmission-path information of Interest packets; by searching the PIT, the device can learn whether an Interest packet for this content has already passed through the node. If the name of the requested content is in the PIT, the Interest packet has previously reached this node, so the node on which the Interest packet arrived is added to the PIT entry for that content name and the Interest packet is discarded. If the PIT has no such entry, the node's triple is appended to the Interest packet and the Interest packet with the appended triple is handed to the transceiver for forwarding.
The node further includes extraction device 705, which is triggered when the judging device's result is positive.
Extraction device 705: extracts the information from the Interest packet, computes the caching probability of each node from the triples using the grey relational analysis method, assembles the caching probabilities of all nodes on the transmission path into a caching probability vector, and adds the caching probability vector to the Data packet. It searches the pending interest table and judges whether the requested content corresponds to a single node. If it corresponds to only one node, it reads from the Data packet the probability with which this node should cache the content, caches the content with that probability, and forwards the Data packet to the corresponding node recorded in the pending interest table. If it corresponds to multiple nodes, this node deletes the caching-probability-vector field of the Data packet, adds a caching-probability-vector field for each of the different nodes in its pending interest table, and hands the Data packet to the transceiver.
The grey relational analysis method used here first converts every dimension of all sequences into normalized comparison sequences, and then selects a reference sequence from these sequences. Next, the grey relational coefficients are computed from the reference sequence and the comparison sequences. Finally, the grey relational degree of each comparison sequence is computed from the grey relational coefficients, and the cache decision is made according to these grey relational degrees. The method thus comprises four stages: the first stage generates the comparison sequences, the second stage selects the reference sequence, the third stage computes the grey relational coefficients, and the fourth stage computes the grey relational degree of each comparison sequence.
It specifically includes the following sub-steps:
Step P1: extract the triples and generate the comparison sequences
The node triple consists of the caching-influence parameters of the application layer, the network layer and the physical layer. Each layer's caching-influence parameter is defined as one dimensional attribute of the node, so each node is described by a multi-dimensional attribute vector. Because the reference sequence and the comparison sequences lie in different ranges and the favourable direction of each dimension differs, every dimensional attribute must be normalized. The attribute vector of the i-th candidate sequence is X_i' = (x_i'(1), x_i'(2), x_i'(3)), where x_i'(m) denotes the value of the i-th node in the m-th dimension. The triples of the nodes collected along the path form the matrix A'.
Because the value ranges and meanings of the attributes differ, each attribute influences the cache decision differently; the attributes of every dimension are therefore normalized.
Formula (6) is used when a larger value of the i-th attribute yields a larger caching gain, and formula (7) when a smaller value of the i-th attribute yields better caching performance. The normalized triples of the nodes along the path are given by formula (8).
Step P2: define the reference sequence
The reference sequence should be the most ideal case among all comparison sequences. Since the comparison sequences have already been normalized into the range [0, 1], the closer x_i(m) is to 1, the closer the node is to the optimum in the m-th dimension, i.e. the larger the probability that the node caches the content with respect to that attribute. A node whose every dimension is close to the optimum is therefore the best place to cache. The reference sequence is accordingly set to the ideal value of every dimension, i.e. X_0 = (x_0(1), x_0(2), ..., x_0(m))^T = (1, 1, ..., 1).
Step P3: compute the grey relational coefficients
The grey relational coefficient characterizes how similar a candidate sequence is to the reference sequence. The larger the grey relational coefficient between x_i(m) and x_0(m), the closer they are, and the larger the probability with which the node should cache the content. The grey relational coefficient is defined as follows:
where μ is the resolution (distinguishing) coefficient.
Step P4: compute the grey relational degree
The grey relational degree of each node is the weighted sum of the grey relational coefficients of its dimensions, i.e.
where p(X_i, X_0) denotes the relational degree between the comparison sequence (node X_i) and the reference sequence (the ideal cache node), and a_j denotes the probability weight of the j-th dimension, the weights summing to 1. The grey relational degree essentially characterizes the similarity between this node and the ideal cache node. Since the ideal cache node represents the ideal optimum, the larger the grey relational degree of a candidate cache node, the closer it is to the ideal cache node, i.e. the larger the probability that this node caches the content.
Those skilled in the art will understand that embodiments of the application may be provided as a method, an apparatus (device) or a computer program product. The application may therefore take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware. Moreover, the application may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM and optical storage) containing computer-usable program code.
The application is described with reference to flow charts and/or block diagrams of the method, the apparatus (device) and the computer program product according to the embodiments of the application. It should be understood that each flow and/or block in the flow charts and/or block diagrams, and combinations of flows and/or blocks, can be implemented by computer program instructions. These computer program instructions may be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor or another programmable data-processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data-processing device produce a means for implementing the functions specified in one or more flows of the flow charts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or another programmable data-processing device to work in a particular manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction means that implements the functions specified in one or more flows of the flow charts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or another programmable data-processing device, so that a series of operation steps are performed on the computer or other programmable device to produce computer-implemented processing, and the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flow charts and/or one or more blocks of the block diagrams.
Although preferred embodiments of the application have been described, those skilled in the art, once aware of the basic inventive concept, may make additional changes and modifications to these embodiments. The appended claims are therefore intended to be interpreted as covering the preferred embodiments and all changes and modifications falling within the scope of the application. Obviously, those skilled in the art can make various modifications and variations to the application without departing from its spirit and scope. If these modifications and variations fall within the scope of the claims of the application and their technical equivalents, the application is intended to include them as well.

Claims (8)

1. A cross-layer caching method of a content-centric network, comprising the following steps:
Step S1: a content requester issues an Interest packet carrying the name of the requested content according to the user's demand;
Step S2: a node receives the Interest packet and judges whether it is a content provider or a cache-hit node that can supply the requested content; if not, step S3 is executed;
Step S3: the pending interest table is searched to judge whether it contains the name of the requested content; if so, step S4 is executed, otherwise step S5;
Step S4: if the name of the requested content is in the pending interest table, the Interest packet has previously reached this node; the node port on which the Interest packet arrived is added to the entry for that content name in the pending interest table, the Interest packet is discarded, and the method ends;
Step S5: if the pending interest table has no such entry, the node's triple is appended to the Interest packet and the Interest packet with the appended triple is forwarded to other nodes; every node that receives the Interest packet continues with step S2 until a content provider or cache-hit node that can supply the requested content is found;
wherein, if it is judged in step S2 that the node is a content provider or cache-hit node that can supply the requested content, step S6 is executed;
Step S6: after the Interest packet reaches the content provider or hit node, the node extracts the information from the Interest packet, computes the caching probability of each node from the triples using the grey relational analysis method, assembles the caching probabilities of all nodes on the transmission path into a caching probability vector, and adds the caching probability vector to the Data packet;
Step S7: the node checks the pending interest table and judges whether the requested content corresponds to a single port; if it corresponds to only one port, step S8 is executed, otherwise step S9;
Step S8: while the Data packet travels along the reverse path, if the content recorded in this node's pending interest table corresponds to only one port, this node reads from the Data packet the probability with which it should cache the content, caches the content with that probability, and forwards the Data packet to the corresponding port recorded in the pending interest table;
Step S9: if the content recorded in the pending interest table corresponds to multiple ports, i.e. the content was requested through multiple nodes, this node deletes the caching-probability-vector field of the Data packet, adds a caching-probability-vector field for each of the different ports in its pending interest table, and forwards the Data packet to the corresponding ports recorded in the pending interest table until the Data packet reaches the content requester.
2. The method of claim 1, wherein the Interest packet carries a triple (interest degree, node betweenness, node replacement rate), wherein the interest degree is a caching-influence parameter defined at the application layer of the node, the node betweenness is a caching-influence parameter defined at the network layer of the node, and the node replacement rate is a caching-influence parameter defined at the physical layer of the node.
3. The method of claim 2, wherein
the interest degree is defined as follows:
assume that the set of all content topics in the network is the content topic set; a specific content c_i has Q topics; the attribute function of content c_i under topic w_k is Pro(c_i, w_k); each user has their own preference for each topic, and user u_j's preference for topic w_k is expressed by the preference function pref(u_j, w_k); the user's interest degree in the content is then expressed as follows:
where Pro(c_i, w_k) is 1 when content c_i includes attribute w_k, and 0 otherwise,
and the preference function of the user is assumed to be expressed by mutual information, where p(X(w_k) | V_j) is the probability of selecting content on topic w_k in the user's history information V_j, and p(X(w_k)) is the probability of content on topic w_k in the whole network;
the node betweenness is defined as follows:
different nodes in a network have different characteristics and importance, and node betweenness describes how important a node is in the network; the larger the betweenness, the more important the node, meaning the node can relay information to more nodes; betweenness is defined by the following formula:
where δ_{s,t} denotes the total number of shortest paths from node s to node t, and δ_{s,t}(u_j) denotes the number of those shortest paths that pass through node u_j;
the node replacement rate is defined as follows:
where C(u_j) denotes the cache capacity of user u_j's device, and S_j(c_i) denotes the amount of class-i content replaced by user u_j per unit time.
4. The method of claim 1, wherein the grey relational analysis method first converts every dimension of all sequences into normalized comparison sequences and then selects a reference sequence from these sequences; next, the grey relational coefficients are computed from the reference sequence and the comparison sequences; finally, the grey relational degree of each comparison sequence is computed from the grey relational coefficients, and the cache decision is made according to these grey relational degrees.
5. A node of a content-centric network, comprising:
a transceiver, which receives Interest packets and Data packets and forwards Interest packets and Data packets;
a judging device, which judges whether the node is a content provider or a cache-hit node that can supply the requested content and, if not, triggers the lookup device;
a lookup device, which queries the pending interest table and judges whether it contains the name of the requested content; if the name of the requested content is in the pending interest table, the Interest packet has previously reached this node, so the node on which the Interest packet arrived is added to the entry for that content name in the pending interest table and the Interest packet is discarded; if the pending interest table has no such entry, the node's triple is appended to the Interest packet and the Interest packet with the appended triple is handed to the transceiver for forwarding;
wherein the node further comprises an extraction device, which is triggered when the judging device's result is positive,
the extraction device extracting the information from the Interest packet, computing the caching probability of each node from the triples using the grey relational analysis method, assembling the caching probabilities of all nodes on the transmission path into a caching probability vector, and adding the caching probability vector to the Data packet; searching the pending interest table and judging whether the requested content corresponds to a single port; if it corresponds to only one port, reading from the Data packet the probability with which this node should cache the content, caching the content with that probability, and forwarding the Data packet through the corresponding port recorded in the pending interest table; if it corresponds to multiple ports, deleting the caching-probability-vector field of the Data packet, adding a caching-probability-vector field for each of the different ports in the pending interest table of this node, and sending the Data packet to the transceiver.
6. The node of claim 5, wherein the Interest packet carries a triple (interest degree, node betweenness, node replacement rate), wherein the interest degree is a caching-influence parameter defined at the application layer of the node, the node betweenness is a caching-influence parameter defined at the network layer of the node, and the node replacement rate is a caching-influence parameter defined at the physical layer of the node.
7. The node of claim 6, wherein
the interest degree is defined as follows:
assume that the set of all content topics in the network is the content topic set; a specific content c_i has Q topics; the attribute function of content c_i under topic w_k is Pro(c_i, w_k); each user has their own preference for each topic, and user u_j's preference for topic w_k is expressed by the preference function pref(u_j, w_k); the user's interest degree in the content is then expressed as follows:
where Pro(c_i, w_k) is 1 when content c_i includes attribute w_k, and 0 otherwise,
and the preference function of the user is assumed to be expressed by mutual information, where p(X(w_k) | V_j) is the probability of selecting content on topic w_k in the user's history information V_j, and p(X(w_k)) is the probability of content on topic w_k in the whole network;
the node betweenness is defined as follows:
different nodes in a network have different characteristics and importance, and node betweenness describes how important a node is in the network; the larger the betweenness, the more important the node, meaning the node can relay information to more nodes; betweenness is defined by the following formula:
where δ_{s,t} denotes the total number of shortest paths from node s to node t, and δ_{s,t}(u_j) denotes the number of those shortest paths that pass through node u_j;
the node replacement rate is defined as follows:
where C(u_j) denotes the cache capacity of user u_j's device, and S_j(c_i) denotes the amount of class-i content replaced by user u_j per unit time.
8. The node of claim 5, wherein the grey relational analysis method first converts every dimension of all sequences into normalized comparison sequences and then selects a reference sequence from these sequences; next, the grey relational coefficients are computed from the reference sequence and the comparison sequences; finally, the grey relational degree of each comparison sequence is computed from the grey relational coefficients, and the cache decision is made according to these grey relational degrees.
CN201510729060.1A 2015-10-30 2015-10-30 Cross-layer caching method and node for a content-centric network Expired - Fee Related CN105262833B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510729060.1A CN105262833B (en) 2015-10-30 2015-10-30 Cross-layer caching method and node for a content-centric network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510729060.1A CN105262833B (en) 2015-10-30 2015-10-30 Cross-layer caching method and node for a content-centric network

Publications (2)

Publication Number Publication Date
CN105262833A CN105262833A (en) 2016-01-20
CN105262833B true CN105262833B (en) 2018-11-09

Family

ID=55102337

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510729060.1A Expired - Fee Related CN105262833B (en) 2015-10-30 2015-10-30 Cross-layer caching method and node for a content-centric network

Country Status (1)

Country Link
CN (1) CN105262833B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107196981A (en) * 2016-03-14 2017-09-22 华为技术有限公司 Access record retransmission method, equipment and system
CN106254444B (en) * 2016-07-29 2019-12-31 北京智芯微电子科技有限公司 Content caching method and device for content-centric network
CN106254446B (en) * 2016-07-29 2019-07-02 北京智芯微电子科技有限公司 A kind of caching laying method and device based on content center network
CN107070813B (en) * 2017-03-09 2019-11-19 中国科学院声学研究所 A kind of system and method for the content caching based on virtual network interface
CN108595475B (en) * 2018-03-12 2022-03-04 电子科技大学 Cache node selection method in mobile social network
CN109921997B (en) * 2019-01-11 2020-09-01 西安电子科技大学 Network caching method, cache and storage medium for named data
CN112039781B (en) * 2020-09-09 2022-09-09 北京同创神州航天科技有限公司 Named data network forwarding method based on flow control

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103581011A (en) * 2012-07-19 2014-02-12 中兴通讯股份有限公司 Return path implementation method and return path implementation device in content-centric networking

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101978177B1 (en) * 2012-12-07 2019-08-28 삼성전자주식회사 Method of caching contents by node and method of transmitting contents by contents provider in a content centric network

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103581011A (en) * 2012-07-19 2014-02-12 中兴通讯股份有限公司 Return path implementation method and return path implementation device in content-centric networking

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"内容中心网络网内缓存策略研究";崔现东;《中国优秀博士学位论文全文库》;20150415;I139-11 *

Also Published As

Publication number Publication date
CN105262833A (en) 2016-01-20

Similar Documents

Publication Publication Date Title
CN105262833B (en) Cross-layer caching method and node for a content-centric network
CN104753797B (en) A kind of content center network dynamic routing method based on selectivity caching
Zhang et al. A survey of caching mechanisms in information-centric networking
CN104756449B (en) From the method for node and Content owner's transmission packet in content center network
CN106982248B (en) caching method and device for content-centric network
CN105519053B (en) The method and network node that dynamic interest for information centre's network forwards
CN105049254B (en) Data buffer storage replacement method based on content rating and popularity in a kind of NDN/CCN
CN104811493B (en) The virtual machine image storage system and read-write requests processing method of a kind of network aware
KR101942566B1 (en) Method for transmitting and caching information data in secure surveilance network, recordable medium, apparatus for caching information data in secure surveilance network, and secure surveilance network system
Dutta et al. Caching scheme for information‐centric networks with balanced content distribution
CN107682466A (en) The regional information searching method and its device of IP address
CN111107000B (en) Content caching method in named data network based on network coding
CN109905480A (en) Probability cache contents laying method based on content center
CN103905538A (en) Neighbor cooperation cache replacement method in content center network
CN111294394B (en) Self-adaptive caching strategy method based on complex network junction
CN110233901A (en) A kind of content center network caching method and system
CN110417662A (en) A kind of name data network transmission method towards wisdom building
CN105657006A (en) First visit acceleration method and system based on Internet acceleration network
WO2020181820A1 (en) Data cache method and apparatus, computer device and storage medium
Bastos et al. A forwarding strategy based on reinforcement learning for Content-Centric Networking
Lal et al. A popularity based content eviction scheme via betweenness-centrality caching approach for content-centric networking (CCN)
CN108521373B (en) Multipath routing method in named data network
CN108093056A (en) Information centre's wireless network virtualization nodes buffer replacing method
CN108183867A (en) Information centre's network node buffer replacing method
Chand A comparative survey on different caching mechanisms in named data networking (NDN) architecture

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20181109

Termination date: 20201030

CF01 Termination of patent right due to non-payment of annual fee