CN110096458B - Named data network content storage pool data retrieval method based on neural network - Google Patents

Named data network content storage pool data retrieval method based on neural network

Info

Publication number
CN110096458B
CN110096458B
Authority
CN
China
Prior art keywords
node
data
searching
name prefix
skip list
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910260765.1A
Other languages
Chinese (zh)
Other versions
CN110096458A (en)
Inventor
李卓
刘开华
周美丽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin University
Original Assignee
Tianjin University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin University
Priority to CN201910260765.1A
Publication of CN110096458A
Application granted
Publication of CN110096458B
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F12/00Accessing, addressing or allocating within memory systems or architectures
    • G06F12/02Addressing or allocation; Relocation
    • G06F12/08Addressing or allocation; Relocation in hierarchically structured memory systems, e.g. virtual memory systems
    • G06F12/10Address translation
    • G06F12/1009Address translation using page tables, e.g. page table structures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F12/00Accessing, addressing or allocating within memory systems or architectures
    • G06F12/02Addressing or allocation; Relocation
    • G06F12/08Addressing or allocation; Relocation in hierarchically structured memory systems, e.g. virtual memory systems
    • G06F12/12Replacement control
    • G06F12/121Replacement control using replacement algorithms
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/95Retrieval from the web
    • G06F16/951Indexing; Web crawling techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods

Abstract

The invention provides a neural-network-based data retrieval method for the content storage pool of a named data network. The storage pool adopted comprises an on-chip storage unit and an off-chip storage unit. The on-chip storage unit uses a high-speed memory; a neural network model is deployed on the chip to realize uniform mapping of data names, and an improved bitmap is deployed to map data packets containing the same name prefix into the same bucket. The off-chip storage unit uses a low-speed memory and deploys a number of dynamic memories, one per slot of the dynamic index unit of the improved bitmap, to store the skip-list information of each name prefix; this information guides the next lookup of Data packets with the same name prefix in the bidirectional skip list. A bidirectional skip list structure is also deployed. This patent presents a method for retrieving an Interest packet in said storage pool using a sub-name matching algorithm, and a method for retrieving Data packets in the storage pool using an exact name matching algorithm.

Description

Named data network content storage pool data retrieval method based on neural network
Technical Field
The invention belongs to the field of high-performance router architecture design, and specifically addresses the design of a novel storage structure and retrieval algorithms for the Content Store in the forwarding plane of a named data network.
Background
With the explosive growth of the Internet and the continuous emergence of innovative technologies and computing paradigms, the Internet's role is shifting from a communication channel to a data-processing platform. To meet future service requirements of Internet content such as personalization, ultra-high mobility, near-zero latency, and ultra-high traffic density, the named data network, a content-oriented architecture characterized by in-network content caching, has emerged to fundamentally address the many problems brought by the current IP-based Internet architecture.
The named data network realizes a content-oriented communication mode for the Internet by using name data. Caches can be deployed in routing nodes, which shortens the response time for users accessing cached data, enables content sharing in a real sense, greatly reduces network load, and effectively increases the network data transmission rate. It is therefore regarded as one of the most promising directions in the field of future Internet architecture.
However, named data networks also face a series of problems and challenges that need to be addressed [1], in particular the problem of supporting wire-speed processing in the Content Store of the routing data plane [2]. Routing-table entries in a named data network are usually composed of digits and characters and are named by character strings that are variable in length and have no fixed boundaries, so the Content Store must be able to hold data on the scale of millions of entries. In addition, the Content Store serves as a temporary content cache with limited capacity, so it must be able to compress the stored data efficiently to reduce storage consumption, and to replace cached entries with newly inserted data packets in time to free space. Furthermore, packet names are opaque to the transport network: in the forwarding plane, the various applications of a named data network may use different naming schemes according to their own requirements, provided they comply with a uniform naming policy, and the two packet types handled by the Content Store, Interest packets and Data packets, differ in their processing procedures. The Content Store therefore needs to support fast retrieval of name data under various naming schemes with different retrieval algorithms in order to complete content forwarding [2].
References:
[1] L. Zhang et al., "Named Data Networking," ACM SIGCOMM Computer Communication Review, vol. 44, no. 3, pp. 66-73, 2014.
[2] Z. Li, Y. Xu, B. Zhang, L. Yan, and K. Liu, "Packet Forwarding in Named Data Networking: Requirements and Survey of Solutions," IEEE Communications Surveys & Tutorials, DOI: 10.1109/COMST.2018.2880444, 2018.
Disclosure of the Invention
The invention aims to provide a retrieval method built on a novel storage structure, the learning bitmap content storage pool. By exploiting the working characteristics of the Content Store, the invention optimizes the storage structure of the learning bitmap content storage pool so that the retrieval speed is improved while retrieval efficiency is guaranteed, and at the same time supports the data cache replacement strategy as well as retrieval of name data by all-sub-name matching and exact name matching. The technical scheme is as follows:
a named data network content storage pool data retrieval method based on a neural network is disclosed, and the adopted storage pool comprises the following steps: the device comprises an on-chip storage unit and an off-chip storage unit, wherein the on-chip storage unit uses a high-speed memory, a neural network model is deployed on a chip to realize uniform mapping of data names, and an improved bitmap (D-bitmap) is deployed to map data packets containing the same name prefix into the same bucket; the off-chip storage unit uses a low-speed memory, a plurality of dynamic memories corresponding to the slots of the dynamic index unit of the improved bitmap are deployed on the off-chip storage unit, and skip list information of each name prefix is stored and used for guiding next search of Data packets with the same name prefix in a bidirectional skip list, so that the Data retrieval speed is improved; and a bidirectional jump table structure is arranged to store the storage position information of the data packet in the learning bitmap content storage pool, and each bidirectional jump table node is stored with a first-in first-out queue FIFO single pointer and a least recently used LRU double pointer, wherein,
the neural network model to achieve uniform mapping of data names is as follows:
firstly, samples are collected to train the neural network: a large number of Uniform Resource Locators (URLs), whose format is similar to the name data of a named data network, are used as sample data; secondly, the cumulative distribution function value F(x) of the sample data is computed and used as the label; a back-propagation neural network is then trained to learn a model that reflects the distribution of the index data; finally, the name string of a data name is given to the trained model as input to obtain a real value between 0 and 1, and multiplying this value by the total number of slots of the improved bitmap yields the mapping label, thereby realizing uniform mapping of data names.
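A minimal sketch of this mapping step is given below in Python. It is illustrative only; names such as cdf_model (standing in for the trained back-propagation network) and encode_name (a simple order-preserving string encoding) are assumptions introduced here and are not part of the patent.

```python
# Illustrative sketch of the learned-CDF mapping; cdf_model is assumed to be a
# callable that returns an approximation of F(x) in [0, 1].

def encode_name(name_prefix: str) -> float:
    # Assumed encoding: interpret the leading bytes of the string as a
    # fixed-point fraction so that lexicographic order is preserved.
    data = name_prefix.encode("utf-8")[:8].ljust(8, b"\x00")
    return int.from_bytes(data, "big") / float(1 << 64)

def name_to_mapping_label(name_prefix: str, cdf_model, total_slots: int) -> int:
    """Map a name prefix to a mapping label in [0, total_slots - 1]."""
    x = encode_name(name_prefix)    # numeric form of the name string
    y = cdf_model(x)                # learned approximation of F(x), in [0, 1]
    return min(int(y * total_slots), total_slots - 1)
```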
The data structure design of the bidirectional skip list is as follows:
the bidirectional skip list adopts a multilayer structure, each layer is composed of a bidirectional linked list, and skip list nodes are arranged in an increasing order according to the ID number of the name prefix. The skip list nodes are connected by FIFO single pointer and LRU double pointer, and each node stores ID, pointer (prev) pointing to forward node and pointer (next) information pointing to backward node.
The dynamic memory data structure is designed as follows:
the contents recorded in the dynamic memory are: name prefix, forward node (prev _ nodes), forward node address (next _ node _ addr), backward node (next _ nodes), backward node address (next _ node _ addr), and current node (current _ node). Wherein the name prefix is extracted from the packet name < name prefix, ID >; the forward node and the backward node are key node pairs which are turned when a certain node is searched, and the forward node address and the backward node address are node addresses; the current node is the node that has just recently been looked up.
The process of selecting the optimal bidirectional skip-list starting node in the storage pool and determining the search direction is as follows:
The optimal bidirectional skip-list starting node is selected and the search direction determined using the node information in the dynamic memory, namely the name prefix, forward nodes, forward node addresses, backward nodes, backward node addresses, and current node:
for data j with the same name prefix, the ID of j is first compared with the ID of the current node; if it is smaller than the current node's ID, the turn-back node closest to data j by ID number is selected from the forward nodes as the optimal starting node; if it is larger than the current node's ID, the turn-back node closest to data j by ID number is selected from the backward nodes as the optimal starting node. If the ID of the selected optimal starting node is smaller than the ID of j, the search proceeds backwards from the optimal starting node; otherwise it proceeds forwards from the optimal starting node. The recorded node information is updated in time during the search;
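This selection rule can be sketched as follows, building on the DynamicMemoryRecord sketch above. Here "forward" means moving toward smaller IDs via prev pointers and "backward" toward larger IDs via next pointers; a NIL entry is assumed to be stored as a very large sentinel ID.

```python
def choose_start(record: "DynamicMemoryRecord", target_id: int):
    """Return (start_id, start_addr, direction) for looking up target_id (sketch)."""
    if target_id < record.current_node:
        candidates = list(zip(record.prev_nodes, record.prev_node_addr))
    else:
        candidates = list(zip(record.next_nodes, record.next_node_addr))
    # Pick the recorded turn-back node whose ID is closest to the target ID.
    start_id, start_addr = min(candidates, key=lambda c: abs(c[0] - target_id))
    # If the chosen node's ID is smaller than the target, search backwards
    # (toward larger IDs); otherwise search forwards (toward smaller IDs).
    direction = "backward" if start_id < target_id else "forward"
    return start_id, start_addr, direction
```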
An Interest packet is retrieved in the storage pool using a sub-name matching algorithm; one Interest packet lookup comprises the following steps:
Step 1: input the name prefix and ID of the Interest packet into the learning bitmap content storage pool storage structure.
Step 2: match all sub-names: all sub-names containing the name prefix are matched in the learning bitmap, and the best-matching name prefix is selected.
Step 3: calculate the mapping label: the best-matching name prefix is passed through the neural network to obtain an index mapping value between 0 and 1, and this value is multiplied by the total number of slots of the improved bitmap to obtain the mapping label of the Interest packet on the improved bitmap.
Step 4: calculate the base address and offset address: dividing the mapping label by the number of slots per bucket gives the bucket number of the name prefix, which is the base address; the order in which the name prefix entered the bucket is its offset address.
Step 5: determine whether the best-matching name prefix exists in the improved bitmap: if the value in the slot pointed to by the offset address is 0, the best-matching name prefix does not exist in the improved bitmap and Step 10 is executed; otherwise it exists and Step 6 is executed (a code sketch illustrating Steps 3-5 is given after this list).
Step 6: access the dynamic memory: the dynamic memory is accessed using the base address and offset address obtained after the best-matching name prefix is mapped by the learning bitmap.
Step 7: look up the Data packet in the bidirectional skip list, that is, select the optimal bidirectional skip-list starting node, determine the search direction, and search for the Data packet in the bidirectional skip list.
7-1: input the name prefix and ID of data x: the name prefix and ID number of data x are input to the bidirectional skip list; the ID of data x is denoted id1, and it is assumed that id1 is being looked up in the bidirectional skip list corresponding to this name prefix for the first time.
7-2: search from the head node: following the lookup procedure of a conventional skip list, search backwards from the highest level of the head node of the bidirectional skip list, record the node information of the search path in the dynamic memory, and continue to Step 8.
7-3: input the name prefix and ID of data y: the name prefix and ID number of data y, which has the same prefix as data x, are input to the bidirectional skip list, and the ID number of data y is extracted and denoted id2.
7-4: select the best starting node and search direction: if id1 is larger than id2, the node whose ID differs least from id2 is selected as the best node from the forward turn-back nodes recorded in the dynamic memory; otherwise it is selected from the backward turn-back nodes.
7-5: search the bidirectional skip list: search from the best node in the determined direction, and update the turn-back node information recorded in the dynamic memory.
Step 8: determine whether the best-matching Data packet exists in the bidirectional skip list: look up the Data packet corresponding to the ID number in the bidirectional skip list; if a node with the same ID is found, the best-matching Data packet exists and Step 9 is executed, otherwise Step 10 is executed.
Step 9: output the best-matching Data packet: the learning bitmap content storage pool outputs the best-matching Data packet, and the process proceeds to Step 11.
Step 10: report that there is no best-matching Data packet: the learning bitmap content storage pool reports that no best-matching Data packet exists, and the process proceeds to Step 11.
Step 11: the retrieval of the Interest packet in the learning bitmap content storage pool structure ends.
A Data packet is retrieved in the storage pool using an exact name matching algorithm; one Data packet lookup comprises the following steps:
Step 1: input the name prefix and ID of the Data packet into the learning bitmap content storage pool storage structure.
Step 2: match the exact name: the name prefix is mapped directly in the learning bitmap.
Step 3: calculate the mapping label: the name prefix is passed through the neural network to obtain an index mapping value between 0 and 1, and this value is multiplied by the total number of slots of the improved bitmap to obtain the mapping label of the Data packet on the improved bitmap.
Step 4: calculate the base address and offset address: dividing the mapping label by the number of slots per bucket gives the bucket number of the name prefix, which is the base address; the order in which the name prefix entered the bucket is its offset address.
Step 5: determine whether the name prefix exists in the improved bitmap: if the value in the slot pointed to by the offset address is 0, the name prefix does not exist in the improved bitmap and Step 10 is executed; otherwise it exists and Step 6 is executed.
Step 6: access the dynamic memory: the dynamic memory is accessed using the base address and offset address obtained after the name prefix is mapped by the learning bitmap.
Step 7: look up the Data packet in the bidirectional skip list, that is, select the optimal bidirectional skip-list starting node, determine the search direction, and search for the Data packet in the bidirectional skip list.
7-1: input the name prefix and ID of data x: the name prefix and ID number of data x are input to the bidirectional skip list; the ID of data x is denoted id1, and it is assumed that id1 is being looked up in the bidirectional skip list corresponding to this name prefix for the first time.
7-2: search from the head node: following the lookup procedure of a conventional skip list, search backwards from the highest level of the head node of the bidirectional skip list, record the node information of the search path in the dynamic memory, and continue to Step 8.
7-3: input the name prefix and ID of data y: the name prefix and ID number of data y, which has the same prefix as data x, are input to the bidirectional skip list, and the ID number of data y is extracted and denoted id2.
7-4: select the best starting node and search direction: if id1 is larger than id2, the node whose ID differs least from id2 is selected as the best node from the forward turn-back nodes recorded in the dynamic memory; otherwise it is selected from the backward turn-back nodes.
7-5: search the bidirectional skip list: search from the best node in the determined direction, and update the turn-back node information recorded in the dynamic memory.
Step 8: determine whether the Data packet exists in the bidirectional skip list: look up the Data packet corresponding to the ID number in the bidirectional skip list; if a node with the same ID is found, the Data packet exists and Step 9 is executed, otherwise Step 10 is executed.
Step 9: output the Data packet: the learning bitmap content storage pool outputs the Data packet, and the process proceeds to Step 11.
Step 10: report that the Data packet does not exist: the learning bitmap content storage pool reports that the Data packet does not exist, and the process proceeds to Step 11.
Step 11: the retrieval of the Data packet in the learning bitmap content storage pool structure ends.
Name data is looked up bidirectionally in the storage pool; one name-data lookup comprises the following steps:
Step 1: input the name prefix and ID of data x: the name prefix and ID number of data x are input to the bidirectional skip list; the ID of data x is denoted id1, and it is assumed that id1 is being looked up in the bidirectional skip list corresponding to this name prefix for the first time.
Step 2: search from the head node: following the lookup procedure of a conventional skip list, search from the highest level of the head node of the bidirectional skip list until the node id1 is found, and return the actual storage address of data x.
Step 3: record the turn-back node information: the information of all forward and backward turn-back nodes on each level of the skip list visited while searching for id1 is recorded in the dynamic memory of the name prefix.
Step 4: input the name prefix and ID of data y: the name prefix and ID number of data y, which has the same prefix as data x, are input to the bidirectional skip list, and the ID number of data y is extracted and denoted id2.
Step 5: determine whether the IDs are equal: if id1 equals id2, data y is the same as data x and Step 10 is executed; otherwise Step 6 is executed.
Step 6: select the best starting node and search direction: if id1 is larger than id2, the node whose ID differs least from id2 is selected as the best node from the forward turn-back nodes recorded in the dynamic memory; otherwise it is selected from the backward turn-back nodes.
Step 7: search the bidirectional skip list: search from the best node in the determined direction, and update the turn-back node information recorded in the dynamic memory.
Step 8: determine whether the Data packet exists: if a node with the same ID exists in the bidirectional skip list, the Data packet exists and Step 9 is executed; otherwise the Data packet does not exist and Step 10 is executed.
Step 9: output the actual storage address: the storage address of the actual content of the Data packet is returned, and the process proceeds to Step 11.
Step 10: report that the Data packet does not exist: the absence of the Data packet is output, and the process proceeds to Step 11.
Step 11: the lookup of the data name in the bidirectional skip list structure ends.
Drawings
Fig. 1 is a block diagram of the novel storage structure, the learning bitmap content storage pool system, according to the present invention.
Fig. 2 is a flowchart illustrating the operation of the learning bitmap content storage pool structure for retrieving an Interest packet according to the present invention.
FIG. 3 is a flowchart of the operation of learning the bitmap content storage pool storage structure for Data packet retrieval according to the present invention.
FIG. 4 is a flowchart of the operation of inserting Data packets into the learning bitmap content storage pool storage structure according to the present invention.
FIG. 5 is a flowchart of the operation of the learning bitmap content storage pool storage structure for Data packet deletion according to the present invention.
FIG. 6 is a block diagram illustrating the operation flow of looking up a data name in a bi-directional skip list according to the present invention.
Fig. 7 is a diagram of a bi-directional skip list data structure in a learning bitmap content storage pool storage structure according to the present invention.
Fig. 8 is a diagram of a bidirectional skip list node in the learning bitmap content storage pool storage structure according to the present invention.
FIG. 9 is a schematic diagram of a dynamic memory structure in a learning bitmap content memory pool storage structure.
Detailed Description
In the present invention, a novel storage structure, the learning bitmap content storage pool, is designed for the forwarding plane of a named data network. As shown in Fig. 1, the structure comprises an on-chip storage unit and an off-chip storage unit. The on-chip storage unit is equipped with a high-speed memory; a neural network model (NN Model) is deployed in the chip to realize uniform mapping of data names, and an improved bitmap (Dynamic-bitmap, D-bitmap) is deployed to map data packets containing the same name prefix to the same bucket. The off-chip storage unit uses a low-speed memory on which a number of dynamic memories (Packet Stores) corresponding to the slots of the dynamic index unit of the improved bitmap are deployed to store the skip-list information of each name prefix. A bidirectional skip list structure is also deployed to store the storage location information of data packets in the Content Store, and each bidirectional skip-list node holds a first-in-first-out (FIFO) single pointer and least-recently-used (LRU) double pointers. Through the efficient cooperation of these data structures, the designed learning bitmap content storage pool supports data retrieval, insertion, and deletion operations.
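A compact sketch of how these components fit together is given below; the class and attribute names are assumptions introduced for illustration and do not appear in the patent.

```python
class LearningBitmapContentStore:
    """On-chip NN model + D-bitmap, off-chip Packet Stores + bidirectional skip lists (sketch)."""

    def __init__(self, cdf_model, total_slots: int, slots_per_bucket: int):
        self.cdf_model = cdf_model               # on-chip neural network model
        self.total_slots = total_slots
        self.slots_per_bucket = slots_per_bucket
        self.bitmap = [0] * total_slots          # on-chip improved bitmap (D-bitmap)
        self.packet_stores = {}                  # off-chip: slot index -> DynamicMemoryRecord
        self.skiplists = {}                      # off-chip: name prefix -> bidirectional skip list
```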
4. An Interest packet is retrieved using a sub-name matching algorithm in the designed storage structure of the named data network forwarding plane learning bitmap content storage pool; one Interest packet lookup comprises the following steps:
Step 1: input the name prefix and ID of the Interest packet into the learning bitmap content storage pool storage structure.
Step 2: match all sub-names: all sub-names containing the name prefix are matched in the learning bitmap, and the best-matching name prefix is selected.
Step 3: calculate the mapping label: the best-matching name prefix is passed through the neural network to obtain an index mapping value between 0 and 1, and this value is multiplied by the total number of slots of the improved bitmap to obtain the mapping label of the Interest packet on the improved bitmap.
Step 4: calculate the base address and offset address: dividing the mapping label by the number of slots per bucket gives the bucket number of the name prefix, which is the base address; the order in which the name prefix entered the bucket is its offset address.
Step 5: determine whether the best-matching name prefix exists in the improved bitmap: if the value in the slot pointed to by the offset address is 0, the best-matching name prefix does not exist in the improved bitmap and Step 10 is executed; otherwise it exists and Step 6 is executed.
Step 6: access the dynamic memory: the dynamic memory is accessed using the base address and offset address obtained after the best-matching name prefix is mapped by the learning bitmap.
Step 7: look up the Data packet in the bidirectional skip list, that is, select the optimal bidirectional skip-list starting node, determine the search direction, and search for the Data packet in the bidirectional skip list.
7-1: input the name prefix and ID of data x: the name prefix and ID number of data x are input to the bidirectional skip list; the ID of data x is denoted id1, and it is assumed that id1 is being looked up in the bidirectional skip list corresponding to this name prefix for the first time.
7-2: search from the head node: following the lookup procedure of a conventional skip list, search backwards from the highest level of the head node of the bidirectional skip list, record the node information of the search path in the dynamic memory, and continue to Step 8.
7-3: input the name prefix and ID of data y: the name prefix and ID number of data y, which has the same prefix as data x, are input to the bidirectional skip list, and the ID number of data y is extracted and denoted id2; it is assumed that this is not the first lookup in the bidirectional skip list corresponding to this name prefix.
7-4: select the best starting node and search direction: if id1 is larger than id2, the node whose ID differs least from id2 is selected as the best node from the forward turn-back nodes recorded in the dynamic memory; otherwise it is selected from the backward turn-back nodes.
7-5: search the bidirectional skip list: search from the best node in the determined direction, and update the turn-back node information recorded in the dynamic memory.
Step 8: determine whether the best-matching Data packet exists in the bidirectional skip list: look up the Data packet corresponding to the ID number in the bidirectional skip list; if a node with the same ID is found, the best-matching Data packet exists and Step 9 is executed, otherwise Step 10 is executed.
Step 9: output the best-matching Data packet: the learning bitmap content storage pool outputs the best-matching Data packet, and the process proceeds to Step 11.
Step 10: report that there is no best-matching Data packet: the learning bitmap content storage pool reports that no best-matching Data packet exists, and the process proceeds to Step 11.
Step 11: the retrieval of the Interest packet in the learning bitmap content storage pool structure ends.
5. A Data packet is retrieved using an exact name matching algorithm in the designed storage structure of the named data network forwarding plane learning bitmap content storage pool; one Data packet lookup comprises the following steps:
Step 1: input the name prefix and ID of the Data packet into the learning bitmap content storage pool storage structure.
Step 2: match the exact name: the name prefix is mapped directly in the learning bitmap.
Step 3: calculate the mapping label: the name prefix is passed through the neural network to obtain an index mapping value between 0 and 1, and this value is multiplied by the total number of slots of the improved bitmap to obtain the mapping label of the Data packet on the improved bitmap.
Step 4: calculate the base address and offset address: dividing the mapping label by the number of slots per bucket gives the bucket number of the name prefix, which is the base address; the order in which the name prefix entered the bucket is its offset address.
Step 5: determine whether the name prefix exists in the improved bitmap: if the value in the slot pointed to by the offset address is 0, the name prefix does not exist in the improved bitmap and Step 10 is executed; otherwise it exists and Step 6 is executed.
Step 6: access the dynamic memory: the dynamic memory is accessed using the base address and offset address obtained after the name prefix is mapped by the learning bitmap.
Step 7: look up the Data packet in the bidirectional skip list, that is, select the optimal bidirectional skip-list starting node, determine the search direction, and search for the Data packet in the bidirectional skip list.
7-1: input the name prefix and ID of data x: the name prefix and ID number of data x are input to the bidirectional skip list; the ID of data x is denoted id1, and it is assumed that id1 is being looked up in the bidirectional skip list corresponding to this name prefix for the first time.
7-2: search from the head node: following the lookup procedure of a conventional skip list, search backwards from the highest level of the head node of the bidirectional skip list, record the node information of the search path in the dynamic memory, and continue to Step 8.
7-3: input the name prefix and ID of data y: the name prefix and ID number of data y, which has the same prefix as data x, are input to the bidirectional skip list, and the ID number of data y is extracted and denoted id2; it is assumed that this is not the first lookup in the bidirectional skip list corresponding to this name prefix.
7-4: select the best starting node and search direction: if id1 is larger than id2, the node whose ID differs least from id2 is selected as the best node from the forward turn-back nodes recorded in the dynamic memory; otherwise it is selected from the backward turn-back nodes.
7-5: search the bidirectional skip list: search from the best node in the determined direction, and update the turn-back node information recorded in the dynamic memory.
Step 8: determine whether the Data packet exists in the bidirectional skip list: look up the Data packet corresponding to the ID number in the bidirectional skip list; if a node with the same ID is found, the Data packet exists and Step 9 is executed, otherwise Step 10 is executed.
Step 9: output the Data packet: the learning bitmap content storage pool outputs the Data packet, and the process proceeds to Step 11.
Step 10: report that the Data packet does not exist: the learning bitmap content storage pool reports that the Data packet does not exist, and the process proceeds to Step 11.
Step 11: the retrieval of the Data packet in the learning bitmap content storage pool structure ends.
6. Data packets are inserted into the designed storage structure of the named data network forwarding plane learning bitmap content storage pool; the insertion of one Data packet comprises the following steps:
Step 1: input the name prefix and ID of the Data packet into the learning bitmap content storage pool storage structure.
Step 2: match the exact name: the name prefix is mapped directly in the learning bitmap.
Step 3: calculate the mapping label: the name prefix is passed through the neural network to obtain an index mapping value between 0 and 1, and this value is multiplied by the total number of slots of the improved bitmap to obtain the mapping label of the Data packet on the improved bitmap.
Step 4: calculate the base address and offset address: dividing the mapping label by the number of slots per bucket gives the bucket number of the name prefix, which is the base address; the order in which the name prefix entered the bucket is its offset address.
Step 5: determine whether the name prefix exists in the improved bitmap: if the value in the slot pointed to by the offset address is not 0, the name prefix exists in the improved bitmap and Step 6 is executed; otherwise it does not exist and Step 10 is executed.
Step 6: access the dynamic memory: the dynamic memory is accessed using the base address and offset address obtained after the name prefix is mapped by the learning bitmap.
Step 7: look up the Data packet in the bidirectional skip list, that is, select the optimal bidirectional skip-list starting node, determine the search direction, and search for the Data packet in the bidirectional skip list.
7-1: input the name prefix and ID of data x: the name prefix and ID number of data x are input to the bidirectional skip list; the ID of data x is denoted id1, and it is assumed that id1 is being looked up in the bidirectional skip list corresponding to this name prefix for the first time.
7-2: search from the head node: following the lookup procedure of a conventional skip list, search backwards from the highest level of the head node of the bidirectional skip list, record the node information of the search path in the dynamic memory, and continue to Step 8.
7-3: input the name prefix and ID of data y: the name prefix and ID number of data y, which has the same prefix as data x, are input to the bidirectional skip list, and the ID number of data y is extracted and denoted id2; it is assumed that this is not the first lookup in the bidirectional skip list corresponding to this name prefix.
7-4: select the best starting node and search direction: if id1 is larger than id2, the node whose ID differs least from id2 is selected as the best node from the forward turn-back nodes recorded in the dynamic memory; otherwise it is selected from the backward turn-back nodes.
7-5: search the bidirectional skip list: search from the best node in the determined direction, and update the turn-back node information recorded in the dynamic memory.
Step 8: determine whether the Data packet already exists in the bidirectional skip list: look up the Data packet corresponding to the ID number in the bidirectional skip list; if no node with the same ID is found, the Data packet does not exist and Step 9 is executed, otherwise Step 11 is executed.
Step 9: insert into the bidirectional skip list: the Data packet is inserted into the bidirectional skip list in ascending order of ID number, the name prefix is recorded in the dynamic memory, and Step 11 is executed.
Step 10: create a bidirectional skip list: a bidirectional skip list is created for the name prefix, the data is inserted into the skip list, the name prefix is recorded in the dynamic memory, and Step 11 is executed.
Step 11: the insertion of the Data packet in the learning bitmap content storage pool structure ends.
7. Data packets are deleted from the designed storage structure of the named data network forwarding plane learning bitmap content storage pool; the deletion of one Data packet comprises the following steps:
Step 1: input the name prefix and ID of the Data packet into the learning bitmap content storage pool storage structure.
Step 2: match the exact name: the name prefix is mapped directly in the learning bitmap.
Step 3: calculate the mapping label: the name prefix is passed through the neural network to obtain an index mapping value between 0 and 1, and this value is multiplied by the total number of slots of the improved bitmap to obtain the mapping label of the Data packet on the improved bitmap.
Step 4: calculate the base address and offset address: dividing the mapping label by the number of slots per bucket gives the bucket number of the name prefix, which is the base address; the order in which the name prefix entered the bucket is its offset address.
Step 5: determine whether the name prefix exists in the improved bitmap: if the value in the slot pointed to by the offset address is 0, the name prefix does not exist in the improved bitmap and Step 10 is executed; otherwise it exists and Step 6 is executed.
Step 6: access the dynamic memory: the dynamic memory is accessed using the base address and offset address obtained after the name prefix is mapped by the learning bitmap.
Step 7: look up the Data packet in the bidirectional skip list, that is, select the optimal bidirectional skip-list starting node, determine the search direction, and search for the Data packet in the bidirectional skip list.
7-1: input the name prefix and ID of data x: the name prefix and ID number of data x are input to the bidirectional skip list; the ID of data x is denoted id1, and it is assumed that id1 is being looked up in the bidirectional skip list corresponding to this name prefix for the first time.
7-2: search from the head node: following the lookup procedure of a conventional skip list, search backwards from the highest level of the head node of the bidirectional skip list, record the node information of the search path in the dynamic memory, and continue to Step 8.
7-3: input the name prefix and ID of data y: the name prefix and ID number of data y, which has the same prefix as data x, are input to the bidirectional skip list, and the ID number of data y is extracted and denoted id2.
7-4: select the best starting node and search direction: if id1 is larger than id2, the node whose ID differs least from id2 is selected as the best node from the forward turn-back nodes recorded in the dynamic memory; otherwise it is selected from the backward turn-back nodes.
7-5: search the bidirectional skip list: search from the best node in the determined direction, and update the turn-back node information recorded in the dynamic memory.
Step 8: determine whether the Data packet exists in the bidirectional skip list: look up the Data packet corresponding to the ID number in the bidirectional skip list; if no node with the same ID is found, the Data packet does not exist and Step 10 is executed, otherwise Step 9 is executed.
Step 9: delete the Data packet node, and continue with Step 10.
Step 10: the deletion of the Data packet in the learning bitmap content storage pool structure ends.
8. Name data is looked up bidirectionally in the designed storage structure of the named data network forwarding plane learning bitmap content storage pool; one name-data lookup comprises the following steps:
Step 1: input the name prefix and ID of data x: the name prefix and ID number of data x are input to the bidirectional skip list; the ID of data x is denoted id1, and it is assumed that id1 is being looked up in the bidirectional skip list corresponding to this name prefix for the first time.
Step 2: search from the head node: following the lookup procedure of a conventional skip list, search from the highest level of the head node of the bidirectional skip list until the node id1 is found, and return the actual storage address of data x.
Step 3: record the turn-back node information: the information of all forward and backward turn-back nodes on each level of the skip list visited while searching for id1 is recorded in the dynamic memory of the name prefix.
Step 4: input the name prefix and ID of data y: the name prefix and ID number of data y, which has the same prefix as data x, are input to the bidirectional skip list, and the ID number of data y is extracted and denoted id2.
Step 5: determine whether the IDs are equal: if id1 equals id2, data y is the same as data x and Step 10 is executed; otherwise Step 6 is executed.
Step 6: select the best starting node and search direction: if id1 is larger than id2, the node whose ID differs least from id2 is selected as the best node from the forward turn-back nodes recorded in the dynamic memory; otherwise it is selected from the backward turn-back nodes.
Step 7: search the bidirectional skip list: search from the best node in the determined direction, and update the turn-back node information recorded in the dynamic memory.
Step 8: determine whether the Data packet exists: if a node with the same ID exists in the bidirectional skip list, the Data packet exists and Step 9 is executed; otherwise the Data packet does not exist and Step 10 is executed.
Step 9: output the actual storage address: the storage address of the actual content of the Data packet is returned, and the process proceeds to Step 11.
Step 10: report that the Data packet does not exist: the absence of the Data packet is output, and the process proceeds to Step 11.
Step 11: the lookup of the data name in the bidirectional skip list structure ends.
The invention realizes fast, compact processing of data. The key is that, in the Content Store, the mapping capability of a neural network is applied to improve storage efficiency and to support different name-data retrieval algorithms, and that a dedicated bidirectional skip-list data structure supporting a fast cache replacement strategy and a dynamic memory data structure able to guide data retrieval are designed. The specific design and implementation are as follows:
(1) Bidirectional skip-list data structure design for rapidly supporting the cache replacement strategy
The bidirectional skip-list data structure designed in the invention is an improvement on the conventional skip list, so it is similar to a conventional skip list: it also adopts a multilayer structure, each layer is a doubly linked list, and skip-list nodes are arranged in ascending order of the ID numbers under the name prefix; the specific structure is shown in Fig. 7. Skip-list nodes are additionally connected by a single FIFO pointer and double LRU pointers; each node has an ID, a pointer to its forward node (prev), a pointer to its backward node (next), and a node address. Taking the node with ID 6 in the bidirectional skip list shown in Fig. 7 as an example, the node structure is shown in Fig. 8.
(2) Dynamic memory data structure design for guiding data retrieval
The dynamic memory structure records the forward and backward turn-back node information produced when a node is looked up in the bidirectional skip list, in order to guide the next lookup under the same data name prefix. The contents recorded in the dynamic memory are: the name prefix, the forward nodes (prev_nodes), the forward node addresses (prev_node_addr), the backward nodes (next_nodes), the backward node addresses (next_node_addr), and the current node (current_node). The name prefix is extracted from the packet name <name prefix, ID>; the forward and backward nodes are the pairs of turn-back nodes encountered when a given node was looked up, and the forward and backward node addresses are their node addresses; the current node is the node most recently looked up. The structure is shown schematically in Fig. 9. Suppose the node being looked up is h; if, on some level of the bidirectional skip list, x_i and x_j are the adjacent nodes with x_i ≤ h ≤ x_j between which the search turns back, then the forward node records x_i, the backward node records x_j, the forward node address and backward node address record the addresses of x_i and x_j respectively, and the current node is h. The procedure is illustrated by the first arrival of the data name /a/B/C/7 at the bidirectional skip-list data structure shown in Fig. 7.
The extracted ID is 7 and the search path is head[4]-6[4]-NIL[4]-6[3]-25[3]-6[2]-9[2]-6[1]-7. At this point the name prefix recorded in the dynamic memory is /a/B/C, the backward nodes are <NIL,25,9,7> with corresponding backward node addresses <0xB,0x9,0x4,0x3>, the forward nodes are <6,6,6,6> with corresponding forward node addresses <0x2,0x2,0x2,0x2>, and the current node is 7.
(3) The data name prefix is efficiently retrieved through a neural network model to obtain a mapping label
Efficient retrieval of data name prefixes through the neural network model hinges on training and learning from samples to construct a model that meets the requirements. First, samples are collected to train the neural network: a large number of Uniform Resource Locators (URLs), whose format is similar to the name data of a named data network, are used as sample data. Using this sample data and a back-propagation neural network, a cumulative distribution function reflecting the distribution of the index data is learned. By the probability integral transform, if x follows an arbitrary distribution and is used as the input of its cumulative distribution function F(x), then the transformed value y = F(x) necessarily follows the uniform distribution U(0, 1). Therefore, taking the name-prefix string of a sample as a numeric value and feeding it to the cumulative distribution function yields a mapping value between 0 and 1, and multiplying this value by the total number of slots of the improved bitmap gives a mapping label that realizes uniform mapping.
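In formula form, writing N for the total number of slots of the improved bitmap (a symbol introduced here only for illustration), the mapping can be summarized as:

```latex
y = F(x) \sim U(0,1), \qquad \mathrm{label}(x) = \left\lfloor F(x) \cdot N \right\rfloor
```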
(4) Fast bidirectional lookup in the bidirectional skip list
The lookup of data with the same name prefix is guided by the node information in the dynamic memory, namely the name prefix, forward nodes, forward node addresses, backward nodes, backward node addresses, and current node, as follows: for data j with the same name prefix, the ID of j is first compared with the ID of the current node; if it is smaller than the current node's ID, the turn-back node closest to data j by ID number is selected from the forward nodes as the optimal starting node; if it is larger than the current node's ID, the turn-back node closest to data j by ID number is selected from the backward nodes as the optimal starting node. If the ID of the selected optimal starting node is smaller than the ID of j, the search proceeds backwards; otherwise it proceeds forwards from the optimal node. The recorded node information is updated in time during the search. The procedure is illustrated by the data names /a/B/19, /a/B/26, and /a/B/21 arriving at the bidirectional skip-list structure shown in Fig. 7, with the recorded information in the dynamic memory as shown in Fig. 9.
When looking up /a/B/19, the name ID is extracted as 19. Since 19 > 7 (the current node), the optimal node and level are chosen from the backward nodes. Since 9 < 19 < 25 and (19-9) > (25-19), the lookup starts from 25[3], i.e. level 3; and since 19 < 25, the search proceeds forward. The path is: 25[3]-6[3]-25[2]-17[2]-25[1]-21[1]-19. The node information in the dynamic memory is updated as follows: the backward nodes are <NIL,25,25,21> with corresponding backward node addresses <0xB,0x9,0x9,0x8>, the forward nodes are <6,6,17,19> with corresponding forward node addresses <0x2,0x2,0x6,0x7>, and the current node is 19.
When looking up /a/B/26, the name ID is extracted as 26. Since 26 > 19 (the current node), the optimal node and level are chosen from the backward nodes. Since 25 < 26 < NIL and (26-25) < (NIL-26), the lookup starts from 25[3], i.e. level 3; and since 26 > 25, the search proceeds backward. The path is: 25[3]-NIL[3]-25[2]-NIL[2]-25[1]-26. The node information in the dynamic memory is updated as follows: the backward nodes are <NIL,NIL,NIL,26> with corresponding backward node addresses <0xB,0xB,0xB,0xA>, the forward nodes are <6,25,25,25> with corresponding forward node addresses <0x2,0x9,0x9,0x9>, and the current node is 26.
When looking up /a/B/21, the name ID is extracted as 21. Since 26 > 21 (the current node is 26), the best node and level are chosen from the forward nodes. Since 6 < 21 < 25 and (25-21) < (21-6), the lookup starts from 25[3], i.e. level 3; and since 25 > 21, the search proceeds forward. The path is: 25[3]-6[3]-25[2]-17[2]-25[1]-21. The node information in the dynamic memory is updated as follows: the backward nodes are <NIL,25,25,25> with corresponding backward node addresses <0xB,0x9,0x9,0x9>, the forward nodes are <6,6,17,21> with corresponding forward node addresses <0x2,0x2,0x6,0x8>, and the current node is 21.

Claims (1)

1. A named data network content storage pool data retrieval method based on a neural network, wherein the storage pool adopted comprises an on-chip storage unit and an off-chip storage unit; the on-chip storage unit uses a high-speed memory, a neural network model is deployed on the chip to realize uniform mapping of data names, and an improved bitmap (D-bitmap) is deployed to map data packets containing the same name prefix into the same bucket; the off-chip storage unit uses a low-speed memory, on which a number of dynamic memories corresponding to the slots of the dynamic index unit of the improved bitmap are deployed to store the skip-list information of each name prefix, which guides the next lookup of Data packets with the same name prefix in the bidirectional skip list and thereby increases the data retrieval speed; and a bidirectional skip list structure is deployed to store the storage location information of data packets in the learning bitmap content storage pool, each bidirectional skip-list node holding a first-in-first-out (FIFO) single pointer and least-recently-used (LRU) double pointers; wherein,
the neural network model to achieve uniform mapping of data names is as follows:
firstly, samples are collected to train the neural network: a large number of Uniform Resource Locators (URLs), whose format is similar to the name data of a named data network, are used as sample data; secondly, the cumulative distribution function value F(x) of the sample data is computed and used as the label; a back-propagation neural network is then trained to learn a model that reflects the distribution of the index data; finally, the name string of a data name is given to the trained model as input to obtain a real value between 0 and 1, and multiplying this value by the total number of slots of the improved bitmap yields the mapping label, thereby realizing uniform mapping of data names;
the data structure design of the bidirectional skip list is as follows:
the bidirectional skip list adopts a multilayer structure; each layer is a doubly linked list, and skip-list nodes are arranged in ascending order of the ID number under the name prefix; skip-list nodes are connected by a FIFO single pointer and LRU double pointers, and each node stores its ID, a pointer to its forward node (prev), and a pointer to its backward node (next);
the dynamic memory data structure is designed as follows:
the contents recorded in the dynamic memory are: the name prefix, the forward nodes (prev_nodes), the forward node addresses (prev_node_addr), the backward nodes (next_nodes), the backward node addresses (next_node_addr), and the current node (current_node); the name prefix is extracted from the packet name <name prefix, ID>; the forward and backward nodes are the pairs of turn-back nodes encountered when a given node was looked up, and the forward and backward node addresses are their node addresses; the current node is the node most recently looked up;
the process of selecting the optimal bidirectional skip-list starting node in the storage pool and determining the search direction is as follows:
the optimal bidirectional skip-list starting node is selected and the search direction determined using the node information in the dynamic memory, namely the name prefix, forward nodes, forward node addresses, backward nodes, backward node addresses, and current node:
for data j with the same name prefix, the ID of j is first compared with the ID of the current node; if it is smaller than the current node's ID, the turn-back node closest to data j by ID number is selected from the forward nodes as the optimal starting node; if it is larger than the current node's ID, the turn-back node closest to data j by ID number is selected from the backward nodes as the optimal starting node; if the ID of the selected optimal starting node is smaller than the ID of j, the search proceeds backwards from the optimal starting node, otherwise it proceeds forwards from the optimal starting node, and the recorded node information is updated in time during the search;
an Interest packet is retrieved in the storage pool using a sub-name matching algorithm, and one Interest packet lookup comprises the following steps (a condensed code sketch follows step 11):
step 1: input the name prefix and ID of the Interest packet, that is, input the name prefix and ID of the Interest packet into the learning bitmap content storage pool storage structure;
step 2: match all sub-names: match all sub-names of the name prefix in the learning bitmap and select the best-matching name prefix;
step 3: calculate the mapping label: the best-matching name prefix is passed through the neural network to obtain an index mapping value between 0 and 1, and this value is multiplied by the total number of slots of the improved bitmap to obtain the mapping label of the Interest packet on the improved bitmap;
step 4: calculate the base address and offset address: divide the mapping label by the number of slots per bucket to obtain the bucket number of the name prefix, i.e. the base address; the sequence number with which the name prefix entered the bucket is its offset address;
step 5: determine whether the best-matching name prefix exists in the improved bitmap: if the value in the slot pointed to by the offset address is 0, the best-matching name prefix does not exist in the improved bitmap and step 10 is executed; otherwise the best-matching name prefix exists and step 6 is executed;
step 6: access the dynamic memory: access the dynamic memory using the base address and offset address obtained after the best-matching name prefix is mapped by the trained learning bitmap;
step 7: search for the Data packet in the bidirectional skip list, that is, select the optimal bidirectional skip list start node, determine the search direction, and search for the Data packet in the bidirectional skip list;
7-1: input the name prefix and ID of data x, that is, input the name prefix and ID number of data x into the bidirectional skip list, where the ID of data x is denoted id1 and it is assumed that id1 is looked up for the first time in the bidirectional skip list structure corresponding to this name prefix;
7-2: search from the head node: following the search procedure of a conventional skip list, search backward from the highest layer of the head node of the bidirectional skip list, record the node information of the search path in the dynamic memory, and continue with step 8;
7-3: input the name prefix and ID of data y: input into the bidirectional skip list the name prefix and ID number of data y, which has the same prefix as data x, and extract the ID number of data y, denoted id2;
7-4: select the optimal start node and search direction: if id1 is larger than id2, select as the optimal node the node with the smallest ID difference among the forward fold-back nodes recorded in the dynamic memory; otherwise, select as the optimal node the node with the smallest ID difference among the backward fold-back nodes;
7-5: search the bidirectional skip list: search from the optimal node in the determined search direction, and update the fold-back node information recorded in the dynamic memory;
step 8: determine whether the best-matching Data packet exists in the bidirectional skip list: search the bidirectional skip list for the Data packet corresponding to the ID number; if a node with the same ID is found, the best-matching Data packet exists and step 9 is executed; otherwise step 10 is executed;
step 9: output the best-matching Data packet: the learning bitmap content storage pool outputs the best-matching Data packet, and execution continues with step 11;
step 10: output that no best-matching Data packet exists: the learning bitmap content storage pool reports that no best-matching Data packet exists, and execution continues with step 11;
step 11: the lookup of the Interest packet in the learning bitmap content storage pool structure ends;
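Purely as an illustration of how steps 1 to 11 fit together, the following condensed sketch reuses the hypothetical TinyCdfModel, TOTAL_SLOTS, DynamicMemoryEntry and choose_start from the earlier sketches; SLOTS_PER_BUCKET, the prefixes table, and the skip list's search_from method are further placeholders, not structures defined by the patent:

```python
# Condensed, assumed sketch of one Interest packet lookup (sub-name matching).
SLOTS_PER_BUCKET = 64   # hypothetical bucket size

def lookup_interest(name_prefix, packet_id, model, bitmap, dyn_mem, skiplists, prefixes):
    # step 2: enumerate all sub-names of the prefix and keep the best match
    parts = [p for p in name_prefix.split("/") if p]
    subnames = ["/" + "/".join(parts[:i]) for i in range(1, len(parts) + 1)]
    matches = [s for s in subnames if s in prefixes]
    if not matches:
        return None                                    # step 10: no best match
    best = max(matches, key=len)
    # step 3: mapping label from the neural network
    label = int(model.predict(best) * TOTAL_SLOTS)
    # step 4: base address (bucket number) and offset address
    base = label // SLOTS_PER_BUCKET
    offset = prefixes[best]        # assumed: order in which the prefix entered its bucket
    # step 5: check the improved-bitmap slot
    if bitmap[base][offset] == 0:
        return None                                    # step 10
    # steps 6-7: read the dynamic memory, pick the start node and direction
    entry = dyn_mem[(base, offset)]
    start, direction = choose_start(entry, packet_id)
    # steps 8-9: search the bidirectional skip list (search_from is a placeholder)
    return skiplists[best].search_from(start, packet_id, direction)
```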
a Data packet is retrieved in the storage pool using an exact name matching algorithm, and one Data packet lookup comprises the following steps (a short sketch of how it differs from the Interest lookup follows step 11):
step 1: input the name prefix and ID of the Data packet into the learning bitmap content storage pool storage structure;
step 2: exact name matching: directly map the name prefix in the learning bitmap;
step 3: calculate the mapping label: the name prefix is passed through the neural network to obtain an index mapping value between 0 and 1, and this value is multiplied by the total number of slots of the improved bitmap to obtain the mapping label of the Data packet on the improved bitmap;
step 4: calculate the base address and offset address: divide the mapping label by the number of slots per bucket to obtain the bucket number of the name prefix, i.e. the base address; the sequence number with which the name prefix entered the bucket is its offset address;
step 5: determine whether the name prefix exists in the improved bitmap: if the value in the slot pointed to by the offset address is 0, the name prefix does not exist in the improved bitmap and step 10 is executed; otherwise the name prefix exists and step 6 is executed;
step 6: access the dynamic memory: access the dynamic memory using the base address and offset address obtained after the name prefix is mapped by the trained learning bitmap;
step 7: search for the Data packet in the bidirectional skip list, that is, select the optimal bidirectional skip list start node, determine the search direction, and search for the Data packet in the bidirectional skip list;
7-1: input the name prefix and ID of data x, that is, input the name prefix and ID number of data x into the bidirectional skip list, where the ID of data x is denoted id1 and it is assumed that id1 is looked up for the first time in the bidirectional skip list structure corresponding to this name prefix;
7-2: search from the head node: following the search procedure of a conventional skip list, search backward from the highest layer of the head node of the bidirectional skip list, record the node information of the search path in the dynamic memory, and continue with step 8;
7-3: input the name prefix and ID of data y: input into the bidirectional skip list the name prefix and ID number of data y, which has the same prefix as data x, and extract the ID number of data y, denoted id2;
7-4: select the optimal start node and search direction: if id1 is larger than id2, select as the optimal node the node with the smallest ID difference among the forward fold-back nodes recorded in the dynamic memory; otherwise, select as the optimal node the node with the smallest ID difference among the backward fold-back nodes;
7-5: search the bidirectional skip list: search from the optimal node in the determined search direction, and update the fold-back node information recorded in the dynamic memory;
step 8: determine whether the Data packet exists in the bidirectional skip list: search the bidirectional skip list for the Data packet corresponding to the ID number; if a node with the same ID is found, the Data packet exists and step 9 is executed; otherwise step 10 is executed;
step 9: output the Data packet: the learning bitmap content storage pool outputs the Data packet, and execution continues with step 11;
step 10: output that the Data packet does not exist: the learning bitmap content storage pool reports that the Data packet does not exist, and execution continues with step 11;
step 11: the lookup of the Data packet in the learning bitmap content storage pool structure ends;
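For comparison with the Interest lookup sketch above, only step 2 changes in the exact-name case; the sketch below reuses the same hypothetical helpers and is likewise not the patented implementation:

```python
# Condensed, assumed sketch of one Data packet lookup (exact name matching).
def lookup_data(name_prefix, packet_id, model, bitmap, dyn_mem, skiplists, prefixes):
    if name_prefix not in prefixes:                    # step 2: exact match only
        return None                                    # step 10
    label = int(model.predict(name_prefix) * TOTAL_SLOTS)            # step 3
    base, offset = label // SLOTS_PER_BUCKET, prefixes[name_prefix]  # step 4
    if bitmap[base][offset] == 0:                      # step 5
        return None                                    # step 10
    entry = dyn_mem[(base, offset)]                    # step 6
    start, direction = choose_start(entry, packet_id)  # step 7
    return skiplists[name_prefix].search_from(start, packet_id, direction)  # steps 8-9
```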
named data is searched bidirectionally in the storage pool, and one name data lookup comprises the following steps (a minimal sketch of the bidirectional lookup follows step 11):
step 1: input the name prefix and ID of data x, that is, input the name prefix and ID number of data x into the bidirectional skip list, where the ID of data x is denoted id1 and it is assumed that id1 is looked up for the first time in the bidirectional skip list structure corresponding to this name prefix;
step 2: search from the head node: following the search procedure of a conventional skip list, search from the highest layer of the head node of the bidirectional skip list until the node id1 is found, and return the actual storage address of data x;
step 3: record the fold-back node information: the forward and backward fold-back nodes of each layer of the skip list encountered while searching for id1 are fully recorded in the dynamic memory of this name prefix;
step 4: input the name prefix and ID of data y: input into the bidirectional skip list the name prefix and ID number of data y, which has the same prefix as data x, and extract the ID number of data y, denoted id2;
step 5: determine whether the IDs are equal: if id1 = id2, data y is the same as data x and step 9 is executed; otherwise step 6 is executed;
step 6: select the optimal start node and search direction: if id1 is larger than id2, select as the optimal node the node with the smallest ID difference among the forward fold-back nodes recorded in the dynamic memory; otherwise, select as the optimal node the node with the smallest ID difference among the backward fold-back nodes;
step 7: search the bidirectional skip list: search from the optimal node in the determined search direction, and update the fold-back node information recorded in the dynamic memory;
step 8: determine whether the Data packet exists: if the bidirectional skip list contains a node with the same ID, the Data packet exists and step 9 is executed; otherwise the Data packet does not exist and step 10 is executed;
step 9: output the actual storage address: return the storage address of the actual content of the Data packet, and continue with step 11;
step 10: output that the Data packet does not exist: report that the Data packet does not exist, and continue with step 11;
step 11: the lookup of the data name in the bidirectional skip list structure ends.
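A minimal sketch of the bidirectional lookup with fold-back recording, again under the assumptions of the earlier sketches (SkipNode, DynamicMemoryEntry, choose_start); the simplified class below is illustrative, not the patented structure:

```python
# Assumed sketch: the first lookup records fold-back nodes, a later lookup
# restarts from the recorded node closest to the new ID instead of the head.
class BiSkipList:
    def __init__(self, head: "SkipNode", levels: int):
        self.head = head          # head node with per-layer next/prev lists
        self.levels = levels

    def search_record(self, target_id: int, entry: "DynamicMemoryEntry"):
        """Conventional top-down search from the head; record fold-back nodes."""
        node = self.head
        for layer in reversed(range(self.levels)):
            while node.next[layer] and node.next[layer].node_id <= target_id:
                node = node.next[layer]
            entry.prev_nodes.append(node.node_id)                    # forward fold-back node
            if node.next[layer]:
                entry.next_nodes.append(node.next[layer].node_id)    # backward fold-back node
        entry.current_node = node.node_id
        return node if node.node_id == target_id else None

# Usage idea (IDs only, addresses omitted): the first call to search_record
# fills the DynamicMemoryEntry for id1; a later lookup for id2 under the same
# prefix calls choose_start(entry, id2) and then walks level 0 from that node
# in the returned direction, updating the entry as it goes.
```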
CN201910260765.1A 2019-04-02 2019-04-02 Named data network content storage pool data retrieval method based on neural network Active CN110096458B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910260765.1A CN110096458B (en) 2019-04-02 2019-04-02 Named data network content storage pool data retrieval method based on neural network


Publications (2)

Publication Number Publication Date
CN110096458A CN110096458A (en) 2019-08-06
CN110096458B true CN110096458B (en) 2022-03-01

Family

ID=67444274

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910260765.1A Active CN110096458B (en) 2019-04-02 2019-04-02 Named data network content storage pool data retrieval method based on neural network

Country Status (1)

Country Link
CN (1) CN110096458B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110851658B (en) * 2019-10-12 2023-05-05 天津大学 Tree index data structure, content storage pool, router and tree index method
CN113220683A (en) * 2021-05-08 2021-08-06 天津大学 Content router PIT structure supporting flooding attack detection and data retrieval method thereof
CN114257536B (en) * 2021-11-05 2023-09-01 浙江木链物联网科技有限公司 Industrial data acquisition method and system

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101364237A (en) * 2008-09-05 2009-02-11 成都市华为赛门铁克科技有限公司 Multi-keyword matching method and device
CN107908357A (en) * 2017-10-13 2018-04-13 天津大学 Name data network Forwarding plane PIT storage organizations and its data retrieval method
CN109271390A (en) * 2018-09-30 2019-01-25 天津大学 Index data structure based on neural network and data retrieval method thereof

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10404592B2 (en) * 2017-03-24 2019-09-03 Cisco Technology, Inc. System and method to facilitate content forwarding using bit index explicit replication (BIER) in an information-centric networking (ICN) environment

Also Published As

Publication number Publication date
CN110096458A (en) 2019-08-06

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP02 Change in the address of a patent holder

Address after: 300452 Binhai Industrial Research Institute Campus of Tianjin University, No. 48 Jialingjiang Road, Binhai New Area, Tianjin

Patentee after: Tianjin University

Address before: 300072 Tianjin City, Nankai District Wei Jin Road No. 92

Patentee before: Tianjin University