CN111611348A - ICN network information name searching method based on learning bloom filter - Google Patents

ICN network information name searching method based on learning bloom filter Download PDF

Info

Publication number
CN111611348A
CN111611348A (application CN202010449138.5A)
Authority
CN
China
Prior art keywords
bloom filter
information
information name
name
content
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010449138.5A
Other languages
Chinese (zh)
Inventor
郑瑞娟
吴庆涛
张明川
朱军龙
王倩玉
李美雯
李冰
李�昊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Henan University of Science and Technology
Original Assignee
Henan University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Henan University of Science and Technology filed Critical Henan University of Science and Technology
Priority to CN202010449138.5A priority Critical patent/CN111611348A/en
Publication of CN111611348A publication Critical patent/CN111611348A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/30 Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F 16/33 Querying
    • G06F 16/3331 Query processing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/30 Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F 16/31 Indexing; Data structures therefor; Storage structures
    • G06F 16/316 Indexing structures
    • G06F 16/325 Hash tables
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00 Machine learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 61/00 Network arrangements, protocols or services for addressing or naming
    • H04L 61/30 Managing network names, e.g. use of aliases or nicknames


Abstract

The invention relates to an ICN network information name lookup method based on a learned Bloom filter. The lookup structure consists of a learning model and a backup Bloom filter, which together form the learned-Bloom-filter structure used to look up information names. The learning model improves lookup precision but can produce a certain number of false negatives; to reduce the false-negative rate to 0, the backup Bloom filter performs a further lookup. With this scheme, machine learning is applied by combining a recurrent neural network (RNN) with a standard Bloom filter, so that each node can look up information names quickly, the accuracy of information name retrieval is improved, and memory occupation is reduced.

Description

ICN network information name searching method based on learning bloom filter
Technical Field
The invention belongs to the technical field of Information-Centric Networking (ICN), and particularly relates to an ICN information name lookup method based on a learned Bloom filter.
Background
User requirements determine the network information model. The demands of Internet users have evolved from communication between hosts to repeated retrieval of information from the network. Users are interested in the information itself, not in where it is stored. The existing TCP/IP-based Internet architecture cannot solve fundamental problems of the current Internet such as scalability, dynamics, and security controllability. Therefore, to address these problems and free information from the constraints of the traditional architecture by making information the design center of the architecture, Information-Centric Networking (ICN) was proposed.
ICN replaces the traditional address-centric network communication model with an information-centric one: the traditional host-to-host "push" mode of information access is replaced by host-to-network "pull" access, the security mechanism is bound to the information rather than to the host, and the forwarding mechanism replaces traditional store-and-forward routing with cache-and-forward routing, enabling information to be transmitted efficiently.
ICN abandons the traditional protocol stack with IP at its waist and adopts a protocol stack centered on the information name. It uses the information name as the network transmission identifier; the IP address is either disregarded or used only as a kind of underlying localization identifier. Users therefore need not be concerned with the network topology and can obtain requested information simply by issuing a data request to the network. During routing, a request need not travel all the way back to the original data source: cached copies along the path can serve as information sources and respond with the data directly, improving the transmission efficiency of information.
In an ICN, accurately finding information is particularly important for implementing information routing. Unlike IP addresses in an IP network, information names in an ICN network are diverse and of indefinite length, and their number is enormous. How to find a required information name accurately among massive information names is therefore a key technology in ICN research. Fast and accurate name lookup affects both routing efficiency and the efficiency with which users obtain information; in the current network environment, user demand for information is growing rapidly, and in the face of massive information the requirements on name lookup efficiency are ever higher.
Disclosure of Invention
In order to solve the problems of the high false-positive rate and large memory occupation of information name lookup methods in ICN (information-centric networking) networks, the invention provides an information name lookup method based on a learned Bloom filter.
The purpose of the invention and the technical problem to be solved are realized by adopting the following technical scheme. The ICN network information name searching method based on the learning bloom filter comprises the following steps:
Step 1: after receiving an interest packet, the routing node first searches the CS for the content the packet requests; if present, the node destroys the interest packet and generates a data packet that is returned to the requesting node along the reverse path of the interest packet;
Step 2: if the requested content is not in the CS, the PIT is searched; if the name is present there, the routing node has already forwarded the same content request, so the interest packet is destroyed and the interface on which it was received is appended to the corresponding PIT entry;
Step 3: if the name is not in the PIT, the FIB is searched; if present, the interest packet is forwarded via the corresponding FIB interface and the information is recorded in the PIT; if no routing information exists in the FIB, the routing node knows no route for the interest packet, and the packet is destroyed;
the information names are looked up in the PIT, FIB and CS by a lookup structure, wherein the lookup structure consists of a learning model and a backup bloom filter,
the learning model is used to predict whether an information name is in the information-name set; the prediction proceeds as follows:
Step A1: first, train a neural network using the set D = {(x_i, y_i = 1) | x_i ∈ K} ∪ {(x_i, y_i = 0) | x_i ∈ U}, where K is the set of information names and U is the set of non-information-names;
Step A2: the information name x is passed through an activation function f(x) (the formula is rendered as an image in the original) to predict the probability that x is in the information set; f(x) ranges over [0, 1];
Step A3: input an information name to be looked up; if the predicted output probability for the current sample is 1, the name belongs to the information-name element set and the sought name is found;
Step A4: if the predicted output probability for the current sample is 0, the name is not in the information-name element set and the sought name cannot be found;
the backup Bloom filter is a standard Bloom filter used as the second layer of the lookup structure; its purpose is to reduce to 0, through a second lookup, the false negatives produced by the first-layer lookup in the learning model. It is constructed as follows:
Step B1: using a set threshold τ, collect the false negatives produced by the learning model into a set: C = {x ∈ K | f(x) < τ};
Step B2: establish a learned hash function (the formula is rendered as an image in the original) to replace the multiple hash functions used by a standard Bloom filter and reduce collisions between contents, where |K| = m and f(x) is the activation function of the learning model;
Step B3: establish a standard Bloom filter of size m bits;
the steps of searching the information content name in the PIT, the FIB and the CS through the searching structure are as follows:
Step S1: input the content-name element set K and the non-content-name element set U, and set a threshold τ;
Step S2: invoke the recurrent neural network architecture and, using the sets K and U, train on D = {(x_i, y_i = 1) | x_i ∈ K} ∪ {(x_i, y_i = 0) | x_i ∈ U};
Step S3: for any (x, y) ∈ D, compute the value of f(x) from the activation function (the formula is rendered as an image in the original);
Step S4: if x ∈ K and f(x) ≥ τ, output x and match it against the hash table with the longest-prefix-matching algorithm; after matching, return a data packet or forward x to the corresponding next hop;
Step S5: if x ∈ K but f(x) < τ, pass the content element x to the backup Bloom filter and look it up again there;
Step S6: generate a hash coefficient with the learned hash function of the backup Bloom filter (the formula is rendered as an image in the original) and check the corresponding bit of the filter; if the bit is 1, output the information name x and match it against the hash table with the longest-prefix-matching algorithm; after matching, return a data packet or forward the information content name x to the corresponding next hop;
Step S7: if the bit of the Bloom filter addressed by the hash value is 0, no information name x was found anywhere in the lookup process, i.e. the information does not exist, and the requested interest packet is discarded.
In step B3, the step of inserting the information name after the backup bloom filter is established is as follows:
Step C1: input the content set K, the selected threshold τ, the Bloom filter size m, and the information name x to insert;
Step C2: for an information name x ∈ K to be inserted, first compute a probability value from the activation function (the formula is rendered as an image in the original);
Step C3: if x ∈ K and the computed probability is less than the set threshold τ, compute the hash value of the information name with the constructed hash function (the formula is rendered as an image in the original);
Step C4: add 1 to the backup Bloom filter element at the position given by the hash value.
This technical scheme solves the problems of the high false-positive rate and large memory occupation of information name lookup methods in ICN (information-centric networking) networks: machine learning is applied by combining a recurrent neural network (RNN) with a standard Bloom filter, so that each node can look up information names quickly with the learned Bloom filter, lookup accuracy is improved, and memory occupation is reduced.
The foregoing description is only an overview of the technical solutions of the present invention, and in order to make the technical means of the present invention more clearly understood, the present invention may be implemented in accordance with the content of the description, and in order to make the above and other objects, features, and advantages of the present invention more clearly understandable, the following preferred embodiments are described in detail with reference to the accompanying drawings.
Drawings
Fig. 1 is a schematic structural diagram of a learning bloom filter in an embodiment of the present invention.
Fig. 2 is a flow chart of information name lookup in an embodiment of the invention.
Detailed Description
The technical solution of the present invention will be further described in detail with reference to the accompanying drawings and preferred embodiments.
Referring to figs. 1 and 2, in an ICN network a user requests information by sending an interest packet to a routing node; the interest packet carries the content name of the requested information. When the routing node receives an interest packet from a user, it must look the information name up in the CS (Content Store), FIB (Forwarding Information Base), and PIT (Pending Interest Table) at that node.
When a routing node receives an interest packet, it first searches the CS for the content the packet requests; if present, the interest packet is destroyed and a data packet is generated and returned to the requesting node along the reverse path of the interest packet. If the content is not in the CS, the PIT is searched; if the name is present there, the node has already forwarded the same content request, so the interest packet is destroyed and the interface on which it arrived is appended to the corresponding PIT entry. If the name is not in the PIT, the FIB is searched; if present, the interest packet is forwarded via the corresponding FIB interface and the information is recorded in the PIT. If no routing information exists in the FIB, the routing node knows no route for the interest packet and destroys it.
Therefore, in an ICN network, fast lookup of information names is a critical factor affecting network performance. The functions and method steps of the various modules of the present invention are described in detail below.
(I) Composition structure
Referring to fig. 1, the lookup structure in the ICN network information name lookup method of the invention consists of two parts, a learning model and a backup Bloom filter, which together form the learned-Bloom-filter structure used to look up information names. The two parts are explained in detail below:
1. learning model
The learning model is built from recurrent neural networks (RNNs). Starting from a standard Bloom filter, the multiple hash functions are replaced by a machine learning model, so that all information contents can be looked up with the same learned model. For an information name x to be looked up, the model predicts whether x is in the key set; the prediction proceeds as follows:
(1) First, a neural network is trained with D = {(x_i, y_i = 1) | x_i ∈ K} ∪ {(x_i, y_i = 0) | x_i ∈ U}, where K is the set of information names, U is the set of non-information-names, x_i is any information name to be looked up, and y_i is the corresponding output obtained by feeding x_i to the neural network.
(2) The information name x is passed through an activation function f(x) (the formula is rendered as an image in the original) to predict the probability that x is in the content set; f(x) falls in the range [0, 1].
(3) Input an information name q to be looked up; if the predicted output probability for the current sample is 1, the content belongs to the content set and the sought content is found.
(4) If the predicted output probability for the current sample is 0, the content is not in the content set and the sought content cannot be found.
Compared with a traditional lookup structure, the learning model built with machine learning improves lookup precision but can produce a certain number of false negatives. To reduce the false-negative rate to 0, a threshold τ is set to decide whether a queried content is a false negative; there are two cases:
① when the learning model assigns the queried information name a probability f(x) ≥ τ, the information name x is in the content set and is not missed;
② when the learning model assigns the queried information name a probability f(x) < τ although the name is in the content set K, a false negative is produced.
Therefore, to reduce the number of false negatives to 0, we use a backup Bloom filter for a further lookup.
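The first-layer decision described above can be sketched in a few lines. The trained RNN is not reproduced here: a tiny lookup table stands in for the model's raw score, and the sigmoid activation and the threshold τ = 0.6 are illustrative assumptions, not values taken from the patent.

```python
import math

def sigmoid(z):
    # Maps a raw model score to a probability in [0, 1].
    return 1.0 / (1.0 + math.exp(-z))

# Stand-in for the trained RNN: names in the key set score high.
# A real deployment would run a trained recurrent model here.
KNOWN_SCORES = {"/video/news/a": 2.0, "/video/news/b": 2.5}

def model_score(name):
    return KNOWN_SCORES.get(name, -2.0)

def query_learned_model(name, tau=0.6):
    """First-layer lookup: 'hit' if f(x) >= tau, else defer to the backup."""
    p = sigmoid(model_score(name))
    if p >= tau:
        return "hit", p          # confidently in the information-name set
    return "check-backup", p     # either a true negative or a false negative
```

Note that a name scored below τ is not rejected outright; it is handed to the backup Bloom filter, which is what drives the false-negative rate to 0.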
2. Backup bloom filter
The backup Bloom filter is a standard Bloom filter used as the second layer of the lookup structure; its purpose is to reduce to 0, through a second lookup, the false negatives produced by the first-layer lookup in the learning model. It is constructed as follows:
(1) Using the set threshold τ, collect the false negatives produced by the first-layer learning model into a set: C = {x ∈ K | f(x) < τ}.
(2) Establish a learned hash function (the formula is rendered as an image in the original) to replace the multiple hash functions of a standard Bloom filter and reduce collisions between contents, where |K| = m and f(x) is the activation function of the learning model.
(3) A standard Bloom filter of size m bits is established.
After the backup Bloom filter is built, information names are inserted. The backup Bloom filter is an array whose elements are initialized to 0; to represent an information content name in the filter, the name is mapped by a hash function to an integer in [0, m), and the value of the corresponding filter element is incremented by 1. Insertion proceeds as follows:
The first step: input the content set K, the selected threshold τ, the Bloom filter size m, and the information name x to insert;
The second step: for an information name x ∈ K to be inserted, first compute a probability value from the activation function (the formula is rendered as an image in the original);
The third step: if x ∈ K and the computed probability is less than the set threshold τ, compute the hash value of the information name with the constructed hash function (the formula is rendered as an image in the original);
The fourth step: add 1 to the backup Bloom filter element at the position given by the hash value.
Through these steps, the name elements in the information-name set are mapped by the learned hash function to their computed bit positions. This structure greatly reduces memory occupation and the hash-induced collisions between name elements, further lowering the false-positive rate.
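The four insertion steps above can be sketched as follows. Because the patent's activation and hash formulas appear only as images, both are stand-ins here: f(x) is a deterministic toy score, and h(x) = ⌊f(x)·m⌋ is an assumed form of single learned hash, commonly used for learned Bloom filters.

```python
M = 16          # filter size in bits (kept small for illustration)
TAU = 0.6       # threshold below which a key counts as a false negative

def f(name):
    # Deterministic stand-in for the learned model's activation in [0, 1).
    return (sum(ord(c) for c in name) % 997) / 997.0

def learned_hash(name, m=M):
    # Assumed single learned hash h(x) = floor(f(x) * m); it replaces
    # the k independent hash functions of a standard Bloom filter.
    return min(int(f(name) * m), m - 1)

def insert(bf, name, tau=TAU):
    # Only keys the model scores below tau (its false negatives)
    # are stored in the backup filter; the addressed slot is incremented.
    if f(name) < tau:
        bf[learned_hash(name)] += 1

bf = [0] * M
insert(bf, "/a/b")        # f ≈ 0.29 < τ → inserted into the backup filter
insert(bf, "/zz/zz/zz")   # f ≈ 0.88 ≥ τ → the model handles it; skipped
```

Incrementing (rather than setting) the slot matches the counting-filter behavior of "add 1 to the element value" in Step C4.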
(II) search method
In an ICN network, fast route forwarding is vital to fast information transfer, and efficient lookup operations are the foundation of fast forwarding; the invention therefore uses the learned Bloom filter to achieve efficient information name lookup.
At each routing node, the information name lookup operation is performed, and a flowchart of the lookup operation is shown in fig. 2. The specific process of searching for information names is as follows:
The first step: input the information-name element set K and the non-information-name element set U, and set a threshold τ;
The second step: invoke the recurrent neural network architecture and, using the sets K and U, train on D = {(x_i, y_i = 1) | x_i ∈ K} ∪ {(x_i, y_i = 0) | x_i ∈ U};
The third step: for any (x, y) ∈ D, compute the value of f(x) from the activation function (the formula is rendered as an image in the original);
the fourth step: if x ∈ K and f (x) ≧ τ, then execute the fifth step;
the fifth step: outputting x, and matching with a hash table by using a longest prefix matching algorithm;
and a sixth step: after matching is completed, returning a data packet or transmitting x to a corresponding next hop;
the seventh step: if x ∈ K but f (x) < τ, then an eighth step is performed;
eighth step: transmitting the information name x to a backup bloom filter of a second-layer structure, and searching again in the backup bloom filter;
The ninth step: generate a hash coefficient with the learned hash function of the backup Bloom filter and check the corresponding bit of the filter; if the bit is 1, execute the eleventh step;
The tenth step: if the bit of the Bloom filter addressed by the generated hash coefficient is 0, execute the thirteenth step;
the eleventh step: outputting an information name x, and matching with a hash table by using a longest prefix matching algorithm;
the twelfth step: after matching is completed, returning a data packet or transmitting the information name x to a corresponding next hop;
The thirteenth step: no name x has been found in the whole process, so the requested interest packet is discarded; then go to the fourteenth step;
the fourteenth step is that: the algorithm ends.
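Putting the fourteen steps together, the end-to-end lookup reduces to the small routine below. The model scores, the threshold, and the single learned hash are illustrative stand-ins (the patent gives its formulas only as images); the names prefixed `/k/` play the role of the key set K.

```python
TAU = 0.6
M = 32

# Toy scores standing in for the trained RNN: "/k/low" is a key the
# model under-scores, i.e. a deliberate false negative.
SCORES = {"/k/hi": 0.9, "/k/low": 0.2, "/other": 0.1}
K = {"/k/hi", "/k/low"}              # information-name (key) set

def f(name):
    return SCORES.get(name, 0.0)

def learned_hash(name):
    # Assumed learned hash h(x) = floor(f(x) * M).
    return min(int(f(name) * M), M - 1)

# Build the backup filter from the model's false negatives (steps B1-B3).
backup = [0] * M
for x in K:
    if f(x) < TAU:
        backup[learned_hash(x)] += 1

def lookup(name):
    if f(name) >= TAU:
        return "found-by-model"       # fourth-sixth steps: output and match
    if backup[learned_hash(name)] > 0:
        return "found-by-backup"      # eighth-twelfth steps: second-layer hit
    return "not-found"                # thirteenth step: discard the interest
```

The backup layer guarantees that every key in K is reported as found, while non-keys can still collide inside the filter; that residual false-positive rate is the usual Bloom-filter trade-off.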
In an ICN, each router has three components: a Forwarding Information Base (FIB), a Pending Interest Table (PIT), and a Content Store (CS). The FIB specifies over which interface packets can be forwarded; the PIT holds all content requests that have been forwarded toward potential data sources but not yet satisfied; the CS is a cache that stores copies of previously retrieved content so it can answer future content requests.
When a node looks up an information name, it sends an interest packet indicating the name of the desired content. At each hop, the forwarding decision depends on the result of looking up the requested content name in the FIB, PIT, and CS tables. The router's processing based on the invention is as follows:
The first step: search the CS for the information the interest packet requires, following the information name lookup procedure of the invention. If matching content exists in the CS, destroy the interest packet and generate a data packet that is sent to the requesting node along the reverse path of the interest packet.
The second step: if no matching content exists in the CS, update the PIT to track the arriving interest packet; the PIT is likewise searched with the information name lookup procedure. If the name is present, the routing node has already forwarded the same content request, so the interest packet is destroyed and the interface on which it was received is appended to the corresponding PIT entry.
The third step: if the name is not in the PIT, search the FIB with the information name lookup procedure; when matching content is found in the FIB, forward the interest packet to the next hop via the corresponding FIB interface.
The fourth step: if the name is not found in the FIB either, the routing node knows no routing information for the interest packet, and the interest packet is destroyed.
Once the routing node has content that matches the request interest packet, the data packet will be returned to the requesting node in the reverse path that the interest packet activates.
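The CS → PIT → FIB decision chain just described can be sketched as a small dispatcher. The table contents, face names, and the one-level prefix match are illustrative simplifications; real ICN routers perform longest-prefix matching over full hierarchical names.

```python
CS = {"/video/a": b"cached-bytes"}   # Content Store: name -> cached data
PIT = {"/video/b": ["face1"]}        # Pending Interest Table: name -> faces
FIB = {"/video": "face2"}            # FIB: name prefix -> outgoing face

def on_interest(name, in_face):
    if name in CS:
        return "data", CS[name]           # answer directly from the cache
    if name in PIT:
        PIT[name].append(in_face)         # same request already forwarded:
        return "aggregated", None         # just record the extra face
    prefix = "/".join(name.split("/")[:2])  # simplified one-level prefix match
    if prefix in FIB:
        PIT[name] = [in_face]             # track the now-pending interest
        return "forwarded", FIB[prefix]
    return "dropped", None                # no route: destroy the packet
```

Each return value corresponds to one of the four steps above: cache hit, PIT aggregation, FIB forwarding, or destruction of an unroutable interest.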
The above description is only a preferred embodiment of the present invention, and any person skilled in the art can make any simple modification, equivalent change and modification to the above embodiments according to the technical essence of the present invention without departing from the scope of the present invention, and still fall within the scope of the present invention.

Claims (2)

1. An ICN network information name searching method based on a learning bloom filter comprises the following steps:
Step 1: after receiving an interest packet, the routing node first searches the CS for the content the packet requests; if present, the node destroys the interest packet and generates a data packet that is returned to the requesting node along the reverse path of the interest packet;
Step 2: if the requested content is not in the CS, the PIT is searched; if the name is present there, the routing node has already forwarded the same content request, so the interest packet is destroyed and the interface on which it was received is appended to the corresponding PIT entry;
Step 3: if the name is not in the PIT, the FIB is searched; if present, the interest packet is forwarded via the corresponding FIB interface and the information is recorded in the PIT; if no routing information exists in the FIB, the routing node knows no route for the interest packet, and the packet is destroyed;
the method is characterized in that: the information names are looked up in the PIT, FIB and CS by a lookup structure, wherein the lookup structure consists of a learning model and a backup bloom filter,
the learning model is used to predict whether an information name is in the information-name set; the prediction proceeds as follows:
Step A1: first, train a neural network using the set D = {(x_i, y_i = 1) | x_i ∈ K} ∪ {(x_i, y_i = 0) | x_i ∈ U}, where K is the set of information names and U is the set of non-information-names;
Step A2: the information name x is passed through an activation function f(x) (the formula is rendered as an image in the original) to predict the probability that x is in the information set; f(x) ranges over [0, 1];
Step A3: input an information name to be looked up; if the predicted output probability for the current sample is 1, the name belongs to the information-name element set and the sought name is found;
Step A4: if the predicted output probability for the current sample is 0, the name is not in the information-name element set and the sought name cannot be found;
the backup Bloom filter is a standard Bloom filter used as the second layer of the lookup structure; its purpose is to reduce to 0, through a second lookup, the false negatives produced by the first-layer lookup in the learning model. It is constructed as follows:
Step B1: using a set threshold τ, collect the false negatives produced by the learning model into a set: C = {x ∈ K | f(x) < τ};
Step B2: establish a learned hash function (the formula is rendered as an image in the original) to replace the multiple hash functions used by a standard Bloom filter and reduce collisions between contents, where |K| = m and f(x) is the activation function of the learning model;
step B3: establishing a standard bloom filter with the size of m bits;
the steps of searching the information content name in the PIT, the FIB and the CS through the searching structure are as follows:
Step S1: input the content-name element set K and the non-content-name element set U, and set a threshold τ;
Step S2: invoke the recurrent neural network architecture and, using the sets K and U, train on D = {(x_i, y_i = 1) | x_i ∈ K} ∪ {(x_i, y_i = 0) | x_i ∈ U};
Step S3: for any (x, y) ∈ D, compute the value of f(x) from the activation function (the formula is rendered as an image in the original);
Step S4: if x ∈ K and f(x) ≥ τ, output x and match it against the hash table with the longest-prefix-matching algorithm; after matching, return a data packet or forward x to the corresponding next hop;
Step S5: if x ∈ K but f(x) < τ, pass the content element x to the backup Bloom filter and look it up again there;
Step S6: generate a hash coefficient with the learned hash function of the backup Bloom filter (the formula is rendered as an image in the original) and check the corresponding bit of the filter; if the bit is 1, output the information name x and match it against the hash table with the longest-prefix-matching algorithm; after matching, return a data packet or forward the information content name x to the corresponding next hop;
Step S7: if the bit of the Bloom filter addressed by the hash value is 0, no information name x was found anywhere in the lookup process, i.e. the information does not exist, and the requested interest packet is discarded.
2. The ICN network information name lookup method based on learning bloom filter as claimed in claim 1 wherein: in step B3, the step of inserting the information name after the backup bloom filter is established is as follows:
step C1: inputting a content set K, a selected threshold value tau, the size m of a bloom filter and an inserted information name x;
step C2: for the inserted information name x ∈ K, first calculating a probability value according to the activation function (formula image FDA0002506880940000023);
step C3: if x ∈ K and the calculated probability value is less than the set threshold tau, calculating the hash value corresponding to the information name by using the constructed hash function (formula image FDA0002506880940000024);
step C4: and adding 1 to the element value of the backup bloom filter at the position corresponding to the hash value.
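Steps C1 through C4 amount to a conditional insertion: only names that the learned model scores below tau are recorded in the backup filter. A minimal sketch, using a counter array to mirror the claim's "adding 1" wording and salted SHA-256 as a stand-in for the patent's constructed hash function:

```python
import hashlib

def backup_insert(counters, x, f, tau, k=3):
    """Sketch of steps C1-C4: record name x in the backup filter only
    when the learned model's score f(x) falls below the threshold tau.
    The SHA-256 construction stands in for the patent's hash function."""
    m = len(counters)                    # step C1: filter size m
    if f(x) < tau:                       # step C3: model under-confident
        for i in range(k):               # one position per hash index
            d = hashlib.sha256(f"{i}:{x}".encode()).hexdigest()
            counters[int(d, 16) % m] += 1   # step C4: add 1 at the position
    return counters
```

Names the model already scores at or above tau are never inserted, which is what keeps the backup filter small relative to a standard bloom filter over all of K.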
CN202010449138.5A 2020-05-25 2020-05-25 ICN network information name searching method based on learning bloom filter Pending CN111611348A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010449138.5A CN111611348A (en) 2020-05-25 2020-05-25 ICN network information name searching method based on learning bloom filter


Publications (1)

Publication Number Publication Date
CN111611348A true CN111611348A (en) 2020-09-01

Family

ID=72204973

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010449138.5A Pending CN111611348A (en) 2020-05-25 2020-05-25 ICN network information name searching method based on learning bloom filter

Country Status (1)

Country Link
CN (1) CN111611348A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230221864A1 (en) * 2022-01-10 2023-07-13 Vmware, Inc. Efficient inline block-level deduplication using a bloom filter and a small in-memory deduplication hash table

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150324410A1 (en) * 2014-05-07 2015-11-12 International Business Machines Corporation Probabilistically finding the connected components of an undirected graph
CN105260429A * 2015-09-30 2016-01-20 Henan University of Science and Technology ICN network information name searching method based on multiple Bloom filters
CN107454142A * 2017-06-29 2017-12-08 Beijing University of Posts and Telecommunications Non-blocking content caching method and device for a content router
US20180219790A1 (en) * 2017-01-27 2018-08-02 Futurewei Technologies, Inc. Scalable multicast for notification-driven content delivery in information centric networks
US20190158622A1 (en) * 2018-08-08 2019-05-23 Ravikumar Balakrishnan Information centric network for content data networks


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WANG QIANYU et al.: "Learned Bloom-Filter for an Efficient Name Lookup in Information-Centric Networking", 2019 IEEE WIRELESS COMMUNICATIONS AND NETWORKING CONFERENCE (WCNC) *


Similar Documents

Publication Publication Date Title
EP3320670B1 (en) Method and apparatus for pushing data in a content-centric networking (ccn) network
CN101820386B (en) Method and system for facilitating forwarding a packet in a content-centric network
US7436830B2 (en) Method and apparatus for wire-speed application layer classification of upstream and downstream data packets
Quan et al. TB2F: Tree-bitmap and bloom-filter for a scalable and efficient name lookup in content-centric networking
You et al. Dipit: A distributed bloom-filter based pit table for ccn nodes
CN102970150A (en) Extensible multicast forwarding method and device for data center (DC)
US20130107885A1 (en) Server-Side Load Balancing Using Parent-Child Link Aggregation Groups
US20070171911A1 (en) Routing system and method for managing rule entry thereof
CN103428093A (en) Route prefix storing, matching and updating method and device based on names
Lee et al. Name prefix matching using bloom filter pre-searching for content centric network
US9848059B2 (en) Content handling method, apparatus, and system
Hou et al. Bloom-filter-based request node collaboration caching for named data networking
US10587515B2 (en) Stateless information centric forwarding using dynamic filters
CN103873602A (en) Network resource naming method and generating device
US20170373974A1 (en) Method and system for interest groups in a content centric network
Woodrow et al. SPIN-IT: a data centric routing protocol for image retrieval in wireless networks
JP2004266837A (en) Packet classification apparatus and method using field level tree
Lee et al. Dual-load Bloom filter: Application for name lookup
KR101384794B1 (en) Message routing platform
CN111611348A (en) ICN network information name searching method based on learning bloom filter
CN108521373B (en) Multipath routing method in named data network
CN109495525B (en) Network component, method of resolving content identification, and computer-readable storage medium
Alhisnawi Forwarding information base design techniques in content-centric networking: a survey
WO2001078309A2 (en) A method and apparatus for wire-speed application layer classification of data packets
JP2004127074A (en) File retrieval method in p2p network, terminal, program, and recording medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200901