CN101582813B - Distributed migration network learning-based intrusion detection system and method thereof - Google Patents


Publication number
CN101582813B
Authority
CN
China
Legal status
Expired - Fee Related
Application number
CN2009100230731A
Other languages
Chinese (zh)
Other versions
CN101582813A (en)
Inventor
缑水平
焦李成
王宇琴
田小林
王爽
马文萍
吴建设
慕彩红
冯静
Current Assignee
Xidian University
Original Assignee
Xidian University
Priority date
Filing date
Publication date
Application filed by Xidian University filed Critical Xidian University
Priority to CN2009100230731A priority Critical patent/CN101582813B/en
Publication of CN101582813A publication Critical patent/CN101582813A/en
Application granted granted Critical
Publication of CN101582813B publication Critical patent/CN101582813B/en

Landscapes

  • Debugging And Monitoring (AREA)
  • Data Exchanges In Wide-Area Networks (AREA)

Abstract

The invention discloses an intrusion detection system based on distributed migration network learning and a method thereof, mainly solving the problems that prior methods detect some attack types with low efficiency and require data to be collected anew. The whole system comprises a network behavior record preprocessing module, an anomaly detection module and an abnormal behavior analysis module. The network behavior record preprocessing module quantizes and normalizes a network behavior record; the anomaly detection module uses an anomaly detection learning machine to classify and identify the input record and determine whether it is a normal behavior, ending the detection if it is normal or passing the record to the abnormal behavior analysis module if it is abnormal; and the abnormal behavior analysis module uses an abnormal behavior analysis learning machine to classify and identify the input record and outputs the attack type of the record. The system and method use other existing resources to improve the detection rate for attack types whose detection rate was previously low, avoid collecting data again, and can be used for network intrusion detection.

Description

Intrusion detection system and method based on distributed migration network learning
Technical Field
The invention belongs to the field of network security, and particularly relates to an intrusion detection system which can be used for intrusion detection in the aspect of information security.
Background
With the wide use of the Internet, more and more illegal attacks on computer networks threaten the security of information systems. Passively managed network security tools such as firewalls cannot prevent attacks, information theft or destruction caused by unauthorized operations of internal users or carried out through application-layer backdoors, and a firewall is itself vulnerable to attack, so security problems arising inside internal networks are often left unaddressed. The intrusion detection system (IDS) is therefore increasingly important as a complement to the firewall. An IDS can detect attacks launched from internal networks, deter hacker intrusions and limit the spread of viruses. Using audit records, the IDS can identify unwanted activities, restrict those activities and protect the security of the system. In 1998, the MIT Lincoln Laboratory, in cooperation with DARPA, carried out an intrusion detection system evaluation; one task of the project was to provide data sets for intrusion detection, including host logs and network traffic, and KDD CUP'99 applied appropriate processing and feature extraction to the 9 weeks of tcpdump data provided by DARPA to produce a standard intrusion detection data set.
In recent years, with the development of machine learning, new intrusion detection techniques have appeared, such as single-machine methods based on neural networks, Bayesian networks and support vector machines. Facing new environments and new network security problems, however, these techniques suffer from high false alarm rates, poor adaptability, and low degrees of automatic response and intelligence, so distributed algorithms become necessary to improve the detection speed and accuracy of the IDS. In 2006, Wang Jun et al. introduced the Boosting algorithm into a classifier network, integrated the classifiers, extended the scheme to a distributed environment and proposed the distributed network ensemble algorithm DNB, obtaining a classifier system with stronger generalization ability through communication and cooperation between node classifiers. Like other conventional machine learning methods, the DNB-based intrusion detection method cannot train a good classifier model when there is little labeled data, and it requires the training data and test data to be independently and identically distributed, so it has the following disadvantages:
1. the detection rates for network behaviors of different attack types are unbalanced, and the detection rate for certain attack types is low;
2. to improve the detection rate, the user must collect data again and re-learn, which is expensive and time-consuming;
3. other existing available resources cannot be exploited to improve the detection rate for certain attack types.
Disclosure of Invention
The invention aims to overcome the defects of the above intrusion detection methods by introducing transfer learning into DNB, and provides an intrusion detection system based on distributed migration network learning and a method thereof, so that other existing data can guide the learning of network behaviors with lower detection rates and thereby improve the detection rate.
To achieve the above object, the detection system of the present invention comprises:
the network behavior record preprocessing module is used for completing quantization and normalization preprocessing on the collected network behavior records and transmitting the preprocessed result to the abnormality detection module;
the abnormal detection module is used for classifying and identifying the input record by adopting an abnormal detection learning machine, determining whether the record belongs to a normal behavior, if the record belongs to the normal behavior, not processing the record, and ending the detection, otherwise, transmitting the record to the abnormal behavior analysis module;
and the abnormal behavior analysis module is used for classifying and identifying the input abnormal records by adopting an abnormal behavior analysis learning machine and outputting the attack types of the records.
The network behavior record preprocessing module comprises:
the existing record preprocessing submodule is used for completing quantization and normalization processing on an existing labeled network behavior record set and transmitting parameters after quantization and normalization processing into the new record preprocessing submodule;
and the new record preprocessing submodule is used for quantizing and normalizing the new network behavior record by utilizing the parameters transmitted by the existing record preprocessing submodule.
The abnormality detection module includes:
the anomaly detection learning submodule divides the preprocessed existing labeled network behavior record set into a normal type and an abnormal type, randomly extracts partial samples from the normal and the abnormal records respectively, learns with a distributed network ensemble learning algorithm to generate an anomaly detection learning machine, and transmits the learning machine to the anomaly detection testing submodule;
and the anomaly detection testing sub-module is used for classifying and identifying the input preprocessed new record by adopting an anomaly detection learning machine, if the output result is normal, the input new record is not processed, the detection is finished, and otherwise, the input new record is transmitted to the anomaly behavior analysis module.
The abnormal behavior analysis module comprises:
a migration sample pre-selection submodule, which sets a source domain sample and a target domain labeled sample for the existing labeled network behavior record, completes pre-selection on the source domain sample according to the target domain to-be-guided sample, and inputs the selected source domain migration sample into an abnormal behavior analysis learning submodule;
the abnormal behavior analysis learning submodule is used for taking the input source domain migration samples and the target domain labeled samples together as training samples, learning with a distributed network ensemble learning algorithm into which transfer learning is introduced, and generating an abnormal behavior analysis learning machine;
and the abnormal behavior analysis and test submodule is used for classifying and identifying the input abnormal behavior by adopting an abnormal behavior analysis learning machine and outputting the attack type of the input abnormal behavior.
In order to achieve the above object, the detection method of the present invention comprises the following steps:
(1) inputting an existing labeled network behavior record set X, and performing quantization and normalization preprocessing on the data set to obtain the preprocessed result X';
(2) dividing the preprocessed result X' of the existing labeled network behavior record set into normal and abnormal types, where the abnormal type comprises M attack types; randomly extracting a part of the samples from the normal and the abnormal samples respectively, and training for T_1 rounds with a distributed network ensemble learning algorithm on a network topology containing K_1 nodes, to generate the classifier network system of the anomaly detection learning machine;
(3) setting the normal-type samples in X' as the source domain sample set X^S with m samples, and the abnormal-type samples as the target domain sample set X^T; taking the samples of the abnormal types with lower detection rates in X^T as the target domain to-be-guided sample set X^{T1} with n_1 samples; dividing X^S evenly into [m/n_1] parts,

$$X^S = X_1^S \cup X_2^S \cup \cdots \cup X_{[m/n_1]}^S,$$

where [·] is the rounding operation; combining each X_i^S with X^{T1} into a training set T_i^1 (i = 1, 2, ..., [m/n_1]), adjusting the sample weights with the weight-adjustment method used in the training process of the AdaBoost algorithm, and selecting the source domain sample subset X_sub^S of samples with larger weights;
(4) taking the source domain sample subset X_sub^S together with the samples of the other abnormal types of the target domain as training samples, adjusting the sample weights again with the weight-adjustment method of the AdaBoost training process, removing from X_sub^S the source domain samples with larger weights, and letting the remaining samples of X_sub^S constitute the source domain migration sample set TR_D;
(5) randomly sampling a part of the samples from the target domain sample set X^T to form the target domain sample subset TR_S, taking it together with the selected source domain migration sample set TR_D as the training samples, and assigning TR_D the same label as the target domain to-be-guided samples; laying out a network topology containing K_2 nodes, inputting the sampling rate ρ and the number of training rounds T_2, and distributing TR_S and TR_D over the nodes to generate a training sample set S_k (k = 1, 2, ..., K_2) on each node; generating the abnormal behavior analysis learning machine from these training sample sets by the following steps:
5a) initializing the weights of the samples in each node's training sample set S_k;
5b) performing weighted sampling with replacement on each node's training sample set S_k to obtain a training subset for each node, training with each node's learning algorithm to obtain the base classifier C_{k,t}^2 of each node, and using each node's base classifier to classify S_k, where t is the current training round;
5c) computing, from the classification results on S_k, the weighted error rate ε_{k,t} of the target domain samples on each node, and computing from ε_{k,t} the weight α_{k,t}^2 of each base classifier;
5d) updating the weights of the source domain migration samples and the target domain samples; when t < T_2, returning to step 5b); when t = T_2, ending the training, so that all the base classifiers C_{k,t}^2 (k = 1, 2, ..., K_2, t = 1, 2, ..., T_2) form the classifier network system of the abnormal behavior analysis learning machine;
(6) inputting a new network behavior record x'' and performing quantization and normalization preprocessing on it to obtain the preprocessed network behavior record x''';
(7) inputting x''' into the classifier network system of the anomaly detection learning machine generated in step (2) for classification, to obtain the classification result:
$$H_1(x''') = \mathrm{sign}\left(\sum_{k=1}^{K_1}\sum_{t=1}^{T_1}\left(\alpha_{k,t}^{1}\, h_{k,t}^{1}(x''') + \sum_{p}\alpha_{p,t}^{1}\, h_{p,t}^{1}(x''')\right)\right)$$
where h_{k,t}^1(x''') is the classification result of each base classifier in the anomaly detection learning machine on x''', α_{k,t}^1 is the weight of each base classifier, and p is the index of the neighbor nodes of node k. When H_1(x''') = 1, the record is of the normal type; it is not processed further and the detection process ends. When H_1(x''') = -1, the record is of the abnormal type and the procedure goes to step (8);
(8) inputting x''' into the classifier network system of the abnormal behavior analysis learning machine generated in step (5) for classification, obtaining the classification result:
$$H_2(x''') = \arg\max_{y \in Y}\left(\sum_{k=1}^{K_2}\sum_{t=1}^{T_2}\left(\alpha_{k,t}^{2}\, I[h_{k,t}^{2}(x''') = y] + \sum_{p}\alpha_{p,t}^{2}\, I[h_{p,t}^{2}(x''') = y]\right)\right)$$
where h_{k,t}^2(x''') ∈ Y is the classification result of each base classifier in the abnormal behavior analysis learning machine on x''', Y = {1, 2, ..., M} with 1, 2, ..., M the index numbers of the M attack types, I[·] is the indicator function taking the value 0 or 1, and H_2(x''') ∈ Y;
(9) taking H_2(x''') as an index number, looking up the attack type corresponding to that index number, and outputting it as the final detection result.
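For illustration, the two-stage decision of steps (7)-(9) can be sketched as weighted votes over the base classifiers. The list-of-callables representation, the helper names and the toy classifiers in the test are assumptions for the sketch, not part of the patent.

```python
def h1_decision(x, classifiers, alphas, neighbors):
    """Step (7): sign of the weighted vote of each node's base
    classifiers plus those of its neighbor nodes p.
    classifiers[k][t](x) returns +1 (normal) or -1 (abnormal)."""
    total = 0.0
    for k, node_clfs in enumerate(classifiers):
        for t, clf in enumerate(node_clfs):
            total += alphas[k][t] * clf(x)
            for p in neighbors[k]:
                total += alphas[p][t] * classifiers[p][t](x)
    return 1 if total >= 0 else -1


def h2_decision(x, classifiers, alphas, neighbors, attack_types):
    """Step (8): arg-max weighted indicator vote over the attack-type
    index set Y; each base classifier is assumed to output some y in Y."""
    scores = {y: 0.0 for y in attack_types}
    for k, node_clfs in enumerate(classifiers):
        for t, clf in enumerate(node_clfs):
            scores[clf(x)] += alphas[k][t]
            for p in neighbors[k]:
                scores[classifiers[p][t](x)] += alphas[p][t]
    return max(scores, key=scores.get)  # step (9): index of the attack type
```

In this sketch each node votes with its own base classifiers and, with their own weights, those of its neighbors, mirroring the inner sum over p in the formulas above.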
Compared with the prior art, the invention has the following advantages:
1) because the invention introduces transfer learning, other existing labeled data can be used to guide the learning of attack types with lower detection rates, and data need not be collected again;
2) because the invention adjusts sample weights with the AdaBoost algorithm, it can select migration samples that are instructive for attack types with lower detection rates;
3) because the invention adopts a distributed network ensemble learning algorithm into which transfer learning is introduced, the generated abnormal behavior analysis learning machine has a higher detection rate for attack types whose original detection rate was low;
4) because the invention adopts a distributed network ensemble learning algorithm, the generated anomaly detection learning machine has high anomaly detection precision.
the invention is a network-based intrusion detection system, which can be used in various complex network environments. Simulation results show that for a standard large-scale network intrusion detection data set KDD CUP' 99, the detection rate of the intrusion detection method based on distributed migration network learning to the R2L attack type can be improved by about 87.3% compared with the intrusion detection method based on distributed migration network learning.
Drawings
FIG. 1 is a schematic diagram of an intrusion detection system based on distributed migration network learning according to the present invention;
FIG. 2 is a flowchart of an intrusion detection method based on distributed migration network learning according to the present invention;
FIG. 3 is a flow chart of pre-selection of source domain migration samples in the present invention;
fig. 4 is a flowchart of generating the abnormal behavior analysis learning machine in the present invention.
Detailed Description
The present invention will be described in detail below with reference to the accompanying drawings and specific embodiments.
Referring to fig. 1, the intrusion detection system based on distributed migration network learning of the present invention mainly includes: the system comprises a network behavior record preprocessing module, an abnormality detection module and an abnormal behavior analysis module. Wherein:
the network behavior record preprocessing module comprises: an existing record preprocessing submodule and a new record preprocessing submodule. The existing record preprocessing submodule is used for completing quantization and normalization processing on an existing labeled network behavior record set and transmitting parameters after quantization and normalization processing into the new record preprocessing submodule; the new record preprocessing submodule completes quantization and normalization processing on the new record by using the parameters transmitted by the existing record preprocessing submodule.
The abnormality detection module includes: an anomaly detection learning submodule and an anomaly detection testing submodule. The anomaly detection learning submodule divides the preprocessed existing labeled network behavior record set into a normal type and an abnormal type, randomly extracts partial samples from each, learns with a distributed network ensemble learning algorithm to generate an anomaly detection learning machine, and transmits the learning machine to the anomaly detection testing submodule. The anomaly detection testing submodule classifies and identifies the input preprocessed new record with the anomaly detection learning machine; if the output result is normal, the record is not processed and the detection ends, otherwise the record is transmitted to the abnormal behavior analysis module.
The abnormal behavior analysis module comprises: a migration sample pre-selection submodule, an abnormal behavior analysis learning submodule and an abnormal behavior analysis testing submodule. The migration sample pre-selection submodule sets a source domain sample and a target domain labeled sample for the existing labeled network behavior record, performs pre-selection on the source domain sample according to the target domain to-be-guided sample, and inputs the selected source domain migration sample into the abnormal behavior analysis learning submodule; the abnormal behavior analysis learning submodule uses the input source domain migration sample and the target domain labeled sample as training samples together, and learns by adopting a distributed network integrated learning algorithm introduced with migration learning to generate an abnormal behavior analysis learning machine; and the abnormal behavior analysis and test sub-module classifies and identifies the input abnormal behavior by adopting an abnormal behavior analysis learning machine and outputs the attack type of the input abnormal behavior.
The intrusion detection system of the invention quantizes and normalizes the existing network behavior records and stores the preprocessing parameters; it divides the preprocessed existing records into normal and abnormal records, extracts part of the data from each as training samples, and learns with a distributed network ensemble learning algorithm to obtain the anomaly detection learning machine; it takes the normal records as source domain samples and the other abnormal records as target domain samples, adjusts the sample weights with the AdaBoost algorithm, and pre-selects the source domain migration samples according to the sample weights; it randomly extracts a part of the samples from each of the four abnormal types of the preprocessed records, takes them and the pre-selected source domain migration samples as training samples, and learns with a distributed network ensemble algorithm into which transfer learning is introduced to generate the abnormal behavior analysis learning machine. When a new network behavior record is input, it is quantized and normalized with the parameters obtained from preprocessing the existing records, and the preprocessed result is input to the anomaly detection learning machine for testing. If the record is detected as normal, no further processing is performed and the detection ends; otherwise the preprocessed result is input to the abnormal behavior analysis learning machine for testing, and the attack type of the abnormal behavior record is finally output.
Referring to fig. 2, the intrusion detection method of the present invention includes the following steps:
Step 1: inputting the existing labeled network behavior record set X and performing quantization and normalization preprocessing on the data set to obtain the preprocessed result X'.
The specific preprocessing steps are as follows:
1a) for each attribute of X whose values are character strings, counting the value types and quantizing them to obtain the quantized result X_1;
1b) the value range of the source-address byte count and destination-address byte count attributes in X_1 is [0, 1.3 × 10^9]; applying a log10(·) transform to these two attribute values changes their range to [0.0, 9.14], giving the transformed result X_2;
1c) normalizing the transformed result X_2 as follows:
suppose X_2 contains n samples, each with d-dimensional features; represent the d feature vectors of the n samples as X_2 = [F_1, F_2, ..., F_d] and the i-th feature vector as F_i = [f_{i1}, f_{i2}, ..., f_{in}]. All features are normalized as:

f'_{ij} = f_{ij} / max(F_i), i = 1, 2, ..., d, j = 1, 2, ..., n
F'_i = [f'_{i1}, f'_{i2}, ..., f'_{in}], i = 1, 2, ..., d
X' = [F'_1, F'_2, ..., F'_d]

giving the preprocessed result X'.
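As an illustration of steps 1a)-1c), a minimal numpy sketch of the preprocessing: string quantization, log10 transform of the byte-count attributes, and per-feature max normalization. The function names, the +1 guard inside the log transform and the toy data in the test are assumptions, not from the patent.

```python
import numpy as np


def quantize_strings(column):
    """1a) Map each distinct string value of a symbolic attribute
    to an integer index (quantization)."""
    index = {v: i for i, v in enumerate(sorted(set(column)))}
    return np.array([index[v] for v in column], dtype=float)


def preprocess(X, byte_count_cols):
    """1b) log10-transform the byte-count attributes (range roughly
    [0, 1.3e9] -> about [0.0, 9.14]); 1c) divide every feature by its
    maximum. X is an (n_samples, d) float array."""
    X = X.astype(float).copy()
    for c in byte_count_cols:
        X[:, c] = np.log10(X[:, c] + 1.0)  # +1 guards log10(0); an added assumption
    col_max = X.max(axis=0)
    col_max[col_max == 0] = 1.0            # avoid division by zero
    return X / col_max
```

After `preprocess`, every feature lies in [0, 1], matching f'_{ij} = f_{ij} / max(F_i).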
Step 2: dividing the preprocessed result X' of the existing labeled network behavior record set into normal and abnormal types, where the abnormal type comprises M attack types; randomly extracting a part of the samples from the normal and the abnormal samples respectively as training samples, and training for T_1 rounds with a distributed network ensemble learning algorithm on a network topology containing K_1 nodes, to generate the classifier network system of the anomaly detection learning machine.
Step 3: setting the existing labeled samples of the other types in X' as the source domain sample set X^S with m samples, and the abnormal-type samples as the target domain sample set X^T; taking the samples of the abnormal types with lower detection rates in X^T as the target domain to-be-guided sample set X^{T1} with n_1 samples; and, according to X^S and X^{T1}, selecting the source domain sample subset X_sub^S of samples with larger weights by the weight-adjustment method used in the training process of the AdaBoost algorithm.
The process of selecting the larger-weight source domain sample subset X_sub^S is shown in fig. 3(a); the specific steps are as follows:
3a) inputting the source domain sample set X^S, the target domain sample set X^T, the labeled sample set X^{T1} of the target domain to-be-guided type, and the sample weight threshold W_1;
3b) since n_1 << m, setting the label of X^S to +1 and the label of X^{T1} to -1; to balance the two types of samples, dividing X^S evenly into [m/n_1] parts,

$$X^S = X_1^S \cup X_2^S \cup \cdots \cup X_{[m/n_1]}^S,$$

where [·] is the rounding operation, and forming the training sets

$$T_i^1 = X_i^S \cup X^{T1}, \quad i = 1, 2, \ldots, [m/n_1];$$

3c) inputting each T_i^1 (i = 1, 2, ..., [m/n_1]) into the AdaBoost algorithm for training; after several rounds of training, selecting from each T_i^1 the samples whose weight is larger than the threshold W_1 and which belong to X_i^S; these samples constitute the source domain sample subset X_sub^S.
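A toy end-to-end sketch of the pre-selection in steps 3a)-3c): a hand-rolled AdaBoost round loop over 1-D threshold stumps tracks the per-sample weights, and the source samples whose final weight exceeds a threshold are kept. The stump base learner and the default threshold (the mean weight) are illustrative assumptions; the patent fixes neither the base learner nor the value of W_1.

```python
import numpy as np


def best_stump(X, y, w):
    """Pick the weighted-error-minimizing single-feature threshold stump
    (an illustrative stand-in for the AdaBoost base learner)."""
    best = (0, 0.0, 1, 1.1)  # (feature, threshold, polarity, error)
    for f in range(X.shape[1]):
        for thr in np.unique(X[:, f]):
            for pol in (1, -1):
                pred = np.where(X[:, f] >= thr, pol, -pol)
                err = w[pred != y].sum()
                if err < best[3]:
                    best = (f, thr, pol, err)
    return best


def adaboost_weights(X, y, rounds=5):
    """Run the AdaBoost weight-update rule for a few rounds and return
    the final per-sample weights: samples that stay hard to separate
    keep larger weights, which steps 3c) and 4b) then threshold on."""
    w = np.full(len(y), 1.0 / len(y))
    for _ in range(rounds):
        f, thr, pol, err = best_stump(X, y, w)
        err = min(max(err, 1e-10), 1.0 - 1e-10)
        alpha = 0.5 * np.log((1.0 - err) / err)
        pred = np.where(X[:, f] >= thr, pol, -pol)
        w = w * np.exp(-alpha * y * pred)  # up-weight the mistakes
        w /= w.sum()
    return w


def preselect(X_src, X_tgt, rounds=5, w_thresh=None):
    """3b)-3c): label source +1 and to-be-guided target -1, train, and
    keep the source samples whose final weight exceeds w_thresh."""
    X = np.vstack([X_src, X_tgt])
    y = np.concatenate([np.ones(len(X_src)), -np.ones(len(X_tgt))])
    w = adaboost_weights(X, y, rounds)
    if w_thresh is None:
        w_thresh = w.mean()                # illustrative default for W_1
    return X_src[w[: len(X_src)] > w_thresh]
```

Step 4b) below is the mirror image: label X_sub^S against the other abnormal types and drop, rather than keep, the source samples whose weight exceeds the threshold.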
Step 4: taking the source domain sample subset X_sub^S together with the samples of the other types of the target domain as training samples, adjusting the sample weights again with the weight-adjustment method of the AdaBoost training process, removing from X_sub^S the source domain samples with larger weights, and letting the remaining samples of X_sub^S constitute the source domain migration sample set TR_D.
The flow of removing the larger-weight source domain samples from X_sub^S is shown in fig. 3(b); the specific steps are as follows:
4a) apart from the to-be-guided sample set X^{T1}, denoting the sample set of the other abnormal types in the target domain sample set X^T as X^{T2}; setting the label of X_sub^S to +1 and the label of X^{T2} to -1, and inputting the sample weight threshold W_2;
4b) combining X_sub^S and X^{T2} into a training set T^2 and inputting it into the AdaBoost algorithm for training to adjust the sample weights; after the rounds of training finish, removing from the training set the samples whose weight is larger than the threshold W_2 and which belong to X_sub^S; the remaining samples of X_sub^S constitute the source domain migration sample set TR_D.
Step 5: randomly sampling a part of the samples from the target domain samples X^T to form the target domain sample subset TR_S, taking it together with the selected source domain migration sample set TR_D as the training samples, giving TR_D the same label as the target domain to-be-guided samples, and training with the distributed network ensemble learning algorithm into which transfer learning is introduced to generate the abnormal behavior analysis learning machine. The specific process is shown in fig. 4:
5a) inputting the network topology containing K_2 nodes, the sampling rate ρ and the number of training rounds T_2, and distributing the target domain sample subset TR_S and the source domain migration sample set TR_D over the nodes to generate a training sample set S_k, k = 1, 2, ..., K_2, on each node;
5b) initializing the weight of the i-th sample on each node k as D_{k,1}(x_i) = 1/l_k, where l_k is the number of samples in the training set S_k on node k, i = 1, 2, ..., l_k, k = 1, 2, ..., K_2;
5c) drawing from each node's training set S_k, by weighted sampling with replacement, a training subset of the node containing l_k·ρ samples, k = 1, 2, ..., K_2, where t is the current training round;
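The weighted sampling with replacement in step 5c) can be done with numpy's `random.choice`; the function name and the seed handling are illustrative.

```python
import numpy as np


def draw_training_subset(S_k, weights, rho, seed=None):
    """5c) Draw l_k * rho samples from node k's training set S_k by
    weighted sampling with replacement, using the current sample
    weights D_{k,t} as the sampling distribution."""
    rng = np.random.default_rng(seed)
    p = np.asarray(weights, dtype=float)
    p = p / p.sum()                        # normalize to a distribution
    l_k = len(S_k)
    idx = rng.choice(l_k, size=int(l_k * rho), replace=True, p=p)
    return [S_k[i] for i in idx]
```

Because the draw is proportional to D_{k,t}, the hard samples up-weighted in earlier rounds appear more often in later training subsets.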
5d) training a base classifier C_{k,t}^2 at node k on the node's training subset, and using the base classifier C_{k,t}^2 to classify the training set S_k on the node;
5e) according to the pair SkCalculating a weighted error rate epsilon of the target domain samplesk,t
$$\varepsilon_{k,t} = \sum_{x_i \in S_k \cap TR_S} D_{k,t}(x_i)\, I\bigl[y(x_i) \neq h_{k,t}^{2}(x_i)\bigr], \qquad k = 1, 2, \ldots, K_2$$
where h_{k,t}^2(x_i) is the classification result of the base classifier C_{k,t}^2 for sample x_i, h_{k,t}^2(x_i) ∈ Y, Y = {1, 2, ..., M}, with 1, 2, ..., M being the index numbers of the M attack types, and y(x_i) is the known label of sample x_i;
5f) Compute the weight α_{k,t}^2 of the base classifier C_{k,t}^2:
$$\alpha_{k,t}^{2} = 0.5 \times \log\!\left(\frac{1 - \varepsilon_{k,t}}{\varepsilon_{k,t}}\right), \qquad k = 1, 2, \ldots, K_2$$
5g) Calculate the weight-update parameter of the target-domain samples as
$$\beta_{k,t} = \frac{\varepsilon_{k,t}}{1 - \varepsilon_{k,t}}$$
and the weight-update parameter of the source-domain migration samples as
$$\gamma_{k} = \frac{1}{1 + \sqrt{2\ln(m_k/T_2)}}$$
where m_k is the number of source-domain migration samples at node k;
5h) Update the weight D_{k,t}(x_i) of sample x_i at node k to obtain the updated weight D_{k,t+1}(x_i):
$$D_{k,t+1}(x_i) = \begin{cases} \dfrac{D_{k,t}(x_i)\cdot \beta_{k,t}^{\,\lambda_{k,t}(x_i)}}{Z_{k,t}}, & x_i \in S_k \cap TR_S \\[2ex] \dfrac{D_{k,t}(x_i)\cdot \gamma_{k}^{\,-\lambda_{k,t}(x_i)}}{Z_{k,t}}, & x_i \in S_k \cap TR_D \end{cases}$$
where x_i ∈ S_k ∩ TR_S denotes that x_i is a target-domain sample in S_k, x_i ∈ S_k ∩ TR_D denotes that x_i is a source-domain migration sample in S_k, and
$$Z_{k,t} = \sum_{x_i \in S_k \cap TR_S} D_{k,t}(x_i)\cdot \beta_{k,t}^{\,\lambda_{k,t}(x_i)} + \sum_{x_i \in S_k \cap TR_D} D_{k,t}(x_i)\cdot \gamma_{k}^{\,-\lambda_{k,t}(x_i)}$$
$$\lambda_{k,t}(x_i) = -2\alpha_{k,t}^{2}\bigl(I(y(x_i) \neq h_{k,t}^{2}(x_i)) - 1/2\bigr) - 2\sum_{p} \alpha_{p,t}^{2}\bigl(I(y(x_i) \neq h_{p,t}^{2}(x_i)) - 1/2\bigr)$$
where p is the index of a neighbor node of node k;
5i) When t < T_2, return to step 5c); when t = T_2, training finishes and all the base classifiers C_{k,t}^2, k = 1, 2, ..., K_2, t = 1, 2, ..., T_2 are obtained, forming the classifier network system of the abnormal-behavior-analysis learning machine.
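One training round at a node, steps 5c) through 5h), can be sketched as below. This is an illustrative reading of the formulas above with our own function names, not the patent's implementation; the base-classifier training itself (KMPLM in the embodiment) is elided, and note that γ_k is real-valued only when m_k ≥ T_2.

```python
import math
import random

def weighted_bootstrap(samples, weights, rho):
    # 5c): draw round(l_k * rho) samples with replacement, each drawn
    # with probability proportional to its current weight D_{k,t}
    return random.choices(samples, weights=weights, k=round(len(samples) * rho))

def weighted_error(D, is_target, errors):
    # 5e): eps_{k,t} counts only the misclassified TARGET-domain samples;
    # source-domain migration samples are excluded by construction
    return sum(Di for Di, tgt, e in zip(D, is_target, errors) if tgt and e)

def classifier_weight(eps):
    # 5f): alpha_{k,t}^2 = 0.5 * log((1 - eps) / eps)
    return 0.5 * math.log((1.0 - eps) / eps)

def lam(alpha_k, err_k, neighbours):
    # lambda_{k,t}(x_i): own-node term plus the neighbour-node terms;
    # err flags are True when y(x_i) != h^2(x_i), and neighbours is a
    # list of (alpha_p, err_p) pairs for the neighbours p of node k
    v = -2.0 * alpha_k * ((1.0 if err_k else 0.0) - 0.5)
    for alpha_p, err_p in neighbours:
        v -= 2.0 * alpha_p * ((1.0 if err_p else 0.0) - 0.5)
    return v

def update_weights(D, is_target, lambdas, eps, m_k, T2):
    # 5g)-5h): beta^{lambda} for target samples, gamma^{-lambda} for
    # migrated source samples, then normalisation by Z_{k,t}
    beta = eps / (1.0 - eps)
    gamma = 1.0 / (1.0 + math.sqrt(2.0 * math.log(m_k / T2)))
    new = [Di * (beta ** l if tgt else gamma ** (-l))
           for Di, tgt, l in zip(D, is_target, lambdas)]
    Z = sum(new)
    return [d / Z for d in new]
```

The asymmetry is the transfer-learning ingredient: target-domain samples follow the usual boosting dynamics through β_{k,t}, while migrated source samples are gradually down-weighted through γ_k when they disagree with the ensemble.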
Step 6: Input a new network behavior record x″ and preprocess it to obtain the preprocessed result x‴.
6a) Quantize the string-valued attributes according to the method in step 1a); the quantized result is x″_1;
6b) Apply the log10(·) transformation to the source-address byte-count and destination-address byte-count attributes respectively, obtaining the transformed result x″_2;
6c) Denote the d-dimensional features of the record x″_2 as x″_2 = {f″_1, f″_2, ..., f″_d}, and perform the following normalization on x″_2 to obtain the preprocessed result x‴:
$$f_i''' = \begin{cases} f_i'' / \max(F_i), & f_i'' \le \max(F_i) \\ 1, & f_i'' > \max(F_i) \end{cases}, \qquad i = 1, 2, \ldots, d$$
x‴ = {f‴_1, f‴_2, ..., f‴_d}.
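Step 6 amounts to a log-transform of the byte-count attributes followed by clip-normalisation against the per-feature training-set maxima max(F_i). A sketch under assumed names (the actual feature indices of the byte-count attributes are not reproduced here):

```python
import math

def preprocess(record, feature_max, log_indices=()):
    """Apply log10 to the attributes listed in log_indices (e.g. the
    source/destination byte counts), then normalise each feature by the
    training-set maximum of the *transformed* feature, clipping at 1."""
    transformed = [math.log10(v) if i in log_indices and v > 0 else v
                   for i, v in enumerate(record)]
    return [min(v / mx, 1.0) if mx > 0 else 0.0
            for v, mx in zip(transformed, feature_max)]
```

Clipping at 1 keeps test-time records bounded even when a feature exceeds every value seen during training.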
Step 7: Input x‴ into the classifier network system of the anomaly-detection learning machine generated in step 2 for classification, obtaining the classification result:
$$H_1(x''') = \mathrm{sign}\!\left(\sum_{k=1}^{K_1}\sum_{t=1}^{T_1}\Bigl(\alpha_{k,t}^{1}\, h_{k,t}^{1}(x''') + \sum_{p} \alpha_{p,t}^{1}\, h_{p,t}^{1}(x''')\Bigr)\right)$$
where h_{k,t}^1(x‴) is the classification result of each base classifier in the anomaly-detection learning machine for x‴ and α_{k,t}^1 is the weight of each base classifier. When H_1(x‴) = 1, the record belongs to the normal type; no processing is performed and the detection process ends. When H_1(x‴) = -1, the record is of an abnormal type, and the process proceeds to step 8.
And 8: inputting x' into the classifier network system of the abnormal behavior analysis learning machine generated in the step 5 for classification, and obtaining a classification result:
<math><mrow><msub><mi>H</mi><mn>2</mn></msub><mrow><mo>(</mo><msup><mi>x</mi><mrow><mo>&prime;</mo><mo>&prime;</mo><mo>&prime;</mo></mrow></msup><mo>)</mo></mrow><mo>=</mo><munder><mrow><mi>arg</mi><mi>max</mi></mrow><mrow><mi>y</mi><mo>&Element;</mo><mi>Y</mi></mrow></munder><mrow><mo>(</mo><munderover><mi>&Sigma;</mi><mrow><mi>k</mi><mo>=</mo><mn>1</mn></mrow><msub><mi>K</mi><mn>2</mn></msub></munderover><munderover><mi>&Sigma;</mi><mrow><mi>t</mi><mo>=</mo><mn>1</mn></mrow><msub><mi>T</mi><mn>2</mn></msub></munderover><mrow><mo>(</mo><msubsup><mi>&alpha;</mi><mrow><mi>k</mi><mo>,</mo><mi>t</mi></mrow><mn>2</mn></msubsup><mi>I</mi><mrow><mo>[</mo><msubsup><mi>h</mi><mrow><mi>k</mi><mo>,</mo><mi>t</mi></mrow><mn>2</mn></msubsup><mrow><mo>(</mo><msup><mi>x</mi><mrow><mo>&prime;</mo><mo>&prime;</mo><mo>&prime;</mo></mrow></msup><mo>)</mo></mrow><mo>=</mo><mi>y</mi><mo>]</mo></mrow><mo>+</mo><munder><mi>&Sigma;</mi><mi>p</mi></munder><msubsup><mi>&alpha;</mi><mrow><mi>p</mi><mo>,</mo><mi>t</mi></mrow><mn>2</mn></msubsup><mi>I</mi><mrow><mo>[</mo><msubsup><mi>h</mi><mrow><mi>p</mi><mo>,</mo><mi>t</mi></mrow><mn>2</mn></msubsup><mrow><mo>(</mo><msup><mi>x</mi><mrow><mo>&prime;</mo><mo>&prime;</mo><mo>&prime;</mo></mrow></msup><mo>)</mo></mrow><mo>=</mo><mi>y</mi><mo>]</mo></mrow><mo>)</mo></mrow><mo>)</mo></mrow></mrow></math>
wherein h isk,t 2(x '") is the result of classifying x'" by each base classifier in the abnormal behavior analysis learning machine, and <math><mrow><msubsup><mi>h</mi><mrow><mi>k</mi><mo>,</mo><mi>t</mi></mrow><mn>2</mn></msubsup><mrow><mo>(</mo><msup><mi>x</mi><mrow><mo>&prime;</mo><mo>&prime;</mo><mo>&prime;</mo></mrow></msup><mo>)</mo></mrow><mo>&Element;</mo><mi>Y</mi><mo>,</mo></mrow></math> H2(x″′)∈Y。
and step 9: h is to be2(x') as an index number, searching the attack type corresponding to the index number, and outputting the attack type as a final detection result.
The effect of the invention can be further illustrated with the following large-scale standard intrusion-detection simulation data:
1. Simulation conditions
The simulation runs under Windows XP SP1 on a Pentium(R) 4 CPU with a base frequency of 2.4 GHz; the software platform is VC++ 6.0. The raw intrusion-detection data selected for simulation come from the public data set KDD CUP'99, which comprises a Normal type and four intrusion types: DOS, Probe, R2L, and U2R; each network behavior record contains 41 features. Two subsets, 'kddcup99.data.10_percent' and 'corrected', are selected: 'kddcup99.data.10_percent' serves as the training data set and 'corrected' as the test data set. The distribution of samples is shown in Table 1.
Table 1 data set sample distribution
2. Simulation result
The specific implementation process of the simulation intrusion detection of the invention is as follows:
(1) Input the data set 'kddcup99.data.10_percent' into the existing network behavior record preprocessing submodule for preprocessing;
(2) Divide the data in the preprocessed 'kddcup99.data.10_percent' set into two types: Normal is the normal type with label +1, and DOS, Probe, R2L, and U2R are abnormal types with label -1. Extract 5000 samples from the Normal data and 10000 from the abnormal data as training samples, and learn with the distributed network ensemble learning algorithm to obtain the anomaly-detection learning machine. The simulation uses a BA scale-free network with 20 nodes, a sampling rate of 0.7, a weight-update parameter of 0.8, and T_1 = 10 training rounds; the base classifier is the kernel matching pursuit learning machine KMPLM;
(3) Take the Normal samples, 97278 in number, as the source-domain samples, the R2L samples as the target-domain samples to be guided, and the DOS, Probe, and U2R samples as the other target-domain types. Pre-select the Normal samples to obtain a source-domain migration sample set of 1874 samples; in the simulation, the base classifier of the AdaBoost algorithm is the kernel matching pursuit learning machine KMPLM;
(4) Randomly extract samples from the four abnormal types in the preprocessed 'kddcup99.data.10_percent' set in the following proportions: DOS 2.5%, Probe 75%, R2L 100%, U2R 100%. Together with the source-domain migration sample set these serve as training samples, where the source-domain migration samples are given the same label as the R2L type. Train with the distributed network ensemble learning algorithm introducing transfer learning to obtain the abnormal-behavior-analysis learning machine. The simulation selects a BA scale-free network with 20 nodes, a sampling rate of 0.6, and T_2 = 10 training rounds; the base classifier is the kernel matching pursuit learning machine KMPLM;
(5) Preprocess the data set 'corrected';
(6) Input the preprocessed 'corrected' data set into the anomaly-detection learning machine for testing; the simulation results are shown in Table 2;
TABLE 2 anomaly detection accuracy
(7) Input the abnormal data of the preprocessed 'corrected' data set into the abnormal-behavior-analysis learning machine for testing; the simulation results are shown in Table 3, where DTNL denotes the distributed network ensemble learning algorithm introducing transfer learning.
TABLE 3 abnormal behavior analysis detection accuracy
As can be seen from Table 2, the anomaly-detection learning machine generated by distributed network ensemble learning achieves a high detection rate for both the normal and the abnormal types.
As can be seen from Table 3, the simulation generates abnormal-behavior-analysis learning machines with the distributed network ensemble learning algorithm DNB and with the distributed network ensemble learning algorithm DTNL that introduces transfer learning. Compared with DNB, DTNL improves the detection rate for R2L by about 87.3%, while the detection rates for the other abnormal behaviors show no significant decrease.
The entire intrusion-detection process realizes its functions through a computer program to complete the detection of network behaviors.
This embodiment is implemented on the premise of the technical scheme of the invention and gives a detailed implementation mode and a specific operation process, but the protection scope of the invention is not limited to this embodiment.

Claims (5)

1. An intrusion detection system based on distributed migration network learning, comprising:
a network behavior record preprocessing module, comprising an existing-record preprocessing submodule and a new-record preprocessing submodule, wherein the existing-record preprocessing submodule completes quantization and normalization of an existing labeled network behavior record set and passes the quantization and normalization parameters to the new-record preprocessing submodule, and the new-record preprocessing submodule quantizes and normalizes a new network behavior record with the parameters passed from the existing-record preprocessing submodule and transmits the processed result to the anomaly detection module;
an anomaly detection module, comprising an anomaly-detection learning submodule and an anomaly-detection testing submodule, wherein the anomaly-detection learning submodule divides the preprocessed existing labeled network behavior record set into a normal type and an abnormal type, randomly extracts partial samples from each, learns with a distributed network ensemble learning algorithm to generate an anomaly-detection learning machine, and passes the learning machine to the anomaly-detection testing submodule; and the anomaly-detection testing submodule classifies and identifies the input preprocessed new network behavior record with the anomaly-detection learning machine: if the output result is normal, no processing is performed and detection ends; otherwise the record is transmitted to the abnormal behavior analysis module;
an abnormal behavior analysis module, comprising a migration-sample pre-selection submodule, an abnormal-behavior-analysis learning submodule, and an abnormal-behavior-analysis testing submodule, wherein the migration-sample pre-selection submodule sets source-domain samples and target-domain labeled samples from the existing labeled network behavior records, completes pre-selection of the source-domain samples according to the target-domain samples to be guided, and inputs the selected source-domain migration samples into the abnormal-behavior-analysis learning submodule; the abnormal-behavior-analysis learning submodule takes the input source-domain migration samples and the target-domain labeled samples together as training samples and learns with a distributed network ensemble learning algorithm introducing transfer learning to generate an abnormal-behavior-analysis learning machine; and the abnormal-behavior-analysis testing submodule classifies and identifies the input abnormal records with the abnormal-behavior-analysis learning machine and outputs their attack types.
2. An intrusion detection method based on distributed migration network learning, comprising the following steps:
(1) Input an existing labeled network behavior record set X and perform quantization and normalization preprocessing on it to obtain the preprocessed result X′;
(2) Divide the preprocessed result X′ of the existing labeled network behavior record set into two types, normal and abnormal, where the abnormal type comprises M attack types; randomly extract a portion of samples from the normal and the abnormal samples respectively, and perform T_1 rounds of training with the distributed network ensemble learning algorithm on a network topology containing K_1 nodes to generate the classifier network system of the anomaly-detection learning machine;
(3) Set the normal-type samples in X′ as the source-domain sample set X_S with m samples, and the abnormal-type samples as the target-domain sample set X_T; in X_T, take the samples of the abnormal type with a lower detection rate as the target-domain sample set to be guided, X_T1, with n_1 samples; divide X_S evenly into [m/n_1] parts, where [·] is the rounding operation; combine each part with X_T1 into a training set, adjust the sample weights with the sample-weight adjustment method used in the training process of the AdaBoost algorithm, and select the source-domain sample subset X_S^sub with larger weights;
(4) Take the source-domain sample subset X_S^sub and the samples of the other target-domain types as training samples, and again adjust the sample weights with the sample-weight adjustment method used in the training process of the AdaBoost algorithm, thereby removing from X_S^sub the source-domain samples with larger weights; the remaining samples constitute the source-domain migration sample set TR_D;
(5) Randomly sample a portion of samples from the target-domain sample set X_T to form a target-domain sample subset TR_S, take it together with the selected source-domain migration sample set TR_D as training samples, and assign TR_D the same label as the target-domain samples to be guided; lay out a network topology containing K_2 nodes, input the sampling rate ρ and the number of training rounds T_2, and distribute TR_S and TR_D over the nodes to generate a training sample set S_k on each node, k = 1, 2, ..., K_2; then generate the abnormal-behavior-analysis learning machine from the training sample sets by the following steps:
5a) Initialize the weights of the samples in each node's training sample set S_k;
5b) Perform weighted sampling with replacement on each node's training sample set S_k to obtain each node's training subset, train within each node's learning algorithm to obtain each node's base classifier C_{k,t}^2, and classify S_k with each node's base classifier, where t is the current training round;
5c) From the classification results on S_k, calculate the weighted error rate ε_{k,t} of the target-domain samples on each node, and from ε_{k,t} calculate the weight α_{k,t}^2 of each base classifier;
5d) Update the weights of the source-domain migration samples and the target-domain samples; when t < T_2, return to step 5b); when t = T_2, training finishes, and all the base classifiers C_{k,t}^2 (k = 1, 2, ..., K_2, t = 1, 2, ..., T_2) thus obtained form the classifier network system of the abnormal-behavior-analysis learning machine;
(6) Quantize and normalize the existing network behavior records and record the parameters obtained from this preprocessing; when a new network behavior record x″ is input, perform quantization and normalization preprocessing on it with the parameters obtained from preprocessing the existing network behavior records, obtaining the preprocessed network behavior record result x‴;
(7) Input x‴ into the classifier network system of the anomaly-detection learning machine generated in step (2) for classification, obtaining the classification result:
$$H_1(x''') = \mathrm{sign}\!\left(\sum_{k=1}^{K_1}\sum_{t=1}^{T_1}\Bigl(\alpha_{k,t}^{1}\, h_{k,t}^{1}(x''') + \sum_{p} \alpha_{p,t}^{1}\, h_{p,t}^{1}(x''')\Bigr)\right)$$
where h_{k,t}^1(x‴) is the classification result of each base classifier in the anomaly-detection learning machine for x‴, α_{k,t}^1 is the weight of each base classifier, and p is the neighbor-node index of node k; when H_1(x‴) = 1, the record belongs to the normal type, no processing is performed, and the detection process ends; when H_1(x‴) = -1, the record is of an abnormal type, and the process proceeds to step (8);
(8) Input x‴ into the classifier network system of the abnormal-behavior-analysis learning machine generated in step (5) for classification, obtaining the classification result:
$$H_2(x''') = \arg\max_{y \in Y}\left(\sum_{k=1}^{K_2}\sum_{t=1}^{T_2}\Bigl(\alpha_{k,t}^{2}\, I\bigl[h_{k,t}^{2}(x''') = y\bigr] + \sum_{p} \alpha_{p,t}^{2}\, I\bigl[h_{p,t}^{2}(x''') = y\bigr]\Bigr)\right)$$
where h_{k,t}^2(x‴) is the classification result of each base classifier in the abnormal-behavior-analysis learning machine for x‴, h_{k,t}^2(x‴) ∈ Y, Y = {1, 2, ..., M} with 1, 2, ..., M being the index numbers of the M attack types, I[·] is the indicator function taking the value 0 or 1, and H_2(x‴) ∈ Y;
(9) Use H_2(x‴) as an index number, look up the attack type corresponding to that index number, and output it as the final detection result.
3. The method of claim 2, wherein selecting the source-domain sample subset X_S^sub with larger sample weights in step (3) comprises the following steps:
3a) Set the label of each part of X_S to +1 and the label of X_T1 to -1, and input the sample-weight threshold W_1;
3b) Input each training set into the AdaBoost algorithm respectively for training to adjust the sample weights; after multiple rounds of training, the samples selected from each training set whose weight is larger than the threshold W_1 and which belong to X_S together constitute the source-domain sample subset X_S^sub.
4. The method of claim 2, wherein removing the source-domain samples with larger weights from X_S^sub in step (4) is performed by the following steps:
4a) In the target-domain samples X_T, apart from the to-be-guided samples X_T1, denote the other anomaly types as X_T2; set the label of X_S^sub to +1 and the label of X_T2 to -1, and input the sample-weight threshold W_2;
4b) Combine X_S^sub and X_T2 into a training set T_2 and input it into the AdaBoost algorithm for training to adjust the sample weights; after multiple rounds of training, remove the samples in T_2 whose weight is greater than the threshold W_2 and which belong to X_S^sub; the remaining samples of X_S^sub constitute the source-domain migration sample set TR_D.
5. The method according to claim 2, wherein the weights of the source-domain migration samples and the target-domain samples in step 5d) are updated by the following steps:
5a) Separately calculate the weight-update parameter of the target-domain samples,
$$\beta_{k,t} = \frac{\varepsilon_{k,t}}{1 - \varepsilon_{k,t}},$$
and the weight-update parameter of the source-domain migration samples,
$$\gamma_{k} = \frac{1}{1 + \sqrt{2\ln(m_k/T_2)}},$$
where m_k is the number of source-domain migration samples contained in S_k at node k;
5b) updating sample x at node kiWeight D ofk,t(xi) Get the updated weight Dk,t+1(xi):
<math><mrow><msub><mi>D</mi><mrow><mi>k</mi><mo>,</mo><mi>t</mi><mo>+</mo><mn>1</mn></mrow></msub><mrow><mo>(</mo><msub><mi>x</mi><mi>i</mi></msub><mo>)</mo></mrow><mo>=</mo><mfenced open='{' close=''><mtable><mtr><mtd><mfrac><mrow><msub><mi>D</mi><mrow><mi>k</mi><mo>,</mo><mi>t</mi></mrow></msub><mrow><mo>(</mo><msub><mi>x</mi><mi>i</mi></msub><mo>)</mo></mrow><mo>&CenterDot;</mo><msup><msub><mi>&beta;</mi><mrow><mi>k</mi><mo>,</mo><mi>t</mi></mrow></msub><mrow><msub><mi>&lambda;</mi><mrow><mi>k</mi><mo>,</mo><mi>t</mi></mrow></msub><mrow><mo>(</mo><msub><mi>x</mi><mi>i</mi></msub><mo>)</mo></mrow></mrow></msup></mrow><msub><mi>Z</mi><mrow><mi>k</mi><mo>,</mo><mi>t</mi></mrow></msub></mfrac><mo>,</mo></mtd><mtd><msub><mi>x</mi><mi>i</mi></msub><mo>&Element;</mo><msub><mi>S</mi><mi>k</mi></msub><mo>&cap;</mo><msub><mi>TR</mi><mi>S</mi></msub></mtd></mtr><mtr><mtd><mfrac><mrow><msub><mi>D</mi><mrow><mi>k</mi><mo>,</mo><mi>t</mi></mrow></msub><mrow><mo>(</mo><msub><mi>x</mi><mi>i</mi></msub><mo>)</mo></mrow><mo>&CenterDot;</mo><msup><msub><mi>&gamma;</mi><mi>k</mi></msub><mrow><mo>-</mo><msub><mi>&lambda;</mi><mrow><mi>k</mi><mo>,</mo><mi>t</mi></mrow></msub><mrow><mo>(</mo><msub><mi>x</mi><mi>i</mi></msub><mo>)</mo></mrow></mrow></msup></mrow><msub><mi>Z</mi><mrow><mi>k</mi><mo>,</mo><mi>t</mi></mrow></msub></mfrac><mo>,</mo></mtd><mtd><msub><mi>x</mi><mi>i</mi></msub><mo>&Element;</mo><msub><mi>S</mi><mi>k</mi></msub><mo>&cap;</mo><msub><mi>TR</mi><mi>D</mi></msub></mtd></mtr></mtable></mfenced></mrow></math>
Wherein x isi∈Sk∩TRSDenotes xiBelong to SkTarget domain samples of (1), xi∈Sk∩TRDDenotes xiBelong to SkThe source domain in (1) migrates the sample, and
<math><mrow><msub><mi>Z</mi><mrow><mi>k</mi><mo>,</mo><mi>t</mi></mrow></msub><mo>=</mo><munder><mi>&Sigma;</mi><mrow><msub><mi>x</mi><mi>i</mi></msub><mo>&Element;</mo><msub><mi>S</mi><mi>k</mi></msub><mo>&cap;</mo><msub><mi>TR</mi><mi>S</mi></msub></mrow></munder><msub><mi>D</mi><mrow><mi>k</mi><mo>,</mo><mi>t</mi></mrow></msub><mrow><mo>(</mo><mi>i</mi><mo>)</mo></mrow><mo>&CenterDot;</mo><msup><msub><mi>&beta;</mi><mrow><mi>k</mi><mo>,</mo><mi>t</mi></mrow></msub><mrow><msub><mi>&lambda;</mi><mrow><mi>k</mi><mo>,</mo><mi>t</mi></mrow></msub><mrow><mo>(</mo><msub><mi>x</mi><mi>i</mi></msub><mo>)</mo></mrow></mrow></msup><mo>+</mo><munder><mi>&Sigma;</mi><mrow><msub><mi>x</mi><mi>i</mi></msub><mo>&Element;</mo><msub><mi>S</mi><mi>k</mi></msub><mo>&cap;</mo><msub><mi>TR</mi><mi>D</mi></msub></mrow></munder><msub><mi>D</mi><mrow><mi>k</mi><mo>,</mo><mi>t</mi></mrow></msub><mrow><mo>(</mo><mi>i</mi><mo>)</mo></mrow><mo>&CenterDot;</mo><msup><msub><mi>&gamma;</mi><mi>k</mi></msub><mrow><mo>-</mo><msub><mi>&lambda;</mi><mrow><mi>k</mi><mo>,</mo><mi>t</mi></mrow></msub><mrow><mo>(</mo><msub><mi>x</mi><mi>i</mi></msub><mo>)</mo></mrow></mrow></msup></mrow></math>
<math><mrow><msub><mi>&lambda;</mi><mrow><mi>k</mi><mo>,</mo><mi>t</mi></mrow></msub><mrow><mo>(</mo><msub><mi>x</mi><mi>i</mi></msub><mo>)</mo></mrow><mo>=</mo><mo>-</mo><mn>2</mn><msubsup><mi>&alpha;</mi><mrow><mi>k</mi><mo>,</mo><mi>t</mi></mrow><mn>2</mn></msubsup><mrow><mo>(</mo><mi>I</mi><mrow><mo>(</mo><mi>y</mi><mrow><mo>(</mo><msub><mi>x</mi><mi>i</mi></msub><mo>)</mo></mrow><mo>&NotEqual;</mo><msubsup><mi>h</mi><mrow><mi>k</mi><mo>,</mo><mi>t</mi></mrow><mn>2</mn></msubsup><mrow><mo>(</mo><msub><mi>x</mi><mi>i</mi></msub><mo>)</mo></mrow><mo>)</mo></mrow><mo>-</mo><mn>1</mn><mo>/</mo><mn>2</mn><mo>)</mo></mrow><mo>-</mo><mn>2</mn><munder><mi>&Sigma;</mi><mi>p</mi></munder><msubsup><mi>&alpha;</mi><mrow><mi>p</mi><mo>,</mo><mi>t</mi></mrow><mn>2</mn></msubsup><mrow><mo>(</mo><mi>I</mi><mrow><mo>(</mo><mi>y</mi><mrow><mo>(</mo><msub><mi>x</mi><mi>i</mi></msub><mo>)</mo></mrow><mo>&NotEqual;</mo><msubsup><mi>h</mi><mrow><mi>p</mi><mo>,</mo><mi>t</mi></mrow><mn>2</mn></msubsup><mrow><mo>(</mo><msub><mi>x</mi><mi>i</mi></msub><mo>)</mo></mrow><mo>)</mo></mrow><mo>-</mo><mn>1</mn><mo>/</mo><mn>2</mn><mo>)</mo></mrow></mrow></math>
wherein y(x_i) is the known label of sample x_i, and h^2_{k,t}(x_i) is the classification result of the classifier h^2_{k,t} for sample x_i, with <math><mrow><msubsup><mi>h</mi><mrow><mi>k</mi><mo>,</mo><mi>t</mi></mrow><mn>2</mn></msubsup><mrow><mo>(</mo><msub><mi>x</mi><mi>i</mi></msub><mo>)</mo></mrow><mo>&Element;</mo><mi>Y</mi><mo>.</mo></mrow></math>
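The weight update above is a TrAdaBoost-style rule: target-domain samples in TR_S are reweighted by β_{k,t} raised to the exponent λ_{k,t}(x_i), migrated source-domain samples in TR_D by γ_k raised to −λ_{k,t}(x_i), then everything is divided by the normalizer Z_{k,t}. A minimal sketch of one such round follows; the function and variable names are illustrative, not taken from the patent, and α^2_{k,t} is treated here as the (given) weight of classifier h^2_{k,t}:

```python
import numpy as np

def lambda_exponent(alpha_k, miss_k, alphas_p, miss_p):
    """Exponent lambda_{k,t}(x_i) for one sample.

    alpha_k  : weight alpha^2_{k,t} of the local classifier h^2_{k,t}
    miss_k   : 1.0 if h^2_{k,t} misclassifies x_i, else 0.0 (the indicator I)
    alphas_p : weights alpha^2_{p,t} of the other distributed classifiers
    miss_p   : misclassification indicators for those classifiers
    """
    return (-2.0 * alpha_k * (miss_k - 0.5)
            - 2.0 * np.sum(np.asarray(alphas_p) * (np.asarray(miss_p) - 0.5)))

def update_weights(D, beta, gamma, lam, is_target):
    """One round of the distribution update D_{k,t+1} over subset S_k.

    D         : current weights D_{k,t}(x_i)
    beta      : multiplier beta_{k,t} for target-domain samples (S_k ∩ TR_S)
    gamma     : multiplier gamma_k for migrated source samples (S_k ∩ TR_D)
    lam       : per-sample exponents lambda_{k,t}(x_i)
    is_target : boolean mask, True where x_i ∈ S_k ∩ TR_S
    """
    new = np.where(is_target,
                   D * beta ** lam,       # target-domain branch
                   D * gamma ** (-lam))   # migrated source-domain branch
    return new / new.sum()                # divide by Z_{k,t} so weights sum to 1
```

Note that the two branches move weights in opposite directions for the same λ: a sign that consistently misclassified source samples are gradually suppressed while informative target samples are emphasized.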
CN2009100230731A 2009-06-26 2009-06-26 Distributed migration network learning-based intrusion detection system and method thereof Expired - Fee Related CN101582813B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2009100230731A CN101582813B (en) 2009-06-26 2009-06-26 Distributed migration network learning-based intrusion detection system and method thereof

Publications (2)

Publication Number Publication Date
CN101582813A CN101582813A (en) 2009-11-18
CN101582813B true CN101582813B (en) 2011-07-20

Family

ID=41364786

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2009100230731A Expired - Fee Related CN101582813B (en) 2009-06-26 2009-06-26 Distributed migration network learning-based intrusion detection system and method thereof

Country Status (1)

Country Link
CN (1) CN101582813B (en)

Families Citing this family (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101794396B (en) * 2010-03-25 2012-12-26 西安电子科技大学 System and method for recognizing remote sensing image target based on migration network learning
CN102176698A (en) * 2010-12-20 2011-09-07 北京邮电大学 Method for detecting abnormal behaviors of user based on transfer learning
CN102387135B (en) * 2011-09-29 2015-01-28 北京邮电大学 User identity filtering method and firewall
CN103218431B (en) * 2013-04-10 2016-02-17 金军 A kind ofly can identify the system that info web gathers automatically
CN104008426A (en) * 2014-05-15 2014-08-27 上海交通大学 Distributed computing environment performance predicting method based on integrated learning
CN108604304A (en) * 2016-01-20 2018-09-28 商汤集团有限公司 For adapting the depth model indicated for object from source domain to the method and system of aiming field
CN108154029A (en) * 2017-10-25 2018-06-12 上海观安信息技术股份有限公司 Intrusion detection method, electronic equipment and computer storage media
CN107612938A (en) * 2017-10-27 2018-01-19 朱秋华 A kind of network user's anomaly detection method, device, equipment and storage medium
CN108322445A (en) * 2018-01-02 2018-07-24 华东电力试验研究院有限公司 A kind of network inbreak detection method based on transfer learning and integrated study
CN108040073A (en) * 2018-01-23 2018-05-15 杭州电子科技大学 Malicious attack detection method based on deep learning in information physical traffic system
CN108197670B (en) * 2018-01-31 2021-06-15 国信优易数据股份有限公司 Pseudo label generation model training method and device and pseudo label generation method and device
CN109672666B (en) * 2018-11-23 2021-12-14 北京丁牛科技有限公司 Network attack detection method and device
CN109462610A (en) * 2018-12-24 2019-03-12 哈尔滨工程大学 A kind of network inbreak detection method based on Active Learning and transfer learning
CN109492193B (en) * 2018-12-28 2020-11-27 同济大学 Abnormal network data generation and prediction method based on deep machine learning model
CN109523018B (en) * 2019-01-08 2022-10-18 重庆邮电大学 Image classification method based on deep migration learning
CN110224987B (en) * 2019-05-08 2021-09-17 西安电子科技大学 Method for constructing network intrusion detection model based on transfer learning and detection system
CN110348486A (en) * 2019-06-13 2019-10-18 中国科学院计算机网络信息中心 Based on sampling and feature brief non-equilibrium data collection conversion method and system
CN110365583B (en) * 2019-07-17 2020-05-22 南京航空航天大学 Symbol prediction method and system based on bridge domain transfer learning
CN110399856B (en) * 2019-07-31 2021-09-14 上海商汤临港智能科技有限公司 Feature extraction network training method, image processing method, device and equipment
CN110995459B (en) * 2019-10-12 2021-12-14 平安科技(深圳)有限公司 Abnormal object identification method, device, medium and electronic equipment
CN110880020B (en) * 2019-10-30 2022-10-25 西安交通大学 Self-adaptive trans-regional base station energy consumption model migration and compensation method
CN111131185B (en) * 2019-12-06 2022-12-09 中国电子科技网络信息安全有限公司 CAN bus network anomaly detection method and device based on machine learning
CN111016720A (en) * 2019-12-23 2020-04-17 深圳供电局有限公司 Attack identification method based on K nearest neighbor algorithm and charging device
CN111666979B (en) * 2020-05-13 2023-09-08 北京科技大学 Underwater scene target detection integration method and system for label generation
CN111340144B (en) * 2020-05-15 2020-08-11 支付宝(杭州)信息技术有限公司 Risk sample detection method and device, electronic equipment and storage medium
CN111652297B (en) * 2020-05-25 2021-05-25 哈尔滨市科佳通用机电股份有限公司 Fault picture generation method for image detection model training
CN112153000B (en) * 2020-08-21 2023-04-18 杭州安恒信息技术股份有限公司 Method and device for detecting network flow abnormity, electronic device and storage medium
CN112200254A (en) * 2020-10-16 2021-01-08 鹏城实验室 Network intrusion detection model generation method, detection method and electronic equipment
CN112348202B (en) * 2021-01-05 2021-03-30 博智安全科技股份有限公司 Method for establishing rule model in machine learning
CN115118450B (en) * 2022-05-17 2024-01-05 北京理工大学 Incremental dynamic weight integrated learning intrusion detection method integrating multistage features

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109101395A (en) * 2018-07-27 2018-12-28 曙光信息产业(北京)有限公司 A kind of High Performance Computing Cluster application monitoring method and system based on LSTM
CN110113353A (en) * 2019-05-20 2019-08-09 桂林电子科技大学 A kind of intrusion detection method based on CVAE-GAN
CN110113353B (en) * 2019-05-20 2021-06-22 桂林电子科技大学 Intrusion detection method based on CVAE-GAN

Also Published As

Publication number Publication date
CN101582813A (en) 2009-11-18

Similar Documents

Publication Publication Date Title
CN101582813B (en) Distributed migration network learning-based intrusion detection system and method thereof
Olusola et al. Analysis of KDD’99 intrusion detection dataset for selection of relevance features
CN111259219B (en) Malicious webpage identification model establishment method, malicious webpage identification method and malicious webpage identification system
Huang et al. Deep android malware classification with API-based feature graph
Mohammadpour et al. A mean convolutional layer for intrusion detection system
CN111047173B (en) Community credibility evaluation method based on improved D-S evidence theory
CN113360906A (en) Interpretable graph-embedding-based Android malware automatic detection
Harbola et al. Improved intrusion detection in DDoS applying feature selection using rank & score of attributes in KDD-99 data set
Blanco et al. Applying cost-sensitive classifiers with reinforcement learning to ids
Bhati et al. An ensemble model for network intrusion detection using adaboost, random forest and logistic regression
Thanh et al. An approach to reduce data dimension in building effective network intrusion detection systems
Lasky et al. Machine Learning Based Approach to Recommend MITRE ATT&CK Framework for Software Requirements and Design Specifications
Tarun et al. Exploration of CNN with Node Centred Intrusion Detection Structure Plan for Green Cloud
KR20210142443A (en) Method and system for providing continuous adaptive learning over time for real time attack detection in cyberspace
Yang et al. Learning vector quantization neural network method for network intrusion detection
Nachan et al. Intrusion Detection System: A Survey
Yazdani et al. Intelligent Detection of Intrusion into Databases Using Extended Classifier System.
CN116647374B (en) Network flow intrusion detection method based on big data
Deshpande A Review on Intrusion Detection System using Artificial Intelligence Approach
Janaki et al. Enhancing Intrusion Detection with Advanced Feature Extraction Through Machine Learning and Deep Learning Methods
CN117579324B (en) Intrusion detection method based on gating time convolution network and graph
Ashfaq et al. Efficient rule generation for cost-sensitive misuse detection using genetic algorithms
Abbas IDS feature reduction using two algorithms
Mavaluru Using Machine Learning, An Intrusion Detection and Prevention System for Malicious Crawler Detection in e-Learning Systems
Saravanakumar A real time approach on genetically evolving intrusion detection using neutrosophic logic inference system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20110720

Termination date: 20150626

EXPY Termination of patent right or utility model