CN102789593A - Intrusion detection method based on incremental GHSOM (Growing Hierarchical Self-organizing Maps) neural network


Info

Publication number
CN102789593A
CN102789593A
Authority
CN
China
Prior art keywords
neuron
vector
subnet
winning
ghsom
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2012102067789A
Other languages
Chinese (zh)
Other versions
CN102789593B (en)
Inventor
杨雅辉
黄海珍
沈晴霓
吴中海
夏敏
阳时来
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Peking University
Original Assignee
Peking University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Peking University filed Critical Peking University
Priority to CN201210206778.9A priority Critical patent/CN102789593B/en
Publication of CN102789593A publication Critical patent/CN102789593A/en
Application granted granted Critical
Publication of CN102789593B publication Critical patent/CN102789593B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The invention discloses an intrusion detection method based on an incremental GHSOM (Growing Hierarchical Self-organizing Maps) neural network, and belongs to the technical field of network information security. The method comprises the following steps: 1) acquiring network data online and inputting them to an intrusion detection module; 2) computing, in the intrusion detection module, the winning neuron t used to detect the current vector x; 3) if t is a covering neuron and x is of the same type as t, using t to detect x; otherwise labelling x with an unknown attack type tag and adding x to the incremental training set; 4) when t satisfies the expansion condition, expanding a virtual neuron t' below t, then expanding a new SOM (Self-organizing Map) from t', and training it with the incremental training set I_t corresponding to t; 5) finding the mature parent neuron of the newly expanded SOM subnet and, if it satisfies the conditions for deleting an immature subnet, deleting the dynamically expanded immature network and training it again; and 6) judging whether an intrusion has occurred according to the detection result output by the intrusion detection module. The intrusion detection method can detect various intrusion behaviors in a timely manner, in particular newly emerging intrusion behaviors.

Description

Intrusion detection method based on an incremental GHSOM neural network
Technical field
The present invention relates to an intrusion detection method, and in particular to an intrusion detection method based on an Incremental Growing Hierarchical Self-organizing Maps (IGHSOM) neural network; it belongs to the technical field of computer network information security.
Background technology
With the continuous expansion of computer network scale and the rapid development of network technology, computer networks have become closely intertwined with daily life, and network security has drawn increasing attention. In recent years the frequency, propagation speed, scope of damage and destructiveness of network attacks have all kept growing. How to keep personal information from being stolen, and how to resist attacks or attempted attacks from outside the network and from inside the system, have become important topics for the network security industry.
Many solutions to the network security problem have been proposed, the best known being the firewall. A firewall can block some unauthorized access to a system through filtering and access control, but it lacks an active response to the ever-changing attacks in a network environment and cannot provide sufficient protection. Intrusion detection, as an active defense technique, complements conventional security mechanisms: by monitoring the usage state of the network, user behavior and system anomalies, or by analyzing and processing network packets, it can detect intrusion behaviors or attempts by users outside the network system and by the system itself, and respond in real time; its introduction further improves the security of a network system.
Intrusion detection methods are divided into anomaly detection and misuse detection. Anomaly detection judges whether user behavior is abnormal by measuring its deviation from normal patterns, and includes anomaly detection methods based on feature selection, Bayesian networks, machine learning, data mining, neural networks and so on. Misuse detection matches current activity against predefined intrusion patterns or intrusion rules, and includes misuse detection methods based on conditional probability, state transition analysis, keystroke monitoring, expert systems, model-based misuse reasoning and Petri net state transitions.
Neural network algorithms offer adaptivity, self-learning, self-organization, fault tolerance and robustness, support large-scale parallel computation and provide good nonlinear mapping, which makes them well suited to the ever-changing intrusion detection environment. The self-organizing map (SOM) is a typical neural network method. However, the structure of a traditional SOM neural network model is fixed and cannot change, so the choice of the initial number of neurons directly determines the structure and detection accuracy of the final neural network model. The Growing Hierarchical Self-organizing Map (GHSOM) is a variant of the SOM model: it can adaptively adjust and expand subnets, better reflects the complex hierarchical relations that may exist in the data, and thus largely overcomes the SOM model's inability to update dynamically.
The GHSOM neural network algorithm is based on batch learning: it assumes that all training samples are available at once, learns from them, and stops after a predetermined number of learning iterations, after which it learns no new knowledge. In a real intrusion detection application, however, attack types emerge continuously, so training samples covering all attack types are normally obtained gradually over time, and the internal information they reflect may change with time. If the algorithm has to retrain on the full data set every time new samples arrive, its time and space complexity inevitably rise and new intrusion behaviors cannot be detected in time.
Summary of the invention
In view of the problems in the prior art, the object of the present invention is to provide an incremental intrusion detection method that performs incremental learning online, detecting while learning, on the basis of a stable intrusion detection model, so that the intrusion detection model is updated dynamically during detection and various intrusion behaviors, especially newly emerging ones, can be detected in a more timely manner.
The structure of a traditional SOM neural network is fixed and cannot change dynamically, and some neurons may never win during training and become redundant. GHSOM overcomes these shortcomings to some extent, but the GHSOM algorithm is based on batch learning: the learning process stops after a predetermined number of learning iterations and no new knowledge can be learned, which makes it powerless in real intrusion detection networks where attack types emerge continuously; and retraining on the full data set again and again inevitably raises the time and space complexity of the algorithm.
To address these problems, the present invention proposes an intrusion detection method based on an Incremental Growing Hierarchical Self-organizing Maps (IGHSOM) neural network, which improves the intelligence of the intrusion detection model. The main work comprises: 1) adding a similarity judgment to the detection process, used to decide whether the detected vector is of the same type as the winning neuron; 2) dynamically constructing the training data set of a newly expanded layer during detection; 3) proposing a dynamic layer-expansion scheme for the incremental GHSOM neural network; 4) designing a control mechanism for the scale of the incremental GHSOM neural network.
To facilitate the discussion of the subsequent content, the following definitions are given first:
Definition 1: mapped vector. A winning vector that falls on a neuron during training is called a mapped vector.
Definition 2: winning vector. A winning vector that falls on a neuron during detection is called a winning vector.
Definition 3: covering neuron. If a neuron can be covered by a sufficiently small hypersphere, i.e. the Euclidean distances from the vectors mapped onto the neuron to the neuron's weight vector are all smaller than some expected value, the neuron is called a covering neuron; that is, it satisfies the following formula:
‖i - w_t‖ < ζ (I)
where i is any vector in the mapped vector set of neuron t, w_t is the weight vector of neuron t, and ζ is a constant chosen experimentally so that the mapped vectors of this neuron are concentrated. In this work the value of ζ is obtained experimentally and set to 0.11.
Definition 4: mature neuron. A mature neuron is a neuron that does not change during the execution of the incremental learning algorithm. All neurons in the initially trained GHSOM neural network are mature neurons. A neuron in a dynamically expanded subnet is a mature neuron if it satisfies all of the following necessary conditions: 1) a mature neuron that is not a virtual root neuron must be a covering neuron; 2) the virtual root neuron of a mature neuron is also a mature neuron; 3) all neurons in the SOM network where the mature neuron is located are covering neurons; 4) all neurons in the SOM networks of all upper layers of the mature neuron (including direct and indirect upper layers) are mature neurons.
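As an illustration only, a minimal Python sketch of the covering-neuron test of Definition 3 is given below; the function name, the argument names and the use of NumPy are assumptions of this sketch, not part of the patent.

```python
import numpy as np

def is_covering(mapped_vectors, weight, zeta=0.11):
    """Definition 3: a neuron is a covering neuron when every vector mapped
    onto it lies within Euclidean distance zeta of the neuron's weight vector."""
    mapped_vectors = np.asarray(mapped_vectors, dtype=float)
    if mapped_vectors.size == 0:
        return True                      # nothing mapped yet, so the bound is not violated
    dists = np.linalg.norm(mapped_vectors - np.asarray(weight, dtype=float), axis=1)
    return bool(np.all(dists < zeta))
```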
The technical scheme of the present invention is as follows:
An intrusion detection method based on an incremental growing hierarchical self-organizing map neural network, comprising the following steps:
1) Initialization. The dynamic incremental GHSOM neural network model is initialized to the neural network model trained by the intrusion detection method that uses a growing hierarchical self-organizing map neural network, and the initialized dynamic incremental GHSOM neural network model is loaded into the intrusion detection and dynamic incremental learning module.
2) Network data are collected from the network and feature extraction is performed on them to generate detection pattern vectors that the neural network can recognize; the preprocessed detection pattern vectors are input to the intrusion detection and dynamic incremental learning module, which carries out incremental learning so that the GHSOM model is updated dynamically during intrusion detection. The incremental learning (including intrusion detection) process is:
a. The current detection pattern vector, acquired online from network traffic and feature-extracted, is assigned to x, and the neuron of the first-layer SOM with the smallest distance to x is selected as the winning neuron t.
b. If the winning neuron t can be used for detection, jump to d.
c. If the winning neuron t is not a leaf neuron, continue to search for the winning neuron in the sublayer of the winning neuron t, assign that neuron to t, and return to b; otherwise the winning neuron t is a leaf neuron: output the detection result and adjust the incremental training set of the winning neuron. When the winning neuron t is a non-covering neuron and the incremental set I_t is empty, then I_t = {x} ∪ M_t; in all other cases only I_t = I_t ∪ {x} is required. Jump to e.
d. Compute the Euclidean distance dis(x, t) between vector x and the winning neuron t and the similarity threshold S_t of the winning neuron t. If dis(x, t) > S_t, the detection pattern vector x and the winning neuron t are of different types; return to c. Otherwise vector x is detected successfully: judge whether the winning neuron t is a covering neuron; if it is, output the detection result and return to a, and if it is not, return to c.
e. When the winning leaf neuron t satisfies the expansion condition, i.e. the number of vectors in the incremental training set reaches a multiple of the expansion parameter Ex, a virtual neuron t' is expanded below neuron t, and a new 2 × 2 SOM is then expanded from the virtual neuron t'. The incremental training set I_t of neuron t is used as the mapped vector set of the virtual neuron t', the mean of the vectors in the set I_t is used as the initial weight of the virtual neuron t', and the virtual neuron t' serves as the parent neuron. The newly expanded SOM subnet is trained as follows:
1. An input vector is selected at random from the set I_t, the Euclidean distance between this input vector and each neuron in the newly expanded SOM is computed, the neuron with the smallest distance is the winning neuron, and the weights of the winning neuron and of the neurons in its neighborhood are adjusted.
2. After the predetermined number of learning iterations is reached, the mean quantization error MQE_m of all neurons in this SOM and the quantization error qe_t' of the parent virtual neuron t' are computed. If MQE_m ≥ τ1 × qe_t', a new row or column of neurons is inserted between the neuron with the largest quantization error value and its most distant neighbor, with initial weight vectors equal to the mean of the weight vectors of the adjacent map units, and the procedure returns to 1; otherwise every neuron in the SOM layer has become stable, and the training process of the newly expanded SOM ends.
f. Find the mature parent neuron of the newly expanded SOM subnet and assign it to t, and judge whether neuron t satisfies the condition for deleting immature subnets. If it does, all immature subnets under neuron t are deleted, and the incremental training sets of the covering neurons and the mapped vectors of the non-covering leaf neurons in all deleted subnets are merged to form the incremental training set of neuron t; return to e.
g. If the online detection process stops, the algorithm ends; otherwise return to a.
3) The intrusion analysis and processing module judges whether an intrusion is currently taking place according to the output of the intrusion detection module. The attack type library is traversed to look for a matching attack type; if one is found, the current behavior is an intrusion behavior; if none is found, manual labeling or another mechanism is needed to judge whether the current behavior is of a normal type or a newly emerging attack type. If the current behavior is identified as a newly emerging attack type, this attack type is added to the attack type library.
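Purely for illustration, a minimal Python sketch of the online detection and incremental learning loop of steps a-g above follows; the model object and every helper named in it (find_winner, is_covering, is_same_type, label_of, add_to_increment_set, increment_set, expand_layer, mature_parent, should_prune, prune_immature) are assumptions of the sketch, not an implementation disclosed by the patent.

```python
def online_detect(model, vector_stream, Ex=1000):
    """Illustrative driver for steps a-g; `model` is assumed to expose the
    hypothetical helpers used below."""
    for x in vector_stream:                               # step a: next online detection vector
        t = model.find_winner(x)                          # steps a-c: top-down winner search
        if model.is_covering(t) and model.is_same_type(x, t):
            yield model.label_of(t)                       # step d: detected as a known type
        else:
            yield "unknown attack type"                   # unknown type, collect for retraining
            model.add_to_increment_set(t, x)              # step c: grow the incremental set I_t
        n = len(model.increment_set(t))
        if n and n % Ex == 0:                             # step e: expansion condition reached
            model.expand_layer(t)                         # virtual neuron t' plus a new 2x2 SOM
            parent = model.mature_parent(t)               # step f: scale control
            if model.should_prune(parent):
                model.prune_immature(parent)
```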
Further, the expansion parameter Ex is 1000 and τ1 is 0.07.
Further, the similarity threshold S_t of the winning neuron t is computed as:
S_t = max_{i ∈ M_t} ‖i - w_t‖
where M_t is the mapped vector set of the winning neuron, i is any vector in the set M_t, and w_t is the weight vector of the winning neuron.
Further, the condition for deleting an immature subnet is: 1) the total number of layers of the immature subnet is greater than α1; and 2) the total number of immature neurons is greater than α2, where α1 is the layer control parameter of the immature subnet and α2 is the neuron-count control parameter of the immature subnet. Determined experimentally, these two control parameters take the values α1 = 3 and α2 = 15.
Compared with the prior art, the beneficial effects of the present invention are:
The method updates the intrusion detection model dynamically during the intrusion detection process. Without destroying knowledge that has already been learned, it can detect and identify unknown attack types, and on the same training data the detection performance of the incremental GHSOM algorithm is comparable to that of the traditional GHSOM algorithm. Furthermore, the method can effectively control the growth scale of the intrusion detection model: it not only reduces the space consumption of the dynamic incremental GHSOM algorithm, but also guarantees that the controlled incremental GHSOM network can still be updated dynamically while its maturity is strengthened.
Description of drawings
Fig. 1 is a diagram of the dynamic incremental GHSOM learning process;
Fig. 2 is a flowchart of the dynamic incremental GHSOM neural network training algorithm;
Fig. 3 is a schematic diagram of neuron insertion;
Fig. 4 shows the network structure after layer expansion;
Fig. 5 is a topology diagram of the dynamic incremental GHSOM network;
Fig. 6 is the intrusion detection process flowchart.
Embodiment
The present invention is explained in further detail below with reference to the accompanying drawings:
The intrusion detection system of the present invention consists of two parts: offline training of the neural network model and online detection based on the neural network model. The system collects offline sample data of known attack types from the network as the initial training sample data set and performs offline training; once the intrusion detection model is obtained, online network intrusion detection begins. The offline training process uses the traditional GHSOM neural network training algorithm to train the initial neural network model on the initial training data set. During online detection the incremental GHSOM neural network learning algorithm is run so that the GHSOM network model is updated dynamically in the detection process. Clearly, offline training merely initializes the intrusion detection model; the incremental GHSOM neural network learning algorithm is the core technology of the neural-network-based intrusion detection system.
The online detection and learning process of the incremental GHSOM is elaborated below with emphasis; the intrusion detection procedure and method are described briefly afterwards.
Online detection and learning process of the incremental GHSOM
The incremental learning algorithm performs incremental learning online, detecting while learning, on the basis of a mature GHSOM model, so that the GHSOM model is updated dynamically during detection. Before the algorithm starts, a batch of offline data of known types is collected as the initial training data set, and the traditional GHSOM algorithm is used to train and generate the initial GHSOM network model. Online network intrusion detection then begins on the basis of the initial GHSOM model, and the model is updated dynamically by running the incremental GHSOM learning algorithm during detection. The incremental GHSOM learning process (including the detection process) is shown in Fig. 1.
The incremental GHSOM neural network algorithm is the core technology of the neural-network-based intrusion detection system, and the present invention puts forward an intrusion detection method based on an incremental GHSOM neural network.
The flow of the incremental GHSOM neural network training algorithm of the present invention is shown in Fig. 2:
1. A detection vector is acquired online and input to the existing GHSOM model.
A 39-dimensional feature vector is extracted from the application traffic of the real network. A concrete feature vector is as follows:
tcp unk_tcp 1029 139 4 1 2 2.00 1.00 0 0.00 1.00 1 0.33 3 1.00 1.00 0.00 0.00 0.67 43 0 1 0 43.00 0.00 0.00 0.00 1 0 43.00 0.00 0 0 0.00 0.00 0 0 0.00 0.00 nuke
All symbolic features used (such as the protocol types TCP, UDP, etc.) are converted into numerical form and then normalized to generate a numeric detection vector that the neural network can recognize.
2. Compute the winning neuron that can be used for detection.
Based on the dynamic incremental GHSOM network, a top-down search following the layer-expansion relations of the incremental neural network is used to compute and find the winning neuron that can be used for detection.
The winning neuron is computed as: t = arg min_k ‖x - w_k‖, k = 1, 2, ..., m
where x denotes any detection vector, t denotes the winning neuron of detection vector x, m denotes the number of neurons in the current SOM subnet, and w_k denotes the weight vector of neuron k.
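As a minimal illustration of this formula within a single SOM subnet (the function and argument names are assumptions of this sketch):

```python
import numpy as np

def winning_neuron(x, weights):
    """Return the index of the winning neuron: the neuron whose weight vector
    w_k has the smallest Euclidean distance to the detection vector x."""
    x = np.asarray(x, dtype=float)
    weights = np.asarray(weights, dtype=float)   # shape (m, d): one row per neuron
    return int(np.argmin(np.linalg.norm(weights - x, axis=1)))
```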
To guarantee that knowledge already learned is not destroyed, the neurons that can be used for detection in the dynamic incremental GHSOM neural network comprise the leaf neurons of the initial network and of the network after expansion, and the covering neurons in the non-leaf layers; covering neurons exist both among the leaf neurons and among the non-leaf neurons.
For a non-leaf neuron that can be used for detection, it must be judged whether the detection vector is similar to the winning neuron; if they are of the same type, the detection process of the current detection vector ends, and if they are of different types the search for the winning neuron of the detection vector continues in the subnet of the winning neuron.
For a leaf neuron that can be used for detection, it must be judged whether the winning neuron is a covering neuron. If it is a covering neuron, it must be judged whether the detection vector is similar to the winning neuron: if they are of the same type, the detection process of the current detection vector ends; if they are of different types, the incremental training set of the winning neuron must be constructed. If it is a non-covering neuron, the incremental training set of the winning neuron is constructed directly.
3. Judge whether the detection vector is similar to the winning neuron that can be used for detection.
Judging whether the detection vector is of the same type as the winning neuron is in fact judging the similarity between the detection vector and the weight vector of the winning neuron. Therefore the similarity threshold of the winning neuron must be determined: when the distance between the detection vector and the weight vector of this winning neuron is smaller than the similarity threshold, the detection vector is similar to this winning neuron and belongs to the same type.
A detection vector can be regarded as a point in a high-dimensional space. Assuming that the similarity between vectors of the same type is greater than the similarity between vectors of different types, vectors of the same type appear densely distributed in the high-dimensional space while there are gaps between vectors of different types. The distribution region of vectors of the same type can therefore be covered by a closed region of space, forming a similar region.
Definition 5: similar region. A similar region is the distribution region formed by vectors with high similarity. For the incremental GHSOM neural network model, the mapped vectors all have high similarity to their neuron; therefore the hypersphere with the weight vector of a leaf neuron as its center and the maximum Euclidean distance from the mapped vectors to this neuron as its radius can be regarded as a similar region. Each leaf neuron corresponds to one similar region.
Inference 1: if a detection vector falls inside the similar region of a neuron, the detection vector is similar to and of the same type as this neuron; otherwise it is dissimilar.
Definition 6: similarity threshold. Because each leaf neuron corresponds to one similar region, it also corresponds to one similarity threshold; the similarity threshold is the radius of the similar region, i.e. the maximum distance from the mapped vectors to their winning neuron.
Let S_t be the similarity threshold of neuron t; then S_t is computed as:
S_t = max_{i ∈ M_t} ‖i - w_t‖ (II)
where i is any vector in the mapped vector set of neuron t, w_t is the weight vector of neuron t, and M_t is the mapped vector set of neuron t.
The Euclidean distance between the detection vector x and neuron t is computed as:
dis(x, t) = ‖x - w_t‖ (III)
For any detection vector x, if dis(x, t) > S_t, the detection vector x does not lie in the similar region of neuron t and is dissimilar to neuron t; vector x can then be considered not to belong to the type characterized by neuron t and may be a newly emerging attack type. Otherwise the detection vector x is similar to neuron t and belongs to the same type.
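A minimal sketch of the same-type test built from formulas (II) and (III); the function names are assumptions of this sketch.

```python
import numpy as np

def similarity_threshold(mapped_vectors, weight):
    """Formula (II): S_t is the largest Euclidean distance from a mapped
    vector of neuron t to the neuron's weight vector w_t."""
    mapped_vectors = np.asarray(mapped_vectors, dtype=float)
    return float(np.max(np.linalg.norm(mapped_vectors - weight, axis=1)))

def is_same_type(x, weight, s_t):
    """Formula (III) plus the decision rule: x is of the same type as
    neuron t when dis(x, t) = ||x - w_t|| does not exceed S_t."""
    return float(np.linalg.norm(np.asarray(x, dtype=float) - weight)) <= s_t
```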
4. If the current winning neuron is a leaf neuron, construct the incremental training set of this neuron.
If the current winning neuron is a leaf neuron, only when the winning neuron is a covering neuron and the detection vector is similar to the winning neuron is the detection successful, and only then does the incremental training set of the winning neuron not need to be constructed. In all other cases the detection vector has not found a winning neuron that detects it successfully, so the detection vector may be a newly emerging attack type and must be collected to be used for training in a newly expanded subnet. The training set of this newly expanded subnet is called the incremental training set.
The incremental training set may therefore contain detection vectors as well as mapped vectors from neurons, so a construction mechanism for the incremental training set needs to be designed. Each leaf neuron has a corresponding incremental training set.
According to Definition 3, what the covering neuron essentially reflects is how concentrated the distribution of the vectors mapped onto the neuron is, which determines whether a neuron plays a clustering role and whether its mapped vectors need to be retrained.
Inference 2: if the current neuron is not a covering neuron, then the mapped vectors and the winning vectors on this neuron all need to be retrained in the newly expanded SOM network, i.e. both the mapped vectors produced during training and the winning vectors produced during detection are added to the incremental training set.
Inference 3: if the current neuron is a covering neuron, then only the winning vectors that are dissimilar to this neuron during detection need to be added to the incremental training set of the current neuron, to be retrained in the newly expanded subnet.
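A minimal sketch of how Inferences 2 and 3 (together with step c above) govern the construction of the incremental training set I_t; the function and argument names are assumptions of this sketch.

```python
def update_increment_set(increment_set, mapped_vectors, x, covering):
    """Illustrative construction of I_t: a non-covering neuron seeds I_t with
    its mapped vectors on the first dissimilar detection vector (Inference 2);
    a covering neuron collects only the dissimilar detection vectors
    (Inference 3)."""
    if not covering and not increment_set:
        increment_set.extend(mapped_vectors)   # retrain the mapped vectors as well
    increment_set.append(x)                    # the dissimilar detection vector itself
    return increment_set
```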
5. Judge whether the current winning leaf neuron satisfies the expansion condition.
During online detection the incremental training set keeps growing dynamically; once it has grown to a certain scale, it is considered able to train a relatively mature new subnet, and a new layer of subnet is therefore expanded dynamically. The notion of the expansion condition is introduced for this dynamic layer-expansion process.
Definition 7: expansion condition. When the number |I_t| of vectors in the incremental training set I_t of a leaf neuron t reaches an integer multiple of the set threshold Ex, I_t can be used as the initial training set to train a layer of relatively mature SOM network with better clustering, where Ex is the expansion parameter of any leaf neuron.
To distinguish the traditional GHSOM model from the incremental GHSOM model, a virtual neuron corresponding to the expansion condition is introduced.
Definition 8: virtual neuron. A virtual neuron is a neuron expanded during the dynamic layer-expansion process that only plays a role in guiding detection and does not itself have the function of detecting attack types. Its weight does not change and equals the mean of the vectors in the incremental training set of its parent neuron, and its mapped vector set equals the incremental training set of the parent neuron.
When a dynamic layer expansion is carried out downward from a leaf neuron t, a virtual neuron t' is first expanded downward from t, and a new 2 × 2 SOM network is then expanded downward from the virtual neuron t'. The expansion method is described as follows:
With the virtual neuron t' as the parent neuron and I_t as the initial training set, the traditional GHSOM learning method is used to train the newly expanded layer of 2 × 2 SOM network.
In the traditional GHSOM algorithm the quantization error of each neuron is computed every time the predetermined number of learning iterations (which must be set in advance) is reached. To judge whether new neurons should be added to the current SOM layer, the algorithm first computes the product of the parameter τ1 and qe_u, where qe_u is the quantization error of the parent neuron of the current SOM; if the quantization error of some neuron in the current SOM is greater than the product of τ1 and qe_u, a row or column of neurons is inserted between the error neuron and its most distant neighbor. The neuron-insertion process is shown in Fig. 3.
In the incremental GHSOM algorithm the training set of the newly expanded layer comes from the incremental training set of the parent neuron, whereas in the traditional GHSOM algorithm it comes from the mapped vector set of the parent neuron; therefore in the incremental GHSOM algorithm whether a row or column is inserted is judged by whether a neuron's quantization error is greater than the product of τ1 and qe_t', where qe_t' is the quantization error of the virtual neuron t' and τ1 is the intra-layer expansion control coefficient.
When no new rows or columns need to be added to the newly expanded SOM neural network, the network has become stable and the training of the expanded layer ends. The network structure after layer expansion is shown in Fig. 4. Compared with the SOM learning method with a fixed structure and the GHSOM learning method, this incremental learning method has the advantage of adjusting the subnet: the SOM network structure is adjusted adaptively according to the intrinsic relations among the vectors in the incremental training set.
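Purely as an illustration of this dynamic layer-expansion training loop, a simplified Python sketch follows; the neighborhood update, the single-unit growth step (the patent inserts a whole row or column of neurons), the unit cap and all names are simplifications and assumptions made only to keep the sketch short and runnable.

```python
import numpy as np

def train_expanded_som(I_t, tau1=0.07, epochs=50, lr=0.5, max_units=16, rng=None):
    """Illustrative training of the SOM expanded below a virtual neuron t'.
    I_t is the incremental training set as an (n, d) array; tau1 is the
    intra-layer expansion control coefficient."""
    rng = np.random.default_rng(rng)
    I_t = np.asarray(I_t, dtype=float)
    w_virtual = I_t.mean(axis=0)                                # weight of virtual neuron t'
    qe_parent = np.linalg.norm(I_t - w_virtual, axis=1).mean()  # quantization error qe_t'
    weights = I_t[rng.choice(len(I_t), size=4)].copy()          # initial 2 x 2 SOM (4 units)

    while True:
        for _ in range(epochs):                                 # step 1: competitive learning
            x = I_t[rng.integers(len(I_t))]
            k = int(np.argmin(np.linalg.norm(weights - x, axis=1)))
            weights[k] += lr * (x - weights[k])                 # move the winning unit toward x
        # step 2: quantization errors after the predetermined number of iterations
        winners = np.argmin(np.linalg.norm(I_t[:, None, :] - weights[None, :, :], axis=2), axis=1)
        qe = np.array([np.linalg.norm(I_t[winners == k] - weights[k], axis=1).mean()
                       if np.any(winners == k) else 0.0
                       for k in range(len(weights))])
        if qe.mean() < tau1 * qe_parent or len(weights) >= max_units:
            return weights                                      # MQE_m < tau1 * qe_t': stable
        worst = int(np.argmax(qe))                              # grow next to the worst unit
        weights = np.vstack([weights, (weights[worst] + w_virtual) / 2.0])
```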
6. Judge whether the scale of the dynamically updated incremental GHSOM network satisfies the control condition.
During online detection the dynamic incremental GHSOM network keeps expanding SOM subnets adaptively as new attack types are continuously input; the model structure may eventually be expanded dynamically to an uncontrollable scale and crash the system. A strategy therefore needs to be designed so that the scale of the dynamic incremental GHSOM neural network is effectively controlled; this control mechanism is required not only to reduce the space consumption of the dynamic incremental GHSOM algorithm, but also to make the controlled incremental GHSOM model more mature while keeping it able to update dynamically.
The following definitions are introduced first:
Definition 9: mature GHSOM neural network. The necessary and sufficient condition for an incremental GHSOM neural network to be mature is that all neurons in the neural network are mature neurons. The initial GHSOM neural network is a mature GHSOM neural network.
The incremental GHSOM network comprises one mature GHSOM network, grown dynamically on the basis of the initial GHSOM neural network, and several immature GHSOM subnets that are continuously and dynamically expanded from the mature neural network. What a mature incremental GHSOM network essentially reflects is a neural network whose clustering is effective; what an immature subnet essentially reflects is a subnet whose clustering effect is not yet complete.
During detection and dynamic learning, the initial GHSOM neural network adaptively expands SOM subnets as new attack types are continuously input. If the neurons in an SOM subnet newly expanded under a leaf neuron node of the mature neural network are all mature neurons, the expanded subnet is merged into the mature GHSOM subnet; otherwise an immature GHSOM subnet is dynamically expanded under this mature neuron node, and this immature subnet keeps being expanded dynamically. The neural network structure may eventually be expanded dynamically to an uncontrollable scale and crash the system, so a mechanism needs to be designed so that the scale of the dynamic incremental GHSOM neural network is effectively controlled; this control mechanism is required not only to reduce the space consumption of the dynamic incremental GHSOM algorithm, but also to guarantee that the controlled incremental GHSOM network can still be updated dynamically while its maturity is strengthened.
The basic idea of the control method is: 1) delete immature GHSOM subnets when certain conditions are satisfied; 2) collect the mapped vectors on the neurons of the deleted subnets and regenerate an incremental training set; 3) use the dynamic layer-expansion method described above to expand again a layer of SOM subnet that is closer to maturity.
Inference 4: because what an immature subnet reflects is a subnet whose clustering effect is poor, an immature subnet can be deleted under certain conditions and a new SOM subnet can be trained and expanded again, whereas a mature subnet keeps growing, strengthening its maturity and continuously enhancing its effective detection capability while keeping its original detection capability.
The basic purpose of proposing the control mechanism is to control the scale of the dynamic incremental GHSOM neural network and to make the incremental GHSOM network model more mature, so an immature subnet can be deleted once the GHSOM subnet that has not yet become mature grows large enough. To describe the condition for deleting an immature subnet, the deletion condition is introduced.
Condition for deleting an immature subnet: 1) the total number of layers of the immature subnet is greater than α1; and 2) the total number of immature neurons is greater than α2, where α1 is the layer control parameter of the immature subnet and α2 is the neuron-count control parameter of the immature subnet. In the experiments α1 = 3 and α2 = 15 are generally adopted.
Because immature subnets are all newly expanded subnets, the mapped vectors on the neurons of the newly expanded subnets reflect the attack types newly added during online detection. In addition, because the mapped vectors and the winning vectors of a non-covering neuron are all used to train the newly expanded subnet during subnet expansion, the mapped vectors of a non-covering neuron that has expanded a subnet duplicate the mapped vectors of the neurons of its sublayer, so the mapped vectors on non-covering neurons after layer expansion do not need to be collected. Therefore, after all the immature subnets under a certain mature neuron node have been deleted, the mapped vectors of all covering neuron nodes and the incremental training sets of all leaf neuron nodes in the deleted immature subnets need to be collected as the incremental training set of this mature neuron node.
Based on this incremental training set, a virtual neuron and a new layer of SOM subnet are dynamically expanded from the mature neuron according to the dynamic layer-expansion method, where the weight of the virtual neuron equals the mean of the vectors in the incremental training set of the mature neuron. This SOM subnet is the simplified version of the large GHSOM subnet that existed before deletion. It contains only one layer of SOM subnet, so the network scale is effectively controlled; and because the training data are richer and more comprehensive, the maturity of the retrained network structure keeps improving and its detection capability is enhanced.
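A minimal sketch of this scale-control step under assumed data structures; the node attributes, the expand_layer callable and the exact combination of collected vectors follow the description above but are assumptions of this sketch, not structures defined by the patent.

```python
def prune_and_retrain(mature_neuron, expand_layer, alpha1=3, alpha2=15):
    """Illustrative scale control: delete the immature subnets under a mature
    neuron once they exceed alpha1 layers and alpha2 immature neurons, collect
    a fresh incremental training set from them, and expand one simplified SOM
    layer again via the supplied expand_layer callable."""
    subnet = mature_neuron.immature_subnet
    if subnet is None:
        return
    if not (subnet.num_layers > alpha1 and subnet.num_immature_neurons > alpha2):
        return                                          # deletion condition not met
    training_set = []
    for node in subnet.nodes:                           # collect vectors for retraining
        if node.is_covering:
            training_set.extend(node.mapped_vectors)    # mapped vectors of covering nodes
        elif node.is_leaf:
            training_set.extend(node.increment_set)     # incremental sets of leaf nodes
    mature_neuron.immature_subnet = None                # delete the immature subnets
    mature_neuron.increment_set = training_set
    expand_layer(mature_neuron)                         # re-expand one SOM layer (see step e)
```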
Topology of the incremental GHSOM neural network
In general, the incremental GHSOM eventually forms a topology similar to that shown in Fig. 5. The white dots denote the neurons of the non-leaf layers of the initial GHSOM model (which cannot be used for detection), the white dots inside the dashed boxes denote virtual neurons, the black dots denote covering neurons, and the grey dots denote non-covering neurons. The part enclosed by the dashed line is the currently mature GHSOM neural network, and the other three subnets are immature subnets.
Compared with the traditional GHSOM neural network structure, the dynamic incremental GHSOM neural network structure has the following characteristics:
(1) The dynamic incremental learning algorithm performs incremental learning online, detecting while learning, on the basis of the initially mature GHSOM neural network, so that the GHSOM model is updated dynamically during detection. The incremental GHSOM network structure therefore changes dynamically through expansion and pruning, and each dynamically expanded layer structure comprises one virtual neuron and one SOM subnet.
(2) Throughout the dynamic change of the incremental GHSOM neural network, the network structure always consists of two parts: one mature GHSOM network and several immature GHSOM subnets.
(3) The mature neurons usable for detection contained in the mature GHSOM network are generally stable and have quite good detection effectiveness, and the knowledge they contain makes it convenient to analyze and label the intrusion type corresponding to each mature neuron; in an intrusion detection system, the mature GHSOM network can serve as the main model for intrusion detection.
(4) The immature GHSOM subnets in the dynamic incremental GHSOM neural network structure contain immature, unstable neurons. Some of these neurons may nevertheless be covering neurons and can also be used for detection; but because an immature subnet can be adjusted under certain conditions, detection by these neurons is provisional and it is not convenient to analyze and label their types. In an intrusion detection system, the immature GHSOM subnets can be used as provisional auxiliary intrusion detection models.
(5) The neurons of the mature GHSOM network in the dynamic incremental GHSOM neural network structure are neither deleted nor modified dynamically, but a newly expanded mature SOM subnet can be joined to the mature GHSOM subnet. The dynamic incremental GHSOM neural network structure can therefore achieve gradual incremental expansion: the mature network structure, which can be expanded dynamically, keeps adding new knowledge while keeping the existing knowledge, and the intrusion detection capability is continuously and dynamically enhanced through incremental learning.
(6) An immature subnet is pruned when it reaches a certain scale, its existing training vectors are collected and learned again, and a more concise SOM subnet is expanded. This mechanism keeps the scale of the incremental GHSOM network controllable, and as learning continues the incremental GHSOM model gradually approaches maturity.
The intrusion detection process
The network intrusion detection process based on the incremental GHSOM neural network is shown in Fig. 6. During online detection, the data acquisition and preprocessing module first collects network traffic data from the experimental network; the feature extraction module then performs feature extraction; finally the detection data are input to the intrusion detection and dynamic incremental learning module. The core component of the intrusion detection and dynamic incremental learning module is the dynamic incremental GHSOM neural network model. If the intrusion detection and dynamic incremental learning module identifies the current network behavior as an attack, the intrusion analysis and processing module is needed to analyze and process it further.
The modules involved are as follows:
● Data acquisition and preprocessing module
The data acquisition part of this module uses the IDS Informer tool to simulate and generate network attack traffic while user terminals generate normal traffic of various applications, so that a labeled feature-vector data set is collected.
For the simulated attacks, the 39-dimensional features shown in Table 1 are extracted for each connection. The feature vectors are then normalized: the principle of normalization is to find the maximum value of each feature dimension and divide the corresponding value of every feature vector by that maximum; if the maximum is 0, the corresponding values of the feature vectors are left unchanged. After the whole feature vector set has been normalized, it can be used as the detection data of the intrusion detection and dynamic incremental learning module.
Table 1. The 39 extracted feature dimensions
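A minimal sketch of the normalization step described above for a matrix of feature vectors (the function name is an assumption of this sketch):

```python
import numpy as np

def normalize_features(feature_matrix):
    """Divide every feature by the per-dimension maximum; dimensions whose
    maximum is 0 are left unchanged, as described above."""
    feature_matrix = np.asarray(feature_matrix, dtype=float)
    col_max = feature_matrix.max(axis=0)
    scale = np.where(col_max == 0, 1.0, col_max)   # avoid dividing by zero
    return feature_matrix / scale
```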
● Intrusion detection and dynamic incremental learning module
The intrusion detection part of this module is the core of the online intrusion detection system; based on the incremental GHSOM neural network intrusion detection model studied in this work, it detects network intrusion behavior online. This module depends on the dynamic incremental intrusion detection model, and this model directly affects the detection effectiveness and performance.
The dynamic incremental learning part of this module takes the detection pattern vectors acquired and extracted online as input; the newly added samples are trained with the incremental GHSOM algorithm until they are mapped onto the expanded neurons of the incremental GHSOM neural network intrusion detection model, thereby updating the model dynamically.
Because this online intrusion detection system has the function of detecting while dynamically updating, updating the intrusion detection model should not affect the normal operation and use of the intrusion detection function; while the intrusion detection model is being updated dynamically, the intrusion detection and dynamic incremental learning module therefore keeps performing online detection.
● Intrusion analysis and processing module
The role of the intrusion analysis and processing module is to analyze whether the detection result is an intrusion behavior. The attack type library is first traversed, and pattern matching is then performed to see whether a matching attack type exists; if one is found, the current behavior is an intrusion behavior; if none is found, another mechanism is needed to judge whether the current behavior is of a normal type or a newly emerging attack type. If the current behavior is identified as a newly emerging attack type, this attack type is added to the attack type library. In this module, as long as the current behavior is identified as an intrusion behavior, it needs to be handled further by the next module.
● Alarm processing module
The function of the alarm processing module is to notify the network administrator of the current intrusion behavior through the human-computer interaction interface. Whenever this module receives an intrusion behavior, it first obtains the source IP and port number of the attacking end, the IP and port number of the attacked end, the time the intrusion behavior occurred and the attack type, then displays this information in the alarm information column of the human-computer interaction interface and simultaneously sends an alarm signal to the network administrator.
● I/O processing engine
The function of the I/O processing engine module is to realize communication between the intrusion detection system and the human-computer interaction interface, which requires defining a set of interaction protocol mechanisms. This module can receive control commands sent by the user through the human-computer interaction interface and can also send alarm information to the human-computer interaction interface for display. The human-computer interaction interface mainly comprises parameter configuration, operation controls, a dynamic incremental GHSOM model topology display and an alarm information display column. Parameter configuration includes network configuration and algorithm parameter configuration; the operation controls include starting, pausing and ending the detection process and labeling the attack types in the attack type library; the dynamic incremental GHSOM model topology display dynamically shows the incremental GHSOM neural network topology generated by the intrusion detection system; the information shown in the alarm information display column includes the attacking end IP and port number, the attacked end IP and port number, the attack occurrence time and the attack type, etc.

Claims (11)

1. An intrusion detection method based on an incremental GHSOM neural network, comprising the steps of:
1) acquiring network data online from the network, generating detection pattern vectors that the neural network can recognize, and inputting them to an intrusion detection and dynamic incremental learning module, wherein said intrusion detection and dynamic incremental learning module comprises an offline-trained GHSOM neural network model;
2) assigning, by said intrusion detection and dynamic incremental learning module, the current detection pattern vector to x, and computing and finding, in a top-down manner according to the layer-expansion relations of the GHSOM neural network model, the winning neuron t that can be used to detect vector x;
3) if the winning neuron t is a covering neuron and vector x is similar to the winning neuron t, using this neuron t to detect and output the detection result of this detection vector x; otherwise labeling vector x with an unknown attack type tag and outputting the detection result; if the winning neuron t is a non-covering neuron and the incremental training set is empty, adding both the mapped vector set of the winning neuron t and vector x to the incremental training set, otherwise adding only vector x to the incremental training set of the winning neuron t;
4) when the winning leaf neuron t satisfies a set expansion condition, expanding a virtual neuron t' below the winning leaf neuron t and then expanding a new 2 × 2 SOM from the virtual neuron t', and training it with the incremental training set I_t corresponding to the winning leaf neuron t; said virtual neuron does not have the function of detecting attack types, its weight does not change and equals the mean of the vectors in the incremental training set of its parent neuron, and its mapped vector set equals the incremental training set of the parent neuron;
5) finding the mature parent neuron of the newly expanded SOM subnet, and judging whether this mature neuron satisfies the condition for deleting immature subnets; if the condition for deleting immature subnets is satisfied, deleting the immature neural network part dynamically expanded from this mature neuron and training it again;
6) judging, by an intrusion analysis and processing module, whether an intrusion is currently taking place based on the detection result output by the intrusion detection module;
wherein said covering neuron is: if the Euclidean distances from the vectors mapped onto a neuron to the neuron's weight vector are all smaller than some expected value, this neuron is called a covering neuron; a winning vector that falls on a neuron during the offline training of the GHSOM neural network model is called a mapped vector; if all neurons in the subnet where a neuron is located and all neurons of its upper-layer networks are covering neurons, this neuron is a mature neuron, and the network composed of said mature neurons is said mature neural network; a network containing immature neurons is an immature subnet.
2. The method of claim 1, wherein the method for judging whether vector x is of the same type as the winning neuron t is:
computing the Euclidean distance dis(x, t) between vector x and the winning neuron t and the similarity threshold S_t of the winning neuron t; if dis(x, t) > S_t, judging that the input vector x and the winning neuron t are of different types, otherwise that they are of the same type; wherein the similarity threshold is computed as:
S_t = max_{i ∈ M_t} ‖i - w_t‖
where M_t is the mapped vector set of the winning neuron, i is any vector in the set M_t, and w_t is the weight vector of the winning neuron t.
3. The method of claim 1, wherein said expansion condition is that the number of vectors in the incremental training set reaches a multiple of a set expansion parameter Ex.
4. The method of claim 3, wherein the method for training the newly expanded SOM subnet is: using the incremental training set I_t corresponding to the winning leaf neuron t as the mapped vector set of the virtual neuron t', using the mean of the vectors in the set I_t as the initial weight of the virtual neuron t', and training the newly expanded SOM subnet with the virtual neuron t' as the parent neuron.
5. The method of claim 1 or 4, wherein the method for training the newly expanded SOM subnet is: 1) selecting an input vector at random from the set I_t, computing the Euclidean distance between this input vector and each neuron in the newly expanded SOM, the neuron with the smallest distance being the winning neuron, and adjusting the weights of the winning neuron and of the neurons in its neighborhood; 2) after the predetermined number of learning iterations is reached, computing the mean quantization error MQE_m of all neurons in this SOM and the quantization error qe_t' of the parent virtual neuron t'; if MQE_m ≥ τ1 × qe_t', inserting a new row or column of neurons between the neuron with the largest quantization error value and its most distant neighbor, their initial weight vectors being the mean of the weight vectors of the adjacent map units, and returning to step 1); otherwise every neuron in the SOM layer has become stable, and the training process of the newly expanded SOM ends; wherein τ1 is the intra-layer expansion control coefficient.
6. The method of claim 1, wherein the method for determining the winning neuron t is: computing the Euclidean distance between x and each neuron in the SOM, the neuron with the smallest distance in this subnet being the winning neuron of vector x.
7. The method of claim 1, wherein said condition for deleting an immature subnet is: 1) the total number of layers of the immature subnet is greater than α1; 2) the total number of immature neurons is greater than α2; wherein α1 is the layer control parameter of the immature subnet and α2 is the neuron-count control parameter of the immature subnet.
8. The method of claim 7, wherein the method for deleting and retraining the immature neural network part dynamically expanded from a mature neuron is: first deleting the immature subnets under the mature neuron, and collecting the mapped vectors of all covering neuron nodes and of the non-covering leaf neuron nodes in the deleted immature subnets as the incremental training set of this mature neuron; then dynamically expanding a virtual neuron and a new layer of SOM subnet from this mature neuron, and training this SOM subnet with this incremental training set.
9. The method of claim 8, wherein α1 = 3 and α2 = 15.
10. The method of claim 1, wherein the formula for computing said winning neuron t is:
t = arg min_k ‖x - w_k‖, k = 1, 2, ..., m;
where x denotes any detection vector, t denotes the winning neuron of detection vector x, m denotes the number of neurons in the current SOM subnet, and w_k denotes the weight vector of neuron k.
11. The method of claim 1, wherein said neuron t comprises the neurons in said offline-trained GHSOM neural network model, the leaf neurons of the network after expansion, and the covering neurons in the non-leaf layers.
CN201210206778.9A 2012-06-18 2012-06-18 Intrusion detection method based on incremental GHSOM (Growing Hierarchical Self-organizing Maps) neural network Expired - Fee Related CN102789593B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210206778.9A CN102789593B (en) 2012-06-18 2012-06-18 Intrusion detection method based on incremental GHSOM (Growing Hierarchical Self-organizing Maps) neural network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201210206778.9A CN102789593B (en) 2012-06-18 2012-06-18 Intrusion detection method based on incremental GHSOM (Growing Hierarchical Self-organizing Maps) neural network

Publications (2)

Publication Number Publication Date
CN102789593A true CN102789593A (en) 2012-11-21
CN102789593B CN102789593B (en) 2014-11-26

Family

ID=47154994

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210206778.9A Expired - Fee Related CN102789593B (en) 2012-06-18 2012-06-18 Intrusion detection method based on incremental GHSOM (Growing Hierarchical Self-organizing Maps) neural network

Country Status (1)

Country Link
CN (1) CN102789593B (en)

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104702460A (en) * 2013-12-10 2015-06-10 中国科学院沈阳自动化研究所 Method for detecting anomaly of Modbus TCP (transmission control protocol) communication on basis of SVM (support vector machine)
CN106357458A (en) * 2016-10-31 2017-01-25 中国联合网络通信集团有限公司 Network element anomaly detection method and device
CN106384005A (en) * 2016-09-28 2017-02-08 湖南老码信息科技有限责任公司 Incremental neural network model-based depression prediction method and prediction system
CN106407695A (en) * 2016-09-28 2017-02-15 湖南老码信息科技有限责任公司 Anxiety disorder prediction method and prediction system based on incremental neural network model
CN106407693A (en) * 2016-09-28 2017-02-15 湖南老码信息科技有限责任公司 Hepatitis B prediction method and prediction system based on incremental neural network model
CN106407694A (en) * 2016-09-28 2017-02-15 湖南老码信息科技有限责任公司 Neurasthenia prediction method and prediction system based on incremental neural network model
CN106407697A (en) * 2016-09-28 2017-02-15 湖南老码信息科技有限责任公司 Chronic fatigue syndrome prediction method and prediction system based on incremental neural network model
CN106446551A (en) * 2016-09-28 2017-02-22 湖南老码信息科技有限责任公司 Incremental neural network model based chronic gastroenteritis prediction method and system
CN106446550A (en) * 2016-09-28 2017-02-22 湖南老码信息科技有限责任公司 Cold prediction method and system based on incremental neural network model
CN106534224A (en) * 2017-01-23 2017-03-22 余洋 Intelligent network attack detection method and device
CN107066881A (en) * 2016-12-14 2017-08-18 四川长虹电器股份有限公司 Intrusion detection method based on Kohonen neural networks
CN107154950A (en) * 2017-07-24 2017-09-12 深信服科技股份有限公司 A kind of method and system of log stream abnormality detection
CN107203807A (en) * 2016-03-16 2017-09-26 中国科学院计算技术研究所 The computational methods of neural network, system and its apparatus
CN107305636A (en) * 2016-04-22 2017-10-31 株式会社日立制作所 Target identification method, Target Identification Unit, terminal device and target identification system
CN108347430A (en) * 2018-01-05 2018-07-31 国网山东省电力公司济宁供电公司 Network invasion monitoring based on deep learning and vulnerability scanning method and device
CN108427967A (en) * 2018-03-13 2018-08-21 范大昭 A kind of real-time imaging clustering method
CN110070060A (en) * 2019-04-26 2019-07-30 天津开发区精诺瀚海数据科技有限公司 A kind of method for diagnosing faults of bearing apparatus
CN110154024A (en) * 2019-05-22 2019-08-23 清华大学 A kind of assembly control method based on shot and long term Memory Neural Networks incremental model
CN110324337A (en) * 2019-07-02 2019-10-11 成都信息工程大学 A kind of in-vehicle network intrusion detection method and system based on capsule neural network
CN110689359A (en) * 2019-09-30 2020-01-14 支付宝(杭州)信息技术有限公司 Method and device for dynamically updating model
CN110807230A (en) * 2019-10-29 2020-02-18 天津大学 Method for optimizing robustness of topology structure of Internet of things through autonomous learning
CN111914082A (en) * 2019-05-08 2020-11-10 天津科技大学 Online knowledge aggregation method based on SOM neural network algorithm
CN112115967A (en) * 2020-08-06 2020-12-22 中山大学 Image increment learning method based on data protection
CN113746853A (en) * 2021-09-08 2021-12-03 程楠楠 Network management method and system based on machine learning
CN113807254A (en) * 2021-09-17 2021-12-17 中国人民解放军国防科技大学 Intelligent clustering method based on hierarchical self-organizing mapping digital signal modulation mode
CN115242556A (en) * 2022-09-22 2022-10-25 中国人民解放军战略支援部队航天工程大学 Network anomaly detection method based on incremental self-encoder
CN116738354A (en) * 2023-08-15 2023-09-12 国网江西省电力有限公司信息通信分公司 Method and system for detecting abnormal behavior of electric power Internet of things terminal

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU172615U1 (en) * 2017-03-13 2017-07-14 Ярослав Викторович Тарасов Denial of Service Low Intensity Attack Detection Device

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070300301A1 (en) * 2004-11-26 2007-12-27 Gianluca Cangini Intrusion Detection Method and System, Related Network and Computer Program Product Therefor
CN101901317A (en) * 2010-07-09 2010-12-01 北京大学 Growing hierarchical self-organizing maps (GHSOM)-based intrusion detection method for neural network

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Dennis Ippoliti et al., "An Adaptive Growing Hierarchical Self Organizing Map for Network Intrusion Detection", 2010 Proceedings of 19th International Conference on Computer Communications and Networks (ICCCN) *
Yang Yahui et al., "Intrusion Detection Research Based on an Improved GHSOM" (基于改进的GHSOM的入侵检测研究), Journal on Communications (通信学报) *
Tan Yuqin et al., "Research on Intrusion Detection Based on an Improved SOM" (基于改进的SOM入侵检测研究), Journal of Information Engineering University (信息工程大学学报) *

Cited By (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104702460A (en) * 2013-12-10 2015-06-10 中国科学院沈阳自动化研究所 Method for detecting anomaly of Modbus TCP (transmission control protocol) communication on basis of SVM (support vector machine)
CN107203807A (en) * 2016-03-16 2017-09-26 中国科学院计算技术研究所 Neural network computation method, system and device
CN107203807B (en) * 2016-03-16 2020-10-02 中国科学院计算技术研究所 On-chip cache bandwidth balancing method, system and device of neural network accelerator
CN107305636A (en) * 2016-04-22 2017-10-31 株式会社日立制作所 Target identification method, Target Identification Unit, terminal device and target identification system
CN106407695A (en) * 2016-09-28 2017-02-15 湖南老码信息科技有限责任公司 Anxiety disorder prediction method and prediction system based on incremental neural network model
CN106407694A (en) * 2016-09-28 2017-02-15 湖南老码信息科技有限责任公司 Neurasthenia prediction method and prediction system based on incremental neural network model
CN106407697A (en) * 2016-09-28 2017-02-15 湖南老码信息科技有限责任公司 Chronic fatigue syndrome prediction method and prediction system based on incremental neural network model
CN106446551A (en) * 2016-09-28 2017-02-22 湖南老码信息科技有限责任公司 Incremental neural network model-based chronic gastroenteritis prediction method and system
CN106446550A (en) * 2016-09-28 2017-02-22 湖南老码信息科技有限责任公司 Cold prediction method and system based on incremental neural network model
CN106407693A (en) * 2016-09-28 2017-02-15 湖南老码信息科技有限责任公司 Hepatitis B prediction method and prediction system based on incremental neural network model
CN106384005A (en) * 2016-09-28 2017-02-08 湖南老码信息科技有限责任公司 Incremental neural network model-based depression prediction method and prediction system
CN106357458B (en) * 2016-10-31 2019-08-06 中国联合网络通信集团有限公司 Network element anomaly detection method and device
CN106357458A (en) * 2016-10-31 2017-01-25 中国联合网络通信集团有限公司 Network element anomaly detection method and device
CN107066881A (en) * 2016-12-14 2017-08-18 四川长虹电器股份有限公司 Intrusion detection method based on Kohonen neural networks
CN106534224A (en) * 2017-01-23 2017-03-22 余洋 Intelligent network attack detection method and device
CN106534224B (en) * 2017-01-23 2018-04-20 余洋 Intelligent network attack detection method and device
CN107154950A (en) * 2017-07-24 2017-09-12 深信服科技股份有限公司 Method and system for log stream anomaly detection
CN107154950B (en) * 2017-07-24 2021-05-04 深信服科技股份有限公司 Method and system for detecting log stream anomalies
CN108347430B (en) * 2018-01-05 2021-01-12 国网山东省电力公司济宁供电公司 Network intrusion detection and vulnerability scanning method and device based on deep learning
CN108347430A (en) * 2018-01-05 2018-07-31 国网山东省电力公司济宁供电公司 Network intrusion detection and vulnerability scanning method and device based on deep learning
CN108427967A (en) * 2018-03-13 2018-08-21 范大昭 Real-time image clustering method
CN108427967B (en) * 2018-03-13 2021-08-27 中国人民解放军战略支援部队信息工程大学 Real-time image clustering method
CN110070060A (en) * 2019-04-26 2019-07-30 天津开发区精诺瀚海数据科技有限公司 Fault diagnosis method for bearing equipment
CN111914082A (en) * 2019-05-08 2020-11-10 天津科技大学 Online knowledge aggregation method based on SOM neural network algorithm
CN110154024A (en) * 2019-05-22 2019-08-23 清华大学 Assembly control method based on a long short-term memory neural network incremental model
CN110324337A (en) * 2019-07-02 2019-10-11 成都信息工程大学 In-vehicle network intrusion detection method and system based on a capsule neural network
CN110689359A (en) * 2019-09-30 2020-01-14 支付宝(杭州)信息技术有限公司 Method and device for dynamically updating model
CN110807230B (en) * 2019-10-29 2024-03-12 天津大学 Method for autonomously learning and optimizing topological structure robustness of Internet of things
CN110807230A (en) * 2019-10-29 2020-02-18 天津大学 Method for optimizing robustness of topology structure of Internet of things through autonomous learning
CN112115967A (en) * 2020-08-06 2020-12-22 中山大学 Image increment learning method based on data protection
CN112115967B (en) * 2020-08-06 2023-08-01 中山大学 Image increment learning method based on data protection
CN113746853A (en) * 2021-09-08 2021-12-03 程楠楠 Network management method and system based on machine learning
CN113807254A (en) * 2021-09-17 2021-12-17 中国人民解放军国防科技大学 Intelligent clustering method for digital signal modulation modes based on hierarchical self-organizing maps
CN115242556A (en) * 2022-09-22 2022-10-25 中国人民解放军战略支援部队航天工程大学 Network anomaly detection method based on incremental self-encoder
CN115242556B (en) * 2022-09-22 2022-12-20 中国人民解放军战略支援部队航天工程大学 Network anomaly detection method based on incremental self-encoder
CN116738354A (en) * 2023-08-15 2023-09-12 国网江西省电力有限公司信息通信分公司 Method and system for detecting abnormal behavior of electric power Internet of things terminal
CN116738354B (en) * 2023-08-15 2023-12-08 国网江西省电力有限公司信息通信分公司 Method and system for detecting abnormal behavior of electric power Internet of things terminal

Also Published As

Publication number Publication date
CN102789593B (en) 2014-11-26

Similar Documents

Publication Publication Date Title
CN102789593B (en) Intrusion detection method based on incremental GHSOM (Growing Hierarchical Self-organizing Maps) neural network
CN102647292B (en) Intrusion detecting method based on semi-supervised neural network
Ren et al. Building an effective intrusion detection system by using hybrid data optimization based on machine learning algorithms
Hu et al. Online adaboost-based parameterized methods for dynamic distributed network intrusion detection
CN106604267B (en) Dynamic self-adaptive intelligent intrusion detection method for wireless sensor networks
CN108520272B (en) Semi-supervised intrusion detection method for improving Cantonese algorithm
CN107846392A (en) Intrusion detection algorithm based on improved co-training ADBN
CN111353153B (en) GEP-CNN-based power grid malicious data injection detection method
CN111598179B (en) Power monitoring system user abnormal behavior analysis method, storage medium and equipment
CN104935600A (en) Mobile ad hoc network intrusion detection method and device based on deep learning
CN112333194B (en) GRU-CNN-based comprehensive energy network security attack detection method
CN110290120B (en) Time sequence evolution network security early warning method of cloud platform
CN101242278A (en) Online recognition method for network multi-step attack intention
Li et al. A new method of identification of complex lithologies and reservoirs: task-driven data mining
CN107465664A (en) Intrusion detection method based on parallel multiple artificial bee colony algorithms and support vector machines
Saraswati et al. High-resolution Self-Organizing Maps for advanced visualization and dimension reduction
CN109813542A (en) Fault diagnosis method for air handling units based on generative adversarial networks
Zhang et al. Applying big data analytics into network security: Challenges, techniques and outlooks
CN108446562A (en) Intrusion detection method based on tabu search and artificial bee colony bidirectional optimization of support vector machines
Chang et al. Internet of things security detection technology based on grey association decision algorithm
Wei et al. Calibrating network traffic with one-dimensional convolutional neural network with autoencoder and independent recurrent neural network for mobile malware detection
CN111027697A (en) Power grid intrusion detection method based on genetic algorithm wrapper feature selection
CN101901317A (en) Growing hierarchical self-organizing maps (GHSOM)-based intrusion detection method for neural network
CN117240523A (en) Network spoofing account detection method based on structure information principle
Gao et al. The prediction role of hidden markov model in intrusion detection

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
Granted publication date: 20141126
Termination date: 20170618