CN106951783A - Masquerade intrusion detection method and device based on a deep neural network - Google Patents

Masquerade intrusion detection method and device based on a deep neural network

Info

Publication number
CN106951783A
CN106951783A (application number CN201710208301.7A)
Authority
CN
China
Prior art keywords
behavior
user
sequence
data
behavioral data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710208301.7A
Other languages
Chinese (zh)
Other versions
CN106951783B (en)
Inventor
刘俊恺
夏飞
王毅
张立强
余伟
吴立斌
张明明
李鹏
季晓凯
蒋铮
王艳青
彭轼
魏桂臣
丁新
丁一新
张利
李萌
黄高攀
汤雷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
State Grid Corp of China SGCC
Information and Telecommunication Branch of State Grid Jiangsu Electric Power Co Ltd
Original Assignee
State Grid Corp of China SGCC
Information and Telecommunication Branch of State Grid Jiangsu Electric Power Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by State Grid Corp of China SGCC, Information and Telecommunication Branch of State Grid Jiangsu Electric Power Co Ltd filed Critical State Grid Corp of China SGCC
Priority to CN201710208301.7A priority Critical patent/CN106951783B/en
Publication of CN106951783A publication Critical patent/CN106951783A/en
Application granted granted Critical
Publication of CN106951783B publication Critical patent/CN106951783B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/50 - Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F 21/55 - Detecting local intrusion or implementing counter-measures
    • G06F 21/56 - Computer malware detection or handling, e.g. anti-virus arrangements
    • G06F 21/566 - Dynamic detection, i.e. detection performed at run-time, e.g. emulation, suspicious activities
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/04 - Architecture, e.g. interconnection topology
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/04 - Architecture, e.g. interconnection topology
    • G06N 3/049 - Temporal neural networks, e.g. delay elements, oscillating neurons or pulsed inputs
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 63/00 - Network architectures or network communication protocols for network security
    • H04L 63/14 - Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic
    • H04L 63/1408 - Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic by monitoring network traffic
    • H04L 63/1416 - Event detection, e.g. attack signature detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Security & Cryptography (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Computational Linguistics (AREA)
  • Mathematical Physics (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Virology (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

This application discloses a masquerade intrusion detection method based on a deep neural network, comprising: obtaining behavior stream data of at least two users; initializing a hyper-parameter set and a parameter set; for each user, collecting positive samples and negative samples to form a training data set; calculating the value of each parameter in turn using the training data set, a loss function and an optimization algorithm; querying the user's behavior-embedding lookup table for the behavior data corresponding to the user's behavior sequence to be detected and adding it to a behavior-embedding representation sequence; performing a convolution operation, a pooling operation and the operations of a long short-term memory (LSTM) artificial neural network on the behavior-embedding representation sequence to obtain a second behavior sequence; calculating the probability that the second behavior sequence is a normal behavior sequence; and determining, according to the magnitude of that probability, whether the user behavior is a masquerade intrusion. The method can simultaneously take into account the local strong correlation, the long-range dependence and the temporal ordering of behaviors, improving the accuracy of masquerade intrusion detection.

Description

Masquerade intrusion detection method and device based on a deep neural network
Technical field
The present application relates to the field of information and data processing technology, and more specifically to a method and a device for detecting masquerade intrusions.
Background art
Masquerade intrusion refers to an intrusion in which an unauthorized user enters a system by disguising himself as a legitimate user in order to access or modify critical data or perform other illegal operations. Such intrusions have become one of the most serious threats to the security of computer and network infrastructure.
Current masquerade intrusion detection mostly uses anomaly detection methods, specifically masquerade intrusion detection based on correlation information of user behaviors. Such methods use n-gram models to consider the correlation information between adjacent behaviors, and at the same time consider the two classes of correlation information given by adjacent and non-adjacent behaviors, thereby capturing the local strong correlation of the behavior sequence as well as the long-range dependence between behaviors.
However, the above anomaly detection methods cannot accurately capture the temporal ordering of the whole behavior sequence, so the accuracy of intrusion detection is relatively low.
Summary of the invention
In view of this, the present application provides a masquerade intrusion detection method and device based on a deep neural network, so as to improve the accuracy of masquerade intrusion detection.
To achieve this goal, the following scheme is proposed:
A masquerade intrusion detection method based on a deep neural network, the method comprising:
obtaining behavior stream data of at least two users within a preset time period;
initializing a hyper-parameter set and a parameter set, the hyper-parameter set comprising: the behavior capacity N_c of the behavior-embedding lookup table, the behavior-embedding dimension d_c, the convolution kernel size l_c, the dimension d_hc output after convolution, the pooling factor l_p, and the number of hidden-layer neurons d_hr of the long short-term memory (LSTM) network; the parameter set comprising: the behavior-embedding lookup table V_c of each user, the convolution kernel W_hc, the weights W_hi of the pooling layer and the LSTM network, the connection weights W_hr between the LSTM hidden layer at time t and the hidden layer at time t+1, and the probability output layer weights W_u;
for each user, collecting positive samples from the behavior stream data of that user and collecting negative samples from the behavior stream data of the other users, the positive samples and the negative samples forming a training data set;
calculating in turn the value of each parameter in the parameter set, using the training data set, a loss function and an optimization algorithm;
querying the user's behavior-embedding lookup table V_c for the behavior data corresponding to the user's behavior sequence to be detected, and adding the behavior data to the user's behavior-embedding representation sequence;
performing the convolution operation of a convolutional neural network on the behavior-embedding representation sequence to obtain an abstract behavior sequence;
performing the pooling operation of the convolutional neural network on the abstract behavior sequence to obtain a first behavior sequence;
performing the operations of the LSTM artificial neural network on the first behavior sequence to obtain a second behavior sequence;
calculating the probability that the second behavior sequence is a normal behavior sequence;
judging whether the probability is greater than a preset probability threshold; if it is greater, determining that the user behavior is normal, and if it is smaller, determining that the user behavior is a masquerade intrusion.
A masquerade intrusion detection device based on a deep neural network, the device comprising:
an acquisition unit, configured to obtain behavior stream data of at least two users within a preset time period;
an initialization unit, configured to initialize a hyper-parameter set and a parameter set, the hyper-parameter set comprising: the behavior capacity N_c of the behavior-embedding lookup table, the behavior-embedding dimension d_c, the convolution kernel size l_c, the dimension d_hc output after convolution, the pooling factor l_p, and the number of hidden-layer neurons d_hr of the LSTM network; the parameter set comprising: the behavior-embedding lookup table V_c of each user, the convolution kernel W_hc, the weights W_hi of the pooling layer and the LSTM network, the connection weights W_hr between the LSTM hidden layer at time t and the hidden layer at time t+1, and the probability output layer weights W_u;
a sample collection unit, configured to, for each user, collect positive samples from the behavior stream data of that user and collect negative samples from the behavior stream data of the other users, the positive samples and the negative samples forming a training data set;
a parameter value calculation unit, configured to calculate in turn the value of each parameter in the parameter set according to the training data set, a loss function and an optimization algorithm;
a query and adding unit, configured to query the user's behavior-embedding lookup table V_c for the behavior data corresponding to the user's behavior sequence to be detected, and add the behavior data to the user's behavior-embedding representation sequence;
a convolution unit, configured to perform the convolution operation of a convolutional neural network on the behavior-embedding representation sequence to obtain an abstract behavior sequence;
a pooling unit, configured to perform the pooling operation of the convolutional neural network on the abstract behavior sequence to obtain a first behavior sequence;
an LSTM artificial neural network operation unit, configured to perform the operations of the LSTM artificial neural network on the first behavior sequence to obtain a second behavior sequence;
a first calculation unit, configured to calculate the probability that the second behavior sequence is a normal behavior sequence;
a determination unit, configured to judge whether the probability is greater than a preset probability threshold; if it is greater, determine that the user behavior is normal, and if it is smaller, determine that the user behavior is a masquerade intrusion.
It can be seen from the above technical solution that behavior stream data of at least two users within a preset time period is first obtained, and the hyper-parameters and parameters related to the deep neural network are then initialized; positive and negative samples are collected for each user, a loss function is configured, and the parameter values are calculated using an optimization algorithm and the loss function. For a behavior sequence to be detected, its corresponding behavior-embedding representation sequence is obtained and then convolved; the convolution captures the local strong correlation of the behavior sequence, and the subsequent pooling filters out the salient features of the behavior sequence. The LSTM operations that follow capture the temporal ordering of the behavior sequence and the long-range dependence between behaviors. Finally, the probability that the sequence obtained through the above processing is a normal behavior sequence of the user is calculated, and it is determined whether the behavior in the user's behavior sequence to be detected is a masquerade intrusion. The technical solution of the present application can therefore take into account the local strong correlation, the long-range dependence and the temporal ordering of behaviors simultaneously, improving the accuracy of masquerade intrusion detection.
Brief description of the drawings
In order to explain the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application; for a person of ordinary skill in the art, other drawings can be obtained from these drawings without creative work.
Fig. 1 is a basic flow chart of a masquerade intrusion detection method based on a deep neural network disclosed in an embodiment of the present application;
Fig. 2 is a schematic structural diagram of the operations of an LSTM block disclosed in another embodiment of the present application;
Fig. 3 is a basic flow chart of the method for obtaining user behavior stream data disclosed in an embodiment of the present application;
Fig. 4 is a basic structural diagram of a masquerade intrusion detection device based on a deep neural network disclosed in an embodiment of the present application.
Detailed description of the embodiments
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings of the embodiments. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without creative work fall within the scope of protection of the present application.
An embodiment of the present invention provides a masquerade intrusion detection method based on a deep neural network. As shown in Fig. 1, the method includes:
S100: obtaining behavior stream data of at least two users within a preset time period;
Here, the behavior stream data of all legitimate users of an enterprise over, for example, several months or one year may be obtained. The data source of the behavior stream data may be records of system shell commands, records of files accessed by users, or records of mouse operations. The embodiments of the present invention are illustrated by taking historical shell-command records as an example, in which case the behavior stream data is essentially shell-command stream data, and the shell-command stream data includes a shell-command set.
S110: initializing a hyper-parameter set and a parameter set, the hyper-parameter set comprising: the behavior capacity N_c of the behavior-embedding lookup table, the behavior-embedding dimension d_c, the convolution kernel size l_c, the dimension d_hc output after convolution, the pooling factor l_p, and the number of hidden-layer neurons d_hr of the LSTM network; the parameter set comprising: the behavior-embedding lookup table V_c of each user, the convolution kernel W_hc, the weights W_hi of the pooling layer and the LSTM network, the connection weights W_hr between the LSTM hidden layer at time t and the hidden layer at time t+1, and the probability output layer weights W_u;
Specifically, the hyper-parameters and parameters involved in the present invention are determined, and each hyper-parameter and parameter is assigned an initial value.
The hyper-parameters in the hyper-parameter set are assigned initial values according to the following empirical formulas: N_c = |C| + 1, d_c = 50, d_hr = d_hc - α, l_c = 3 and l_p = 2, where C denotes the shell-command set, |C| denotes the size of the command set C, and α is an adjustment to d_hr whose value is a constant between 1 and 10.
Here, the data in the behavior-embedding lookup table V_c of each user are represented as vectors.
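For illustration only (this sketch is not part of the patented method), the empirical defaults above could be collected in a small Python helper; the value assigned to d_hc is an assumption, since its empirical formula is not reproduced here:

    # Minimal sketch of the hyper-parameter set described above.
    # d_hc is an assumed value: the empirical formula for d_hc is not given in this description.
    def make_hyperparams(command_set_size, alpha=5):
        d_c = 50                          # behavior-embedding dimension
        d_hc = 2 * d_c                    # dimension output after convolution (assumption)
        return {
            "N_c": command_set_size + 1,  # N_c = |C| + 1
            "d_c": d_c,
            "l_c": 3,                     # convolution kernel size
            "d_hc": d_hc,
            "l_p": 2,                     # pooling factor
            "d_hr": d_hc - alpha,         # LSTM hidden-layer size, alpha a constant in [1, 10]
        }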
S120: for each user, collecting positive samples from the behavior stream data of that user and collecting negative samples from the behavior stream data of the other users, the positive samples and the negative samples forming a training data set;
Here, the positive samples may be a part of the behavior stream data collected from the user's behavior stream data, or the whole of the behavior stream data may be used as positive samples.
S130: calculating in turn the value of each parameter in the parameter set, using the training data set, a loss function and an optimization algorithm;
Specifically, the loss function is set to L(θ), which represents the negative log-likelihood of the deep neural network model on the training data set, where
L(θ) = -log p(D | θ)
In the formula, θ = {V_c, W_hc, W_hi, W_hr, W_u} is the set of model parameters; D is the training data set; p(D | θ) denotes the likelihood of the deep neural network model on the training data set D when the parameters are set to a given value θ, where
p(D | θ) = Π_{s_i ∈ D} P_θ(s_i)^(y_i) · (1 - P_θ(s_i))^(1 - y_i)
In the formula, P_θ(s_i) is the probability that a behavior sequence s_i is a normal behavior sequence; y_i is the label of the behavior sequence s_i, whose value is 1 or 0: y_i = 1 indicates that the behavior sequence s_i is a normal sequence of the user, and y_i = 0 indicates that the sequence s_i is an abnormal sequence for the user.
Specifically, the parameters in the deep neural network model are optimized and updated layer by layer in an iterative manner using the Adam algorithm, until the parameters converge.
Here, the deep neural network model is essentially a nested multi-layer mathematical function, and each layer of sub-function of the nested function is determined by the activation function and the parameters of the corresponding computation layer in the network. When the parameters are optimized starting from the initial values described above, the network model is substituted into the loss function, so that the objective function to be solved is a nested function whose outermost layer is the negative log-likelihood function above. For a given training sample, the sample is substituted into the innermost sub-function of the nested function, which yields a specific objective function whose coefficients are given by the sample, whose independent variables are the parameters of each layer of the network, and whose dependent variable is the loss value. The chain rule is then applied to this specific objective function layer by layer, from the outside in, to take derivatives and update the parameters of each layer. The specific update is as follows: the update amount of each step is computed from the first-order and second-order moment estimates of the gradient of each sub-function, and the parameters of the sub-function are decreased by this update amount. This optimization flow is carried out in every iteration round until the parameters converge. Using this algorithm makes the parameters more stable and gives a faster convergence rate.
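As a non-authoritative illustration of this optimization step, the negative log-likelihood L(θ) can be expressed as a binary cross-entropy over labelled sequences and minimized with Adam; a minimal PyTorch sketch follows (the model object and the data loader are placeholders, not defined in this description):

    import torch

    def train(model, loader, epochs=10, lr=1e-3):
        # L(theta) = -log p(D | theta): binary cross-entropy between the predicted
        # probability P_theta(s_i) and the label y_i (1 = normal, 0 = masquerade).
        criterion = torch.nn.BCELoss()
        optimizer = torch.optim.Adam(model.parameters(), lr=lr)  # Adam updates every layer's parameters
        for _ in range(epochs):
            for sequences, labels in loader:          # sequences: tensor of behavior indices
                probs = model(sequences)              # P_theta(s_i) for each sequence in the batch
                loss = criterion(probs, labels.float())
                optimizer.zero_grad()
                loss.backward()                       # chain-rule differentiation through every layer
                optimizer.step()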
S140: querying the user's behavior-embedding lookup table V_c for the behavior data corresponding to the user's behavior sequence to be detected, and adding the behavior data to the user's behavior-embedding representation sequence;
Here, the user's behavior sequence to be detected may be a behavior sequence intercepted from the user's behavior stream data, or a behavior sequence containing multiple behavior data entered for the user. The user's behavior-embedding lookup table V_c is then searched for the vector-represented behavior data corresponding to each behavior data in that behavior sequence.
Preferably, the process of querying the user's behavior-embedding lookup table V_c for the behavior data corresponding to the user's behavior sequence to be detected includes:
obtaining the behavior data in the behavior sequence to be detected one by one, and querying the user's behavior-embedding lookup table V_c for the behavior data corresponding to each behavior data in the behavior sequence to be detected;
if no corresponding entry is found, using the behavior data located at the first position of the behavior-embedding lookup table V_c as the behavior data corresponding to that behavior data, or alternatively selecting an arbitrary behavior data from the behavior-embedding lookup table V_c.
The behavior-embedding representation sequence is the sequence of embeddings (v_c1, v_c2, ...), where v_ck denotes the embedded representation of the shell command c_k in the sequence to be detected, i.e. the vector-represented command found for c_k in the behavior-embedding lookup table V_c.
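A minimal sketch of this lookup, assuming the table V_c is stored as an N_c x d_c matrix whose row 0 serves as the fallback entry used when a behavior is not found (the index layout is an assumption):

    import numpy as np

    def embed_sequence(commands, vocab, V_c):
        """Map each behavior in the sequence to its embedding row in V_c.

        commands: list of command names in the sequence to be detected
        vocab:    dict mapping a command name to its row index in V_c
        V_c:      np.ndarray of shape (N_c, d_c)
        """
        rows = [vocab.get(cmd, 0) for cmd in commands]  # unseen commands fall back to row 0
        return V_c[rows]                                # the behavior-embedding representation sequence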
S150: performing the convolution operation of a convolutional neural network on the behavior-embedding representation sequence to obtain an abstract behavior sequence;
Taking a user's shell-command stream as an example, a shell-command stream contains many command blocks with very strong local association, for example: "cd ls cat", a block used to quickly inspect a file, and "gdb gcc make", a block used to compile a piece of C++ source code. When the system performs behaviors such as inspecting a file or compiling C++ source code, it must decompose these behaviors into combinations of several sub-behaviors. It can therefore be concluded that user behaviors are realized by combinations of shell commands, which is why many commands in a shell-command stream have strong local associations.
Specifically, a convolution kernel of fixed size is used to perform the convolution operation over the embedding representation sequence so as to capture the strong local associations in it, forming a new sequence H = (h_1, h_2, ...), where h_i denotes the local command block formed by the convolution kernel at the i-th convolution step; this command block can be regarded as an abstract behavior of the user, where:
h_i = f( W_hc ⊙ V_i + b_hc )
In the formula, W_hc is the convolution kernel of the convolutional layer and is shared throughout the convolution operation; b_hc is the bias vector of the convolution kernel W_hc; V_i is the matrix formed by stacking v_i, v_{i+1}, ..., v_{i+l_c}; the operator ⊙ denotes the inner product of corresponding rows of two matrices of the same shape; and f is the rectified linear activation function f(x) = max(0, x).
Compared with other nonlinear activation functions, this activation function has the advantages of simple computation and non-saturation, which alleviates the vanishing and exploding gradient problems in deep network training and gives a faster training speed. At the same time, during training this activation function naturally introduces sparsity, which prevents the network from over-fitting the training data and improves the generalization ability of the network.
S160: performing the pooling operation of the convolutional neural network on the abstract behavior sequence to obtain a first behavior sequence;
Here, the pooling operation filters out the insignificant features in the abstract behavior sequence and retains the salient ones. The first behavior sequence consists of the pooled vectors p_i, where
p_{i,j} = max( h_{(i-1)·l_p+1, j}, ..., h_{i·l_p, j} )
In the formula, h_{i·l_p, j} denotes the j-th value of the (i × l_p)-th feature vector in the behavior sequence H obtained after the convolution operation.
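Steps S150 and S160 can be sketched as follows; the layer shapes are assumptions chosen to match the symbols d_c, l_c, d_hc and l_p above, and PyTorch is used only for illustration:

    import torch

    d_c, l_c, d_hc, l_p = 50, 3, 100, 2                 # assumed hyper-parameter values
    conv = torch.nn.Conv1d(d_c, d_hc, kernel_size=l_c)  # shared kernel W_hc with bias b_hc
    pool = torch.nn.MaxPool1d(l_p)                      # keeps the salient features, drops the rest

    def conv_and_pool(embedded):                  # embedded: (batch, seq_len, d_c)
        x = embedded.transpose(1, 2)              # Conv1d expects (batch, channels, seq_len)
        h = torch.relu(conv(x))                   # f(x) = max(0, x): the abstract behavior sequence
        p = pool(h)                               # the first behavior sequence after pooling
        return p.transpose(1, 2)                  # back to (batch, seq_len', d_hc)

Max pooling is used here as a concrete choice; the description above only requires that insignificant features be filtered out while the salient ones are kept.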
S170: performing the operations of the long short-term memory (LSTM) artificial neural network on the first behavior sequence to obtain a second behavior sequence;
Here, the LSTM artificial neural network is made up of multiple LSTM blocks. As shown in Fig. 2, an LSTM block includes a cell body that preserves state and three gates that restrict and control the data flow. The specific processing is as follows:
1) When processing the input data of the current time t, a forget gate first determines which information to discard from the cell state of the previous time step; here the input data at time t is the behavior data located at the current time step in the first behavior sequence. Specifically:
The forget gate reads the input data of the current time step, the block output of the previous time step and the cell state of the previous time step. After the aggregation it processes the aggregated data with an activation function f(z) and finally outputs, for each component of the cell state, a real value between 0 and 1, where 1 means the information is fully retained and 0 means it is completely discarded; this activation function may be the sigmoid activation function. The formulas used in this process are as follows:
a_φ^t = w_iφ · x^t + w_hφ · b^{t-1} + w_kφ · k^{t-1}
b_φ^t = f(a_φ^t)
In the formulas, a_φ^t is the data obtained by the forget gate after the aggregation when processing the input at time t; x^t is the input data at time t; b^{t-1} is the block output at time t-1; k^{t-1} is the cell state at time t-1; w_iφ is the weight of the forget gate for the input data; w_hφ is the weight of the forget gate for the block output of the previous time step; w_kφ is the weight of the forget gate for the cell state of the previous time step; b_φ^t is the output of the forget gate; and f(z) is the sigmoid activation function, f(z) = 1 / (1 + e^(-z)).
2) Next, an input gate determines the cell state that needs to be updated. The input gate reads the input data of the current time step, the block output of the previous time step and the cell state of the previous time step; after the aggregation, the result is activated with an activation function to determine the cell state that needs to be updated. The formulas used in this process are as follows:
a_ι^t = w_iι · x^t + w_hι · b^{t-1} + w_kι · k^{t-1}
b_ι^t = f(a_ι^t)
In the formulas, a_ι^t is the data obtained by the input gate after the aggregation when processing the input at time t; w_iι is the weight of the input gate for the input data; w_hι is the weight of the input gate for the block output of the previous time step; w_kι is the weight of the input gate for the cell state of the previous time step; f(z) is the sigmoid activation function shown in Fig. 2; and b_ι^t is the output of the input gate.
3) Before the cell state is updated, the block computes a candidate cell state. When the cell state is updated, the candidate state is added into the cell state under the control of the input gate, while at the same time the cell, under the control of the forget gate, selectively forgets part of the old cell state. The specific process of updating the cell state is as follows:
a_k^t = w_ik · x^t + w_hk · b^{t-1}
k^t = b_φ^t ⊗ k^{t-1} + b_ι^t ⊗ g(a_k^t)
In the formulas, a_k^t is the aggregated cell input obtained when processing the input at time t (this aggregation is shown at the bottom of Fig. 2); w_ik is the weight of the cell for the input data; w_hk is the weight of the cell for the block output of the previous time step; k^t is the cell state at time t; g(z) is the tanh activation function, g(z) = (e^z - e^(-z)) / (e^z + e^(-z)); and the two ⊗ operations in the formula for k^t are the pointwise gating operations shown in Fig. 2, applied to k^{t-1} after the forget gate and to g(a_k^t) after the input gate, respectively.
4) The information to be output is controlled by an output gate. The output gate first reads the input data of the current time step, the block output of the previous time step and the cell state of the current time step; after the aggregation, the aggregated data is activated with the sigmoid activation function shown in Fig. 2, giving the output control vector. Each component of this control vector is a real value between 0 and 1, where 1 means the output is let through and 0 means the output is blocked. The output gate processes the data with the following formulas:
a_ω^t = w_iω · x^t + w_hω · b^{t-1} + w_kω · k^t
b_ω^t = f(a_ω^t)
In the formulas, a_ω^t is the data obtained by the output gate after the aggregation when the block processes the input at time t; w_iω is the weight of the output gate for the input data; w_hω is the weight of the output gate for the block output of the previous time step; w_kω is the weight of the output gate for the cell state; and b_ω^t is the output of the output gate.
5) The cell state at the current time step is activated by an activation function h(z) and, under the control of the output gate, gives the output of the block at the current time t:
b^t = b_ω^t ⊗ h(k^t)
where h(z) is the tanh activation function, h(z) = (e^z - e^(-z)) / (e^z + e^(-z)).
Finally, the outputs of all LSTM blocks at times 1 to T are averaged to obtain r_s, where
r_s = (1 / T) Σ_{t=1..T} b^t
In the formula, T is the standard maximum length of the first behavior sequence produced by pooling; if the length of the first behavior sequence currently obtained after pooling does not reach this standard maximum length, the sequence is padded with zero vectors so that its length equals the standard maximum length.
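As a simplified sketch of step S170, using PyTorch's built-in LSTM cell (which, unlike the gate equations above, does not read the cell state in its gates), the representation r_s can be obtained by running the LSTM over the pooled sequence and averaging its outputs over the T time steps:

    import torch

    d_hc, d_hr = 100, 95                          # assumed dimensions
    lstm = torch.nn.LSTM(input_size=d_hc, hidden_size=d_hr, batch_first=True)

    def lstm_summary(first_behavior_seq):         # (batch, T, d_hc), zero-padded to length T
        outputs, _ = lstm(first_behavior_seq)     # block outputs b^1 .. b^T
        return outputs.mean(dim=1)                # r_s: average of all block outputs over times 1..T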
S180: calculating the probability that the second behavior sequence is a normal behavior sequence;
Here, the probability p_θ(s) that the second behavior sequence is a normal behavior sequence is calculated as
p_θ(s) = 1 / ( 1 + exp( -Σ_{i=1..m} w_iu · r_is ) )
In the formula, s is the behavior sequence to be detected, i.e. the shell-command sequence to be detected; p_θ(s) is the probability that the shell-command sequence is a normal shell-command sequence of the user, and this probability function is the logistic regression function; θ = {V_c, W_hc, W_hi, W_hr, W_u}; w_iu is the i-th component of W_u; r_s ∈ R^(m×1) is the low-dimensional representation of the sequence s to be detected; and r_is is the i-th component of r_s.
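Step S180 is a logistic regression over r_s; a minimal sketch (W_u and r_s are plain 1-D arrays here):

    import numpy as np

    def normal_probability(r_s, W_u):
        """p_theta(s) = sigmoid(sum_i w_iu * r_is): probability that the sequence is normal."""
        return 1.0 / (1.0 + np.exp(-np.dot(W_u, r_s)))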
S190: judging whether the probability is greater than a preset probability threshold; if it is greater, determining that the user behavior is normal, and if it is smaller, determining that the user behavior is a masquerade intrusion.
In the above embodiment, behavior stream data of at least two users within a preset time period is first obtained, and the hyper-parameters and parameters related to the deep neural network are initialized; positive and negative samples are collected for each user, a loss function is configured, and the parameter values are then calculated using the optimization algorithm and the loss function. For a behavior sequence to be detected, its behavior-embedding representation sequence is obtained and then convolved; the convolution captures the local strong correlation of the behavior sequence, and the subsequent pooling filters out the salient features of the behavior sequence. The LSTM operations that follow capture the temporal ordering of the behavior sequence and the long-range dependence between behaviors. Finally, the probability that the sequence obtained through the above processing is a normal behavior sequence of the user is calculated, and it is determined whether the behavior in the user's behavior sequence to be detected is a masquerade intrusion. The technical solution of the present application therefore takes into account the local strong correlation, the long-range dependence and the temporal ordering of behaviors simultaneously, improving the accuracy of masquerade intrusion detection.
In another embodiment of the present invention, the process of obtaining the behavior data of at least two users within a preset time period, as shown in Fig. 3, includes:
S300: obtaining behavior record data of at least two users within a preset time period, the record format of the behavior record data being: user identification number, behavior description, date;
Specifically, the original shell-command record data can be obtained through the shell history mechanism; the format of the data is user identification number, date and detailed shell-command description, which together form a set of triple objects;
S310: for each user, sorting the behavior record data of that user by time to form the behavior data object stream of that user;
Here, each user identification number is treated as an independent group, and the triple objects with the same user identification number are gathered into the same group and insertion-sorted by time as they accumulate, thereby forming the triple object data stream of that user, i.e. the behavior data object stream.
S320: intercepting data from the user's behavior data object stream according to a preset sliding time window to obtain the intercepted behavior data object stream;
Specifically, a sliding time window is applied to each group's triple object data stream, with the window size set to τ_t and the sliding step set to π_t. If a time window contains no triple object, a (user identification number, None, date) triple object is created to fill it. After the sliding-window interception, a triple object stream containing multiple short behavior sequences is formed; these behavior sequences constitute the intercepted behavior data object stream.
S330: performing behavior-trunk extraction and date filtering on the intercepted data object stream to obtain the behavior stream data of the user.
Specifically, date filtering is first performed on the stream, and then behavior-trunk extraction is performed, i.e. the shell-command names are extracted. For example, for the following detailed shell commands:
> cd ./hello
> xguake & fg
> vim hello.txt
> ls -a
after behavior-trunk extraction, they become the symbol stream (cd, xguake, fg, vim, ls). Finally, two-tuple objects (user identification number, short behavior-trunk stream) are formed, for example (user1, [cd, xguake, fg, vim, ls]); this yields the behavior stream data of the user, which includes multiple behavior sequences.
By extracting the behavior trunk and filtering the date, the above embodiment simplifies the original data and obtains a more concise behavior stream data.
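A simplified sketch of this preprocessing, grouping the records per user, cutting fixed sliding time windows and keeping only the command name; the window parameters and record layout are illustrative assumptions:

    from collections import defaultdict

    def build_behavior_streams(records, window_seconds=3600, step_seconds=1800):
        """records: iterable of (user_id, command_line, timestamp) triples, timestamp in seconds."""
        per_user = defaultdict(list)
        for user_id, command_line, ts in records:
            parts = command_line.split()
            name = parts[0] if parts else "None"                 # behavior-trunk extraction
            per_user[user_id].append((ts, name))

        streams = {}
        for user_id, events in per_user.items():
            events.sort()                                        # order records by time
            start, end = events[0][0], events[-1][0]
            sequences, t = [], start
            while t <= end:                                      # sliding time window
                window = [name for ts, name in events if t <= ts < t + window_seconds]
                sequences.append(window or ["None"])             # fill empty windows
                t += step_seconds
            streams[user_id] = sequences                         # the user's behavior stream data
        return streams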
In another embodiment of the present invention, the process of collecting negative samples from the behavior data of the other users includes:
sampling k negative samples from the behavior data streams of the other users according to the distribution p_nu(s), where
p_nu(s) = ( n_s - n_{s,u} ) / ( Σ_{u_i ∈ U∖{u}} Σ_{s_j ∈ S_{u_i}} n_{s_j,u_i} - Σ_{s_j ∈ S_u} n_{s_j,u} )
In the formula, n_s is the number of times the behavior sequence s contained in the behavior stream data occurs in the positive samples; n_{s,u} denotes the number of times the behavior sequence s occurs for user u; S_{u_i} is the set of behavior sequences of user u_i; Σ_{u_i ∈ U∖{u}} Σ_{s_j ∈ S_{u_i}} n_{s_j,u_i} is the number of behavior sequences contained in the behavior stream data of the other users; and Σ_{s_j ∈ S_u} n_{s_j,u} is the number of behavior sequences contained in the behavior stream data of the user.
In the above embodiment, the collection of negative samples is realized through the distribution p_nu. This acquisition method is well suited to the situation, common in anomaly detection, where normal behaviors and abnormal behaviors have a large frequency gap, and it has the advantage of being simple to compute.
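A sketch of sampling k negative sequences roughly following p_nu(s), i.e. weighting each candidate sequence by how often it occurs overall minus how often it occurs for the target user; the data layout is an assumption:

    import random
    from collections import Counter

    def sample_negatives(user, streams, k):
        """streams: dict mapping user id to a list of behavior sequences (lists of commands)."""
        n_s, n_su = Counter(), Counter()
        for u, sequences in streams.items():
            for seq in sequences:
                key = tuple(seq)
                n_s[key] += 1            # n_s: occurrences of sequence s over all users
                if u == user:
                    n_su[key] += 1       # n_s,u: occurrences of sequence s for the target user

        candidates = [tuple(seq) for u, seqs in streams.items() if u != user for seq in seqs]
        weights = [max(n_s[s] - n_su[s], 0) for s in candidates]
        if sum(weights) == 0:
            weights = None               # fall back to uniform sampling over the candidates
        return random.choices(candidates, weights=weights, k=k)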
In another embodiment of the present invention, the process of initializing the parameters includes:
1) initializing the behavior-embedding lookup table V_c of each user, specifically:
converting the behavior data in the user's behavior stream data into behavior data represented by vectors;
Here, the vector-represented behavior data corresponding to each behavior data is obtained using the Word2vec tool;
calculating the inverse-document-frequency-weighted embedding v_{ck|init} of each vector-represented behavior data and assigning it to the behavior-embedding lookup table V_c, where
v_{ck|init} = idf(c_k) · v_{ck|w2v}
In the formula, v_{ck|init} denotes the initialization embedding of the behavior c_k; c_k denotes the k-th behavior in the behavior set; v_{ck|w2v} denotes the behavior embedding obtained with the Word2vec tool;
idf(c_k) = log( |U| / (1 + |{ j : c_k ∈ u_j }|) )
In the formula, idf(c_k) denotes the inverse document frequency of the behavior c_k, which measures the general importance of the command c_k; U denotes the set of users; u_j denotes the j-th user.
Here, commands 1 to N in the command-embedding lookup table V_c are all initialized in the way described above, and command 0 is initialized randomly. Although in theory all commands in the command-embedding lookup table V_c could be initialized randomly, initializing them with the vector-represented behavior data produced by the Word2vec tool accelerates the parameter learning process.
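A sketch of this initialization, assuming the per-behavior Word2vec vectors have already been trained and are supplied as a dictionary; the random range used for row 0 is an assumption:

    import math
    import numpy as np

    def init_embedding_table(vocab, w2v_vectors, user_commands, d_c=50, seed=0):
        """vocab: command -> row index (1..N); w2v_vectors: command -> vector of size d_c;
        user_commands: dict user -> set of commands that user has issued."""
        rng = np.random.default_rng(seed)
        n_users = len(user_commands)
        V_c = np.empty((len(vocab) + 1, d_c))
        V_c[0] = rng.uniform(-0.05, 0.05, size=d_c)          # row 0: random initialization (assumed range)
        for cmd, idx in vocab.items():
            df = sum(1 for cmds in user_commands.values() if cmd in cmds)
            idf = math.log(n_users / (1 + df))               # idf(c_k) = log(|U| / (1 + |{j : c_k in u_j}|))
            V_c[idx] = idf * np.asarray(w2v_vectors[cmd])    # v_ck|init = idf(c_k) * v_ck|w2v
        return V_c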
2) setting the weights w to obey a uniform distribution U[a, b], and initializing the convolution kernel W_hc, the weights W_hi of the pooling layer and the LSTM network, the connection weights W_hr between the LSTM hidden layer at time t and the hidden layer at time t+1, and the probability output layer weights W_u, where
the convolution kernel W_hc, the weights W_hi and the connection weights W_hr are each initialized from a uniform distribution whose bounds [a, b] are determined by the dimensions of the corresponding layers, and the probability output layer weights W_u are initialized according to w ~ U(-0.05, 0.05).
Here, the probability density function of the uniform distribution U[a, b] is f(w) = 1 / (b - a) for a ≤ w ≤ b, and 0 otherwise.
Through the above initialization process, the behavior data represented by character strings in the user's behavior stream data is converted into behavior data represented by vectors, values are assigned to the command-embedding lookup table V_c, and the other parameters are assigned values drawn from the uniform distributions they obey, so that every parameter is given a high-quality initial value.
An embodiment of the present invention also provides a masquerade intrusion detection device based on a deep neural network. As shown in Fig. 4, the device includes:
an acquisition unit 400, configured to obtain behavior stream data of at least two users within a preset time period;
an initialization unit 410, configured to initialize a hyper-parameter set and a parameter set, the hyper-parameter set comprising: the behavior capacity N_c of the behavior-embedding lookup table, the behavior-embedding dimension d_c, the convolution kernel size l_c, the dimension d_hc output after convolution, the pooling factor l_p, and the number of hidden-layer neurons d_hr of the LSTM network; the parameter set comprising: the behavior-embedding lookup table V_c of each user, the convolution kernel W_hc, the weights W_hi of the pooling layer and the LSTM network, the connection weights W_hr between the LSTM hidden layer at time t and the hidden layer at time t+1, and the probability output layer weights W_u;
a sample collection unit 420, configured to, for each user, collect positive samples from the behavior stream data of that user and collect negative samples from the behavior stream data of the other users, the positive samples and the negative samples forming a training data set;
a parameter value calculation unit 430, configured to calculate in turn the value of each parameter in the parameter set according to the training data set, a loss function and an optimization algorithm;
a query and adding unit 440, configured to query the user's behavior-embedding lookup table V_c for the behavior data corresponding to the user's behavior sequence to be detected, and add the behavior data to the user's behavior-embedding representation sequence;
a convolution unit 450, configured to perform the convolution operation of a convolutional neural network on the behavior-embedding representation sequence to obtain an abstract behavior sequence;
a pooling unit 460, configured to perform the pooling operation of the convolutional neural network on the abstract behavior sequence to obtain a first behavior sequence;
an LSTM artificial neural network operation unit 470, configured to perform the operations of the LSTM artificial neural network on the first behavior sequence to obtain a second behavior sequence;
a first calculation unit 480, configured to calculate the probability that the second behavior sequence is a normal behavior sequence;
a determination unit 490, configured to judge whether the probability is greater than a preset probability threshold; if it is greater, determine that the user behavior is normal, and if it is smaller, determine that the user behavior is a masquerade intrusion.
Preferably, the acquisition unit 400 includes:
an obtaining subunit, configured to obtain behavior record data of at least two users within a preset time period, the record format of the behavior record data being: user identification number, behavior description, date;
a sorting unit, configured to, for each user, sort the behavior record data of that user by time to form the behavior data object stream of that user;
an interception unit, configured to intercept data from the behavior data object stream according to a preset sliding time window to obtain the intercepted behavior data object stream, the behavior data object stream including multiple behavior sequences;
a determining unit, configured to perform behavior-trunk extraction and date filtering on the intercepted behavior data object stream to obtain the behavior stream data of the user.
Preferably, the initialization unit 410 includes:
a first initialization unit, configured to initialize the behavior-embedding lookup table V_c of each user, including:
a conversion unit, configured to convert the behavior data in the user's behavior stream data into behavior data represented by vectors;
a second calculation unit, configured to calculate the inverse-document-frequency-weighted embedding v_{ck|init} of each vector-represented behavior data and assign it to the user's behavior-embedding lookup table V_c, where
v_{ck|init} = idf(c_k) · v_{ck|w2v}
In the formula, c_k denotes the k-th behavior data in the behavior stream data; v_{ck|w2v} denotes the vector-represented behavior data obtained after conversion;
idf(c_k) = log( |U| / (1 + |{ j : c_k ∈ u_j }|) )
In the formula, idf(c_k) denotes the inverse document frequency of the behavior c_k; U denotes the set of users; u_j denotes the j-th user;
a second initialization unit, configured to initialize the convolution kernel W_hc, the weights W_hi of the pooling layer and the LSTM network, and the connection weights W_hr between the LSTM hidden layer at time t and the hidden layer at time t+1, each from a uniform distribution whose bounds are determined by the dimensions of the corresponding layers, and to initialize the probability output layer weights W_u according to w ~ U(-0.05, 0.05).
Preferably, the sample collection unit 420 includes:
a negative sample collection unit, configured to sample k negative samples from the behavior data streams of the other users according to the distribution p_nu(s), where
p_nu(s) = ( n_s - n_{s,u} ) / ( Σ_{u_i ∈ U∖{u}} Σ_{s_j ∈ S_{u_i}} n_{s_j,u_i} - Σ_{s_j ∈ S_u} n_{s_j,u} )
In the formula, n_s is the number of times the behavior sequence s occurs in the positive samples; n_{s,u} denotes the number of times the behavior sequence s occurs for user u; S_{u_i} is the set of behavior sequences of user u_i; Σ_{u_i ∈ U∖{u}} Σ_{s_j ∈ S_{u_i}} n_{s_j,u_i} is the number of behavior sequences contained in the behavior stream data of the other users; and Σ_{s_j ∈ S_u} n_{s_j,u} is the number of behavior sequences contained in the behavior stream data of the user.
Finally, it should also be noted that, in this document, relational terms such as first and second are used only to distinguish one entity or operation from another entity or operation, and do not necessarily require or imply any such actual relationship or order between those entities or operations. Moreover, the terms "comprise", "include" or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article or device that includes a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article or device. In the absence of further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article or device that includes that element.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and identical or similar parts of the embodiments may be referred to one another.
The above description of the disclosed embodiments enables those skilled in the art to implement or use the present application. Various modifications to these embodiments will be apparent to those skilled in the art, and the general principles defined herein may be implemented in other embodiments without departing from the spirit or scope of the present application. Therefore, the present application is not limited to the embodiments shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (9)

1. A masquerade intrusion detection method based on a deep neural network, characterized in that the method comprises:
obtaining behavior stream data of at least two users within a preset time period;
initializing a hyper-parameter set and a parameter set, the hyper-parameter set comprising: the behavior capacity N_c of the behavior-embedding lookup table, the behavior-embedding dimension d_c, the convolution kernel size l_c, the dimension d_hc output after convolution, the pooling factor l_p, and the number of hidden-layer neurons d_hr of the long short-term memory (LSTM) network; the parameter set comprising: the behavior-embedding lookup table V_c of each user, the convolution kernel W_hc, the weights W_hi of the pooling layer and the LSTM network, the connection weights W_hr between the LSTM hidden layer at time t and the hidden layer at time t+1, and the probability output layer weights W_u;
for each user, collecting positive samples from the behavior stream data of that user and collecting negative samples from the behavior stream data of the other users, the positive samples and the negative samples forming a training data set;
calculating in turn the value of each parameter in the parameter set, using the training data set, a loss function and an optimization algorithm;
querying the user's behavior-embedding lookup table V_c for the behavior data corresponding to the user's behavior sequence to be detected, and adding the behavior data to the user's behavior-embedding representation sequence;
performing the convolution operation of a convolutional neural network on the behavior-embedding representation sequence to obtain an abstract behavior sequence;
performing the pooling operation of the convolutional neural network on the abstract behavior sequence to obtain a first behavior sequence;
performing the operations of the LSTM artificial neural network on the first behavior sequence to obtain a second behavior sequence;
calculating the probability that the second behavior sequence is a normal behavior sequence;
judging whether the probability is greater than a preset probability threshold; if it is greater, determining that the user behavior is normal, and if it is smaller, determining that the user behavior is a masquerade intrusion.
2. The method according to claim 1, characterized in that obtaining the behavior data of at least two users within a preset time period comprises:
obtaining behavior record data of at least two users within a preset time period, the record format of the behavior record data being: user identification number, behavior description, date;
for each user, sorting the behavior record data of that user by time to form the behavior data object stream of that user;
intercepting data from the behavior data object stream according to a preset sliding time window to obtain the intercepted behavior data object stream, the intercepted data object stream including multiple behavior sequences;
performing behavior-trunk extraction and date filtering on the intercepted behavior data object stream to obtain the behavior stream data of the user.
3. The method according to claim 1 or 2, characterized in that initializing the parameters comprises:
initializing the behavior-embedding lookup table V_c of each user, including:
converting the behavior data in the user's behavior stream data into behavior data represented by vectors;
calculating the inverse-document-frequency-weighted embedding v_{ck|init} of each vector-represented behavior data and assigning it to the user's behavior-embedding lookup table V_c, where
v_{ck|init} = idf(c_k) · v_{ck|w2v}
In the formula, c_k denotes the k-th behavior data in the behavior stream data and v_{ck|w2v} denotes the behavior data represented by vectors;
idf(c_k) = log( |U| / (1 + |{ j : c_k ∈ u_j }|) )
In the formula, idf(c_k) denotes the inverse document frequency of the behavior data c_k; U denotes the set of users; u_j denotes the j-th user;
setting the weights w to obey a uniform distribution U[a, b], and initializing the convolution kernel W_hc, the weights W_hi of the pooling layer and the LSTM network, the connection weights W_hr between the LSTM hidden layer at time t and the hidden layer at time t+1, and the probability output layer weights W_u, where
the convolution kernel W_hc, the weights W_hi and the connection weights W_hr are each initialized from a uniform distribution whose bounds are determined by the dimensions of the corresponding layers, and the probability output layer weights W_u are initialized according to w ~ U(-0.05, 0.05).
4. The method according to claim 2, characterized in that collecting negative samples from the behavior data of the other users comprises:
sampling k negative samples from the behavior data streams of the other users according to the distribution p_nu(s), where
p_nu(s) = ( n_s - n_{s,u} ) / ( Σ_{u_i ∈ U∖{u}} Σ_{s_j ∈ S_{u_i}} n_{s_j,u_i} - Σ_{s_j ∈ S_u} n_{s_j,u} )
In the formula, n_s is the number of times the behavior sequence s occurs in the positive samples; n_{s,u} denotes the number of times the behavior sequence s occurs for user u; S_{u_i} is the set of behavior sequences of user u_i; Σ_{u_i ∈ U∖{u}} Σ_{s_j ∈ S_{u_i}} n_{s_j,u_i} is the number of behavior sequences contained in the behavior stream data of the other users; and Σ_{s_j ∈ S_u} n_{s_j,u} is the number of behavior sequences contained in the behavior stream data of the user.
5. The method according to claim 1, characterized in that querying the user's behavior-embedding lookup table V_c for the behavior data corresponding to the user's behavior sequence to be detected comprises:
obtaining the behavior data in the behavior sequence to be detected one by one, and querying the user's behavior-embedding lookup table V_c for the behavior data corresponding to each behavior data in the behavior sequence to be detected;
if no corresponding entry is found, using the behavior data located at the first position of the behavior-embedding lookup table V_c as the behavior data corresponding to the behavior data in the behavior sequence to be detected.
6. A masquerade intrusion detection device based on a deep neural network, characterized in that the device comprises:
an acquisition unit, configured to obtain behavior stream data of at least two users within a preset time period;
an initialization unit, configured to initialize a hyper-parameter set and a parameter set, the hyper-parameter set comprising: the behavior capacity N_c of the behavior-embedding lookup table, the behavior-embedding dimension d_c, the convolution kernel size l_c, the dimension d_hc output after convolution, the pooling factor l_p, and the number of hidden-layer neurons d_hr of the LSTM network; the parameter set comprising: the behavior-embedding lookup table V_c of each user, the convolution kernel W_hc, the weights W_hi of the pooling layer and the LSTM network, the connection weights W_hr between the LSTM hidden layer at time t and the hidden layer at time t+1, and the probability output layer weights W_u;
a sample collection unit, configured to, for each user, collect positive samples from the behavior stream data of that user and collect negative samples from the behavior stream data of the other users, the positive samples and the negative samples forming a training data set;
a parameter value calculation unit, configured to calculate in turn the value of each parameter in the parameter set according to the training data set, a loss function and an optimization algorithm;
a query and adding unit, configured to query the user's behavior-embedding lookup table V_c for the behavior data corresponding to the user's behavior sequence to be detected, and add the behavior data to the user's behavior-embedding representation sequence;
a convolution unit, configured to perform the convolution operation of a convolutional neural network on the behavior-embedding representation sequence to obtain an abstract behavior sequence;
a pooling unit, configured to perform the pooling operation of the convolutional neural network on the abstract behavior sequence to obtain a first behavior sequence;
an LSTM artificial neural network operation unit, configured to perform the operations of the LSTM artificial neural network on the first behavior sequence to obtain a second behavior sequence;
a first calculation unit, configured to calculate the probability that the second behavior sequence is a normal behavior sequence;
a determination unit, configured to judge whether the probability is greater than a preset probability threshold; if it is greater, determine that the user behavior is normal, and if it is smaller, determine that the user behavior is a masquerade intrusion.
7. The device as claimed in claim 6, characterized in that the acquisition unit comprises:
an acquisition subunit, configured to acquire behavior record data of at least two users within the preset time period, the record format of the behavior record data being: user identification number, behavior description, date;
a sorting unit, configured to, for each user, sort the behavior record data of that user by time to form the behavioral data object data stream of that user;
an interception unit, configured to intercept the behavioral data object data stream according to a preset sliding time window to obtain an intercepted behavioral data object data stream, the intercepted behavioral data object data stream comprising a plurality of behavior sequences;
a determination unit, configured to perform behavior trunk extraction and date filtering on the intercepted behavioral data object data stream to obtain the behavior stream data of that user.
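The acquisition pipeline of claim 7 (group records per user, sort by time, cut sliding windows, keep only the behavior trunk) can be sketched as below; the record layout as tuples, the function name build_behavior_sequences and the step parameter are assumptions:

    from collections import defaultdict

    def build_behavior_sequences(records, window_size, step=1):
        # records: iterable of (user_id, behavior_description, date) tuples,
        # matching the record format named in the claim; `step` is an assumption.
        per_user = defaultdict(list)
        for user_id, behavior, date in records:
            per_user[user_id].append((date, behavior))

        sequences = {}
        for user_id, events in per_user.items():
            events.sort(key=lambda e: e[0])                # sort the user's records by time
            stream = [b for _, b in events]                # keep the behavior trunk, drop the date
            sequences[user_id] = [
                stream[i:i + window_size]                  # sliding-time-window interception
                for i in range(0, max(len(stream) - window_size, 0) + 1, step)
            ]
        return sequences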
8. The device as claimed in claim 6, characterized in that the initialization unit comprises:
a first initialization unit, configured to initialize the behavior embedding lookup table V_c corresponding to each user, comprising:
a conversion unit, configured to convert the behavioral data in the behavior stream data of the user into behavioral data characterized by vectors;
a second calculation unit, configured to calculate the inverse-document-frequency-weighted representation v_{c_k|init} of each behavioral datum characterized by a vector, and to assign it to the behavior embedding lookup table V_c corresponding to the user; wherein,
v_{c_k|init} = idf(c_k) \cdot v_{c_k|w2v};
In the formula, c_k denotes the k-th behavioral datum in the behavior stream data, and v_{c_k|w2v} denotes that behavioral datum characterized by a vector after conversion;
idf(c_k) = \log \frac{|U|}{1 + |\{ j : c_k \in u_j \}|}
In the formula, idf(c_k) denotes the inverse document frequency of behavior c_k, U denotes the set of users, and u_j denotes the j-th user;
a second initialization unit, configured to initialize the convolution kernel W_hc, the weight W_hi of the pooling layer and the LSTM network, the connection weight W_hr between the LSTM hidden layer at time t and the hidden layer at time t+1, and the weight W_u of the probability output layer, each according to its corresponding initialization distribution.
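The initialization of claim 8 weights each word2vec-style behavior vector by its inverse document frequency. A minimal sketch, assuming the vectors are already available as a dictionary and leaving the word2vec conversion itself out of scope:

    import math
    import numpy as np

    def init_embedding_table(user_behaviors, w2v_vectors):
        # user_behaviors: dict mapping each user to the set of behaviors in their stream
        # w2v_vectors:    dict mapping each behavior c_k to its vector v_{c_k|w2v}
        #                 (obtained from a word2vec-style conversion, outside this sketch)
        num_users = len(user_behaviors)                                   # |U|
        table = {}
        for behavior, vec in w2v_vectors.items():
            df = sum(1 for bs in user_behaviors.values() if behavior in bs)
            idf = math.log(num_users / (1 + df))                          # idf(c_k)
            table[behavior] = idf * np.asarray(vec, dtype=float)          # v_{c_k|init}
        return table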
9. The device as claimed in claim 7, characterized in that the sample collection unit comprises:
a negative sample collection unit, configured to sample k negative samples from the behavioral data streams of the users other than the user according to the distribution p_{nu}(s), wherein,
p_{nu}(s) = \frac{n_s - n_{s,u}}{\sum_{u_i \in \{U - u\}} \sum_{s_j \in S_{u_i}} n_{s_j, u_i} - \sum_{s_j \in S_u} n_{s_j, u}}
In the formula, n_s is the number of times the behavior sequence s occurs in the positive samples; n_{s,u} denotes the number of times the sequence s occurs for user u; S_{u_i} is the set of behavior sequences of user u_i; the first term of the denominator is the number of behavior sequences contained in the behavior stream data of the users other than the user, and the second term is the number of behavior sequences contained in the behavior stream data of the user.
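Because p_{nu}(s) shares a single normalizing denominator across all candidate sequences, sampling can be done with per-sequence weights n_s - n_{s,u}. A sketch under that reading, with the data layout and the interpretation of n_s as a count over all users' streams taken as assumptions:

    import random
    from collections import Counter

    def sample_negatives(seq_counts, user, k):
        # seq_counts: dict mapping each user to a Counter of behavior sequences
        #             (sequences stored as tuples); this structure is an assumption,
        #             as is reading n_s as the count of s over all users' streams.
        totals = Counter()
        for counts in seq_counts.values():
            totals.update(counts)
        own = seq_counts.get(user, Counter())

        candidates, weights = [], []
        for s, n_s in totals.items():
            outside = n_s - own.get(s, 0)          # numerator of p_nu(s): n_s - n_{s,u}
            if outside > 0:
                candidates.append(s)
                weights.append(outside)            # the common denominator cancels out
        return random.choices(candidates, weights=weights, k=k)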
CN201710208301.7A 2017-03-31 2017-03-31 Disguised intrusion detection method and device based on deep neural network Active CN106951783B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710208301.7A CN106951783B (en) 2017-03-31 2017-03-31 Disguised intrusion detection method and device based on deep neural network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710208301.7A CN106951783B (en) 2017-03-31 2017-03-31 Disguised intrusion detection method and device based on deep neural network

Publications (2)

Publication Number Publication Date
CN106951783A true CN106951783A (en) 2017-07-14
CN106951783B CN106951783B (en) 2021-06-01

Family

ID=59474204

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710208301.7A Active CN106951783B (en) 2017-03-31 2017-03-31 Disguised intrusion detection method and device based on deep neural network

Country Status (1)

Country Link
CN (1) CN106951783B (en)



Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103870751A (en) * 2012-12-18 2014-06-18 中国移动通信集团山东有限公司 Method and system for intrusion detection
CN105224872A (en) * 2015-09-30 2016-01-06 河南科技大学 A kind of user's anomaly detection method based on neural network clustering
CN105844239A (en) * 2016-03-23 2016-08-10 北京邮电大学 Method for detecting riot and terror videos based on CNN and LSTM
CN105869016A (en) * 2016-03-28 2016-08-17 天津中科智能识别产业技术研究院有限公司 Method for estimating click through rate based on convolution neural network
CN106411597A (en) * 2016-10-14 2017-02-15 广东工业大学 Network traffic abnormality detection method and system

Cited By (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107729927A (en) * 2017-09-30 2018-02-23 南京理工大学 A kind of mobile phone application class method based on LSTM neutral nets
CN107729927B (en) * 2017-09-30 2020-12-18 南京理工大学 LSTM neural network-based mobile phone application classification method
GB2568965B (en) * 2017-12-04 2020-05-27 British Telecomm Software container application security
GB2568965A (en) * 2017-12-04 2019-06-05 British Telecomm Software container application security
US11860994B2 (en) 2017-12-04 2024-01-02 British Telecommunications Public Limited Company Software container application security
CN107992746A (en) * 2017-12-14 2018-05-04 华中师范大学 Malicious act method for digging and device
CN110084356A (en) * 2018-01-26 2019-08-02 北京深鉴智能科技有限公司 A kind of deep neural network data processing method and device
CN110084356B (en) * 2018-01-26 2021-02-02 赛灵思电子科技(北京)有限公司 Deep neural network data processing method and device
CN108304933A (en) * 2018-01-29 2018-07-20 北京师范大学 A kind of complementing method and complementing device of knowledge base
CN108684043A (en) * 2018-05-15 2018-10-19 南京邮电大学 The abnormal user detection method of deep neural network based on minimum risk
CN108898015B (en) * 2018-06-26 2021-07-27 暨南大学 Application layer dynamic intrusion detection system and detection method based on artificial intelligence
CN108898015A (en) * 2018-06-26 2018-11-27 暨南大学 Application layer dynamic intruding detection system and detection method based on artificial intelligence
CN109067773A (en) * 2018-09-10 2018-12-21 成都信息工程大学 A kind of vehicle-mounted CAN network inbreak detection method neural network based and system
CN109067773B (en) * 2018-09-10 2020-10-27 成都信息工程大学 Vehicle-mounted CAN network intrusion detection method and system based on neural network
CN109284829A (en) * 2018-09-25 2019-01-29 艾凯克斯(嘉兴)信息科技有限公司 Recognition with Recurrent Neural Network based on evaluation network
CN109302410B (en) * 2018-11-01 2021-06-08 桂林电子科技大学 Method and system for detecting abnormal behavior of internal user and computer storage medium
CN109302410A (en) * 2018-11-01 2019-02-01 桂林电子科技大学 A kind of internal user anomaly detection method, system and computer storage medium
CN109685314B (en) * 2018-11-20 2021-10-29 中国电力科学研究院有限公司 Non-intrusive load decomposition method and system based on long-term and short-term memory network
CN109685314A (en) * 2018-11-20 2019-04-26 中国电力科学研究院有限公司 A kind of non-intruding load decomposition method and system based on shot and long term memory network
CN109561084A (en) * 2018-11-20 2019-04-02 四川长虹电器股份有限公司 URL parameter rejecting outliers method based on LSTM autoencoder network
CN109598128A (en) * 2018-12-11 2019-04-09 郑州云海信息技术有限公司 A kind of method and device of scanography
CN113906704A (en) * 2019-05-30 2022-01-07 诺基亚技术有限公司 Learning in a communication system
CN110378430A (en) * 2019-07-23 2019-10-25 广东工业大学 A kind of method and system of the network invasion monitoring based on multi-model fusion
CN110378430B (en) * 2019-07-23 2023-07-25 广东工业大学 Network intrusion detection method and system based on multi-model fusion
CN111126515A (en) * 2020-03-30 2020-05-08 腾讯科技(深圳)有限公司 Model training method based on artificial intelligence and related device
CN111772422A (en) * 2020-06-12 2020-10-16 广州城建职业学院 Intelligent crib
WO2022011977A1 (en) * 2020-07-15 2022-01-20 中国科学院深圳先进技术研究院 Network anomaly detection method and system, terminal and storage medium
CN112688946A (en) * 2020-12-24 2021-04-20 工业信息安全(四川)创新中心有限公司 Method, module, storage medium, device and system for constructing abnormality detection features
CN112688946B (en) * 2020-12-24 2022-06-24 工业信息安全(四川)创新中心有限公司 Method, module, storage medium, device and system for constructing abnormality detection features
CN112995331A (en) * 2021-03-25 2021-06-18 绿盟科技集团股份有限公司 User behavior threat detection method and device and computing equipment
CN112995331B (en) * 2021-03-25 2022-11-22 绿盟科技集团股份有限公司 User behavior threat detection method and device and computing equipment

Also Published As

Publication number Publication date
CN106951783B (en) 2021-06-01

Similar Documents

Publication Publication Date Title
CN106951783A (en) A kind of Method for Masquerade Intrusion Detection and device based on deep neural network
CN109639710B (en) Network attack defense method based on countermeasure training
Ang et al. RSPOP: Rough set–based pseudo outer-product Fuzzy rule identification algorithm
CN110245801A (en) A kind of Methods of electric load forecasting and system based on combination mining model
CN108427921A (en) A kind of face identification method based on convolutional neural networks
CN108427985A (en) A kind of plug-in hybrid vehicle energy management method based on deeply study
CN108197648A (en) A kind of Fault Diagnosis Method of Hydro-generating Unit and system based on LSTM deep learning models
CN102622515B (en) A kind of weather prediction method
CN104598611B (en) The method and system being ranked up to search entry
CN108566627A (en) A kind of method and system identifying fraud text message using deep learning
CN110321361A (en) Test question recommendation and judgment method based on improved LSTM neural network model
CN108596327A (en) A kind of seismic velocity spectrum artificial intelligence pick-up method based on deep learning
CN107832718A (en) Finger vena anti false authentication method and system based on self-encoding encoder
CN109523021A (en) A kind of dynamic network Structure Prediction Methods based on long memory network in short-term
CN113111349B (en) Backdoor attack defense method based on thermodynamic diagram, reverse engineering and model pruning
Ponnambalam et al. Minimizing variance of reservoir systems operations benefits using soft computing tools
CN115017511A (en) Source code vulnerability detection method and device and storage medium
CN116703624A (en) Block chain-based enterprise operation diagnosis system and method
Gultom et al. Application of The Levenberg Marquardt Method In Predict The Amount of Criminality in Pematangsiantar City
CN113298131B (en) Attention mechanism-based time sequence data missing value interpolation method
CN114596726A (en) Parking position prediction method based on interpretable space-time attention mechanism
CN108470212A (en) A kind of efficient LSTM design methods that can utilize incident duration
CN112989354A (en) Attack detection method based on neural network and focus loss
CN115174268B (en) Intrusion detection method based on structured regular term
CN116632834A (en) Short-term power load prediction method based on SSA-BiGRU-Attention

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant