CN111131314A - Network behavior detection method and device, computer equipment and storage medium - Google Patents

Network behavior detection method and device, computer equipment and storage medium Download PDF

Info

Publication number
CN111131314A
CN111131314A (application CN201911418725.1A); granted as CN111131314B
Authority
CN
China
Prior art keywords
behavior
image
detected
historical
network
Prior art date
Legal status
Granted
Application number
CN201911418725.1A
Other languages
Chinese (zh)
Other versions
CN111131314B (en)
Inventor
王占一
马江波
Current Assignee
Qax Technology Group Inc
Secworld Information Technology Beijing Co Ltd
Original Assignee
Qianxin Technology Group Co Ltd
Secworld Information Technology Beijing Co Ltd
Priority date
Filing date
Publication date
Application filed by Qianxin Technology Group Co Ltd and Secworld Information Technology Beijing Co Ltd
Priority to CN201911418725.1A
Publication of CN111131314A
Application granted
Publication of CN111131314B
Legal status: Active
Anticipated expiration

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00 - Network architectures or network communication protocols for network security
    • H04L63/14 - Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic
    • H04L63/1408 - Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic by monitoring network traffic
    • H04L63/1416 - Event detection, e.g. attack signature detection
    • H04L63/1425 - Traffic logging, e.g. anomaly detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Hardware Design (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a network behavior detection method and device, computer equipment and a storage medium. The network behavior detection method comprises the following steps: acquiring behavior data of the network behavior of a target subject in a unit time window to be detected and mapping the behavior data into a behavior image to obtain an image to be detected; acquiring behavior data of the network behavior of the target subject in historical unit time windows and mapping the behavior data into behavior images to obtain historical images; extracting a feature vector of the image to be detected to obtain a feature vector to be detected, and extracting feature vectors of the historical images to obtain historical feature vectors; inputting the historical feature vector group into a preset behavior prediction model to obtain a predicted feature vector; and comparing the feature vector to be detected with the predicted feature vector to determine whether the network behavior of the target subject in the unit time window to be detected is abnormal. By the method and the device, the accuracy of detecting unknown abnormal behavior can be improved.

Description

Network behavior detection method and device, computer equipment and storage medium
Technical Field
The present invention relates to the field of abnormal network behavior detection technologies, and in particular, to a method and an apparatus for detecting a network behavior, a computer device, and a storage medium.
Background
In recent years, network security incidents have occurred continuously around the world, and network security problems have become increasingly prominent. As the volume of related data grows explosively, network attack methods are becoming more and more complex and changeable. On the defensive side, the discovery and handling of known threats is increasingly well developed, while few methods are available against unknown threats; most of them rely on the experience of security personnel and on existing tools or products for analysis, and such methods cannot meet the requirements of the industry.
Meanwhile, artificial intelligence technology has developed rapidly, and machine learning and its branch, deep learning, have made great breakthroughs in fields such as computer vision, speech recognition and natural language processing. More and more people in academia and industry are beginning to apply artificial intelligence techniques in an attempt to solve problems in network security. With its advantages of automation, intelligence and large-scale computing capability, artificial intelligence can rapidly examine millions, tens of millions or even hundreds of millions of events to discover security threats.
In the prior art, when abnormal network behavior is detected based on artificial intelligence, features of the network behavior are usually extracted according to manual experience. Specifically, experts define features from multiple dimensions such as IP, domain name, UA and time based on their experience, the extracted features are spliced into feature vectors, and an artificial intelligence model detects the network behavior based on these feature vectors.
However, this detection approach not only consumes labor, but manual experience also lags behind the constant change of behavior patterns and security events, so coverage is often incomplete, the ability to cope with unknown threats is poor, and the detection accuracy for unknown abnormal behavior is low.
Therefore, providing a network behavior detection method and apparatus, a computer device and a storage medium that improve the accuracy of detecting unknown abnormal behavior is an urgent technical problem to be solved in the art.
Disclosure of Invention
The present invention is directed to a method, an apparatus, a computer device, and a storage medium for detecting network behavior, so as to solve the above technical problems in the prior art.
In one aspect, the present invention provides a method for detecting network behavior.
The network behavior detection method comprises the following steps: acquiring behavior data of the network behavior of a target subject in a unit time window to be detected to obtain a behavior data group to be detected; acquiring behavior data of the network behavior of the target subject in historical unit time windows to obtain historical behavior data groups; mapping the behavior data group to be detected into a behavior image according to a preset mapping rule to obtain an image to be detected, and mapping each historical behavior data group into a behavior image according to the mapping rule to obtain a historical image; extracting a feature vector of the image to be detected to obtain a feature vector to be detected, and extracting feature vectors of the historical images to obtain historical feature vectors; inputting a historical feature vector group into a preset behavior prediction model to obtain a predicted feature vector, wherein the historical feature vector group comprises a plurality of historical feature vectors corresponding to continuous historical unit time windows; and comparing the feature vector to be detected with the predicted feature vector to determine whether the network behavior of the target subject in the unit time window to be detected is abnormal.
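As a reading aid only, the following Python sketch strings the claimed steps together end to end; every helper passed in (collect_behavior_data, map_to_behavior_image, extract_feature_vector, behavior_prediction_model, deviation) is a hypothetical placeholder standing for the corresponding step, not an interface defined by the patent.

```python
# Hedged sketch of the claimed detection flow; all helpers are hypothetical
# placeholders supplied by the caller, not APIs defined by the patent.

def detect_abnormal_behavior(target_subject, window_to_detect, history_windows,
                             collect_behavior_data, map_to_behavior_image,
                             extract_feature_vector, behavior_prediction_model,
                             deviation, threshold):
    # Behavior data of the window to be detected -> behavior image -> feature vector
    data_to_detect = collect_behavior_data(target_subject, window_to_detect)
    vector_to_detect = extract_feature_vector(map_to_behavior_image(data_to_detect))

    # Same mapping and extraction for each consecutive historical unit time window
    historical_vectors = [
        extract_feature_vector(map_to_behavior_image(
            collect_behavior_data(target_subject, window)))
        for window in history_windows
    ]

    # Predict the feature vector expected for the window to be detected,
    # then flag the window as abnormal if reality deviates too far from it
    predicted_vector = behavior_prediction_model(historical_vectors)
    return deviation(vector_to_detect, predicted_vector) > threshold
```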
Further, the step of extracting the feature vector of a behavior image includes: establishing an initial self-encoder, wherein the initial self-encoder comprises an input layer, an encoding layer, a decoding layer and an output layer; acquiring training behavior images, taking each training behavior image as both the input of the input layer and the output of the output layer of the initial self-encoder, and training the initial self-encoder to obtain a target self-encoder; acquiring a verification behavior image, and taking the verification behavior image as the input of the input layer of the target self-encoder to obtain the output of the output layer of the target self-encoder; comparing the verification behavior image with the output of the output layer of the target self-encoder to judge whether the target self-encoder meets the requirements; and when the target self-encoder meets the requirements, taking the behavior image as the input of the input layer of the target self-encoder and acquiring the output of the encoding layer of the target self-encoder to obtain the feature vector of the behavior image.
Further, before the step of inputting the historical feature vectors corresponding to a plurality of continuous historical unit time windows into a preset behavior prediction model to obtain the predicted feature vector, the network behavior detection method further includes: acquiring network data of the target subject; extracting behavior data groups from the network data, wherein each behavior data group comprises behavior data of the network behavior of the target subject in one unit time window; mapping each behavior data group into a behavior image according to the mapping rule to obtain a training behavior image; extracting a feature vector of the training behavior image to obtain a training feature vector, wherein the training feature vector is an L×1 matrix; sorting the plurality of training feature vectors corresponding to the network data in time order to obtain a training feature vector queue; taking values from and splicing the training feature vector queue in a sliding-window manner to obtain a plurality of training samples, wherein each training sample comprises a spliced vector and a prediction vector, the spliced vector is an L×T matrix formed by the T training feature vectors in the sliding window, and the prediction vector is the training feature vector adjacent to and behind the sliding window; establishing an initial behavior prediction model, wherein the initial behavior prediction model comprises an input layer, an intermediate layer and an output layer; and taking the spliced vector in each training sample as the input of the input layer of the initial behavior prediction model and the prediction vector in the training sample as the output of the output layer of the initial behavior prediction model, and training the initial behavior prediction model to obtain the behavior prediction model.
Further, the step of comparing the feature vector to be detected with the predicted feature vector to determine whether the target subject network behavior in the unit time window to be detected is abnormal includes: calculating the root mean square error of the feature vector to be detected and the predicted feature vector; judging whether the root mean square error is larger than a preset error threshold value; and when the root mean square error is larger than a preset error threshold, determining that the network behavior of the target subject in the unit time window to be detected is abnormal.
Further, the network behavior detection method further includes: generating an alarm when the network behavior of the target subject is abnormal in r of s consecutive unit time windows to be detected, wherein r is less than or equal to s.
Further, the behavior data includes a behavior opposite end, a behavior attribute and a behavior time, the behavior attribute being an attribute of the network behavior generated between the target subject and the behavior opposite end, and the step of mapping the behavior data group into an image according to a preset mapping rule includes: for each piece of behavior data, mapping the behavior opposite end to the position coordinates of a point in a preset template image, mapping the behavior attribute to a morphological attribute of the point, and displaying a point with the morphological attribute at the position coordinates to obtain an image point corresponding to the behavior data; and setting a connecting line between associated points, wherein image points corresponding to two pieces of behavior data whose behavior times satisfy a preset association relationship are mutually associated points.
Further, the association point is an image point corresponding to two pieces of behavior data adjacent in behavior time; the connecting line is a vector, and the direction of the vector represents the sequence of the behavior time corresponding to the associated point.
Further, the step of mapping the behavior data group into an image according to a preset mapping rule further includes: counting the number of occurrences of the same associated points corresponding to the behavior data group; and mapping the number to a morphological attribute of the line, and the step of setting a connecting line between the associated points includes: setting the connecting line between the associated points according to the morphological attribute of the line.
Further, the step of mapping the behavior opposite end to the position coordinates of a point in the preset template image includes: presetting position coordinates of a plurality of points in the preset template image; allocating end identifiers to the preset position coordinates by sequential allocation, random allocation or farthest-distance allocation, and establishing a one-to-one first mapping relation between the position coordinates and the end identifiers; and determining the position coordinates corresponding to the behavior opposite end according to the end identifier of the behavior opposite end and the first mapping relation.
Further, the step of mapping the behavior attribute to the morphological attribute of the point includes: determining the number i of the behavior attribute; determining the RGB value range of a pixel, wherein the R value of a pixel ranges from 0 to r, the G value from 0 to g, and the B value from 0 to b; calculating the hexadecimal number x corresponding to the number i by the following formula:

x = HEX( ⌊ i · (r · g · b - 1) / c ⌋ )

wherein ⌊ ⌋ represents rounding down, c represents the number of categories of behavior attributes, and HEX() represents conversion of a number into a hexadecimal number; and converting every two digits of the hexadecimal number x into a decimal number to obtain the RGB values of the pixels of the point.
Further, the morphological attribute of the line is an RGB value of a pixel of the line, and the step of mapping the number to the morphological attribute of the line includes: converting the number from a decimal number to a hexadecimal number; each two digits of the hexadecimal number are converted into decimal numbers to obtain the RGB values of the pixels of the line.
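For illustration, a small Python sketch of the two number-to-colour conversions described above follows; it assumes the attribute formula spreads the c category numbers evenly over the r·g·b - 1 non-background colours and zero-pads the hexadecimal number to six digits, assumptions made for the example rather than stated verbatim in the patent.

```python
def attribute_to_rgb(i: int, c: int, r: int = 256, g: int = 256, b: int = 256):
    """Map behavior-attribute number i (1..c) to the (R, G, B) value of a point,
    assuming x = HEX(floor(i * (r*g*b - 1) / c)) as the mapping formula."""
    code = (i * (r * g * b - 1)) // c           # rounding down
    hex_x = format(code, "06x")                 # hexadecimal number x, 6 digits
    return tuple(int(hex_x[j:j + 2], 16) for j in (0, 2, 4))  # two digits per channel


def count_to_rgb(number: int):
    """Map the count of identical associated points to the (R, G, B) value of the
    connecting line: decimal -> hexadecimal -> one channel per two hex digits."""
    hex_x = format(number, "06x")
    return tuple(int(hex_x[j:j + 2], 16) for j in (0, 2, 4))
```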
In another aspect, the present invention provides a device for detecting network behavior.
The network behavior detection device comprises: a first acquisition module, configured to acquire behavior data of the network behavior of a target subject in a unit time window to be detected to obtain a behavior data group to be detected; a second acquisition module, configured to acquire behavior data of the network behavior of the target subject in historical unit time windows to obtain historical behavior data groups; a mapping module, configured to map the behavior data group to be detected into a behavior image according to a preset mapping rule to obtain an image to be detected, and to map each historical behavior data group into a behavior image according to the mapping rule to obtain a historical image; an extraction module, configured to extract a feature vector of the image to be detected to obtain a feature vector to be detected, and to extract feature vectors of the historical images to obtain historical feature vectors; a prediction module, configured to input a historical feature vector group into a preset behavior prediction model to obtain a predicted feature vector, wherein the historical feature vector group is a plurality of historical feature vectors corresponding to continuous historical unit time windows; and a comparison module, configured to compare the feature vector to be detected with the predicted feature vector to determine whether the network behavior of the target subject in the unit time window to be detected is abnormal.
To achieve the above object, the present invention also provides a computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the above method when executing the computer program.
To achieve the above object, the present invention also provides a computer-readable storage medium having stored thereon a computer program which, when being executed by a processor, carries out the steps of the above method.
The invention provides a network behavior detection method and device, computer equipment and a storage medium. When the network behavior of a target subject in a certain time period is to be detected, the time period is taken as the unit time window to be detected. Behavior data of the network behavior of the target subject in the unit time window to be detected is acquired to obtain a behavior data group to be detected, and behavior data of the network behavior of the target subject in a plurality of historical unit time windows before the unit time window to be detected is acquired to obtain a historical behavior data group corresponding to each historical unit time window. Image mapping and feature vector extraction are performed on the behavior data group to be detected to obtain a feature vector to be detected, and the same image mapping and feature vector extraction are performed on each historical behavior data group to obtain historical feature vectors. Using the plurality of historical feature vectors, a preset behavior prediction model predicts the feature vector corresponding to normal network behavior of the target subject in the unit time window to be detected. Finally, the predicted feature vector is compared with the feature vector to be detected corresponding to the network behavior actually occurring in the unit time window to be detected, and whether the network behavior actually occurring in the unit time window to be detected is abnormal is determined based on the deviation between the two. It can be seen that, according to the invention, the behavior data group is mapped into an image and abnormal behavior detection is realized based on extraction and identification of the feature vector of the image: theoretically normal network behavior is predicted from normal historical network behavior, and the predicted behavior is then compared with the real behavior to determine whether the real behavior is abnormal. This process does not depend on expert experience and can detect unknown abnormal behavior, so the accuracy of detecting unknown abnormal behavior is improved.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
fig. 1 is a flowchart of a method for detecting network behavior according to an embodiment of the present invention;
fig. 2 is a flowchart illustrating an abnormal behavior detection system according to a second embodiment of the present invention;
FIG. 3 is a schematic diagram of a preset template image according to a second embodiment of the present invention;
fig. 4 to fig. 5 are diagrams illustrating the effect of processing network data into images according to a second embodiment of the present invention;
fig. 6 is a schematic diagram illustrating feature combinations and training of normal behavior models in the network behavior detection method according to the second embodiment of the present invention;
fig. 7 is a schematic diagram illustrating a normal behavior prediction process in the network behavior detection method according to the second embodiment of the present invention;
fig. 8 is a block diagram of a network behavior detection apparatus according to a third embodiment of the present invention;
fig. 9 is a hardware configuration diagram of a computer device according to a fourth embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In order to improve the accuracy of detecting unknown abnormal behavior, the invention provides a network behavior detection method and device, computer equipment and a storage medium. In the detection method, when the network behavior of a target subject in a target time period is to be detected, the time period is taken as the unit time window to be detected, and behavior data of the network behavior of the target subject in this time window is acquired to obtain a behavior data group to be detected; behavior data of the network behavior of the target subject in historical unit time windows is then acquired to obtain historical behavior data groups. The behavior data group to be detected is mapped into a behavior image according to a preset mapping rule to obtain an image to be detected, and the historical behavior data groups are mapped into behavior images according to the mapping rule to obtain historical images. A feature vector of the image to be detected is extracted to obtain a feature vector to be detected, and feature vectors of the historical images are extracted to obtain historical feature vectors. The historical feature vectors corresponding to a plurality of continuous historical unit time windows are input into a preset behavior prediction model to obtain a predicted feature vector, which is the predicted theoretical feature vector for the target time period. Finally, the feature vector to be detected is compared with the predicted feature vector to determine whether the network behavior of the target subject in the unit time window to be detected is abnormal; when the deviation of the feature vector to be detected from the predicted feature vector exceeds the normal deviation, the network behavior of the target subject in the unit time window to be detected is abnormal. The behavior data group is mapped into an image, and abnormal behavior detection is realized based on extraction and identification of feature vectors of the image; specifically, the feature vector corresponding to the theoretical behavior in the detection time period is predicted from the feature vectors corresponding to historical network behavior, and the predicted theoretical data is then compared with the real data to determine whether the real behavior is abnormal.
Specific embodiments of the method, apparatus, computer device and storage medium for detecting network behavior provided by the present invention will be described in detail below.
Embodiment One
The embodiment of the invention provides a network behavior detection method. In the method, behavior data representing network behavior in network data is imaged, feature vectors are extracted from the images, the feature vector corresponding to the theoretical behavior in the detection time period is predicted from the feature vectors corresponding to historical network behavior, and the predicted theoretical data is compared with the real data to determine whether the real behavior is abnormal. This process does not depend on expert experience and can detect unknown abnormal behavior, so the accuracy of detecting unknown abnormal behavior is improved. Referring to fig. 1, the method comprises the following steps.
Step S101: and acquiring behavior data of the network behavior of the target subject in the unit time window to be detected to obtain a behavior data group to be detected.
When it is necessary to detect whether the network behavior of a certain target subject within a period of time is abnormal, the period of time is taken as the unit time window to be detected, behavior data of the network behavior of the target subject within the unit time window to be detected is acquired, and the acquired behavior data is taken as the behavior data group to be detected. Optionally, each network behavior corresponds to one piece of behavior data, so if the target subject has multiple network behaviors in the unit time window to be detected, the behavior data group to be detected includes multiple pieces of behavior data. For example, when detecting whether the network behavior of a certain IP address on a certain day is normal, the behavior data of the network behavior of that IP address on that day is acquired.
A network behavior is a behavior initiated by the target subject toward another subject in a communication network. Optionally, the behavior data may include the receiving end of the network behavior (also called the behavior peer) and the behavior attribute of the network behavior; the behavior peer may specifically be information that uniquely identifies the receiving end, such as an identifier, a number or an address of the receiving end, and the behavior attribute is an attribute of the network behavior. Network behavior includes, but is not limited to, actions involving data interaction, such as connecting, logging in, querying, writing and mailing.
Optionally, when behavior data of a network behavior of a target subject in a unit time window to be detected is obtained, the behavior data may be obtained by obtaining network data such as network traffic data and network log data in the unit time window to be detected and extracting data representing the network behavior from the network data. Specifically, the network data can be obtained by packet capturing, log reading and the like.
Step S102: and acquiring behavior data of the network behavior of the target subject in the historical unit time window to obtain a historical behavior data set.
In step S102, the historical unit time window has the same duration as the unit time window to be detected and lies in the history of the unit time window to be detected; for example, if the unit time window to be detected is September 5, 2018, a historical unit time window is a day before September 5, 2018. Optionally, the historical unit time windows are the times adjacent to the unit time window to be detected.
When acquiring the behavior data of the network behavior of the target subject in the historical unit time window, the same acquiring method as that used when acquiring the behavior data in step S101 may be used, and the historical behavior data group is the behavior data corresponding to the network behavior of the target subject in one historical unit time window.
Step S103: and mapping the behavior data group to be detected into a behavior image according to a preset mapping rule to obtain an image to be detected, and mapping the historical behavior data group into the behavior image according to the mapping rule to obtain a historical image.
In step S103, the behavior data group to be detected and the historical behavior data group are respectively mapped into behavior images according to the mapping rules, that is, the behavior data are imaged, and the behavior data are represented by the features on the images, so that the network behavior can be identified by identifying the images. In the application, a behavior image obtained by mapping a behavior data group to be detected is defined as an image to be detected, and a behavior image obtained by mapping a historical behavior data group is defined as a historical image.
Step S104: extracting the characteristic vector of the image to be detected to obtain the characteristic vector to be detected, and extracting the characteristic vector of the historical image to obtain the historical characteristic vector.
In step S104, feature vectors are extracted from the image to be detected and the history image, respectively, where the feature vectors are vectors formed by image features. In the present application, the feature vector extracted from the image to be detected is defined as a feature vector to be detected, and the feature vector extracted from the history image is defined as a history feature vector.
Step S105: and inputting the historical feature vector group into a preset behavior prediction model to obtain a predicted feature vector.
The historical feature vector group consists of the historical feature vectors corresponding to a plurality of continuous historical unit time windows.
Through the above steps S102 to S104, the historical feature vectors corresponding to a plurality of continuous historical unit time windows can be obtained, for example, the historical feature vectors corresponding to the 31 consecutive days from August 4, 2018 to September 4, 2018.
In step S105, a preset behavior prediction model is used to predict a feature vector corresponding to a network behavior in a unit time window to be detected based on a plurality of historical feature vectors, that is, theoretical data in the unit time window to be detected is predicted based on historical data. In the application, a feature vector corresponding to the predicted network behavior in the unit time window to be detected is defined as a predicted feature vector.
Step S106: and comparing the feature vector to be detected with the predicted feature vector to determine whether the network behavior of the target subject in the unit time window to be detected is abnormal.
The predicted feature vector is the feature vector corresponding to the network behavior predicted, based on the historical feature vectors, to occur under normal conditions. For abnormal network behavior, the real feature vector deviates from the feature vector of normal network behavior; therefore, the feature vector to be detected is compared with the predicted feature vector, and when the deviation between them exceeds a preset threshold, the network behavior of the target subject in the unit time window to be detected is determined to be abnormal. The preset threshold can be set according to the specific method used to calculate the deviation.
In the network behavior detection method provided in this embodiment, when the network behavior of a target subject in a certain time period is detected, the time period is taken as the unit time window to be detected. Behavior data of the network behavior of the target subject in the unit time window to be detected is acquired to obtain a behavior data group to be detected, and behavior data of the network behavior of the target subject in a plurality of historical unit time windows before the unit time window to be detected is acquired to obtain a historical behavior data group corresponding to each historical unit time window. Image mapping and feature vector extraction are performed on the behavior data group to be detected to obtain a feature vector to be detected, and the same image mapping and feature vector extraction are performed on each historical behavior data group to obtain historical feature vectors. Using the plurality of historical feature vectors, the feature vector corresponding to normal network behavior of the target subject in the unit time window to be detected is predicted through a preset behavior prediction model. Finally, the predicted feature vector is compared with the feature vector to be detected corresponding to the network behavior of the target subject in the unit time window to be detected, and whether the network behavior actually occurring in the unit time window to be detected is abnormal is determined based on the deviation between them. It can be seen that in this detection method the behavior data group is mapped into an image and abnormal behavior detection is realized based on extraction and identification of feature vectors of the image: theoretically normal network behavior is predicted from normal historical network behavior, and the predicted behavior is then compared with the real behavior to determine whether the real behavior is abnormal. The process does not depend on expert experience and can detect unknown abnormal behavior, so compared with the experience-based detection of the prior art, the accuracy of detecting unknown abnormal behavior can be improved.
Optionally, in an embodiment, the step of extracting the feature vector of a behavior image includes: establishing an initial self-encoder, wherein the initial self-encoder comprises an input layer, an encoding layer, a decoding layer and an output layer; acquiring training behavior images, taking each training behavior image as both the input of the input layer and the output of the output layer of the initial self-encoder, and training the initial self-encoder to obtain a target self-encoder; acquiring a verification behavior image, and taking the verification behavior image as the input of the input layer of the target self-encoder to obtain the output of the output layer of the target self-encoder; comparing the verification behavior image with the output of the output layer of the target self-encoder to judge whether the target self-encoder meets the requirements; and when the target self-encoder meets the requirements, taking the behavior image as the input of the input layer of the target self-encoder and acquiring the output of the encoding layer of the target self-encoder to obtain the feature vector of the behavior image.
Specifically, a self-encoder is an unsupervised method for data dimension compression and data feature expression. Optionally, the self-encoder in this embodiment may be a convolutional self-encoder, which uses the unsupervised learning manner of a conventional self-encoder and implements feature extraction by introducing operations such as convolution and pooling.
In this embodiment, the input layer of the self-encoder receives an input image, the encoding layer encodes and compresses the input image, and the decoding layer decodes the data output by the encoding layer and outputs the decoded result through the output layer. A training behavior image is a behavior image obtained by mapping a behavior data group corresponding to the network behavior of a subject (which may be the target subject or another subject) within a period of time (which may or may not be equal to the duration of a unit time window). The target self-encoder obtained after training is verified with a verification behavior image: if the target self-encoder can compress and approximately restore the verification behavior image, that is, if the deviation between the verification behavior image input to the target self-encoder and the output of its output layer is small, the target self-encoder meets the requirements and can compress and approximately restore images obtained by mapping behavior data groups corresponding to network behavior. In other words, when the deviation between the output of the output layer of the target self-encoder and the verification behavior image is within a preset acceptable deviation range, the target self-encoder meets the requirements, where the acceptable deviation range can be set according to the precision requirements of the application scenario.
When feature extraction is performed, including feature extraction of the image to be detected and of the historical images, the behavior image is taken as the input of the input layer of the target self-encoder and the output of the encoding layer of the target self-encoder is acquired. Since the encoding layer encodes and compresses the behavior image, the output of the encoding layer can represent the features of the behavior image, so the feature vector to be detected and the historical feature vectors can be obtained respectively as the feature vectors of the behavior images.
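Purely as an illustration of the structure just described, here is a minimal convolutional self-encoder sketch in Python with PyTorch; the input size (3×64×64), the channel counts and the 128-dimensional code are assumptions made for the example, not values taken from the patent.

```python
import torch.nn as nn

class BehaviorAutoencoder(nn.Module):
    """Sketch of the self-encoder: input layer -> encoding layer -> decoding
    layer -> output layer. Sizes are illustrative assumptions."""

    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(                     # encoding layer
            nn.Conv2d(3, 16, 3, stride=2, padding=1),     # 64x64 -> 32x32
            nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1),    # 32x32 -> 16x16
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 128),                 # L-dimensional feature vector
        )
        self.decoder = nn.Sequential(                     # decoding layer
            nn.Linear(128, 32 * 16 * 16),
            nn.Unflatten(1, (32, 16, 16)),
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1),
            nn.ReLU(),
            nn.ConvTranspose2d(16, 3, 4, stride=2, padding=1),
            nn.Sigmoid(),                                 # reconstructed image
        )

    def forward(self, x):
        z = self.encoder(x)       # output of the encoding layer = feature vector
        return self.decoder(z), z

# Training uses the same behavior image as both input and reconstruction target:
# reconstruction, _ = model(image); loss = nn.functional.mse_loss(reconstruction, image)
```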
Optionally, in an embodiment, before the step of inputting the historical feature vectors corresponding to a plurality of continuous historical unit time windows into a preset behavior prediction model to obtain the predicted feature vector, the network behavior detection method further includes: acquiring network data of the target subject; extracting behavior data groups from the network data, wherein each behavior data group comprises behavior data of the network behavior of the target subject in one unit time window; mapping each behavior data group into a behavior image according to the mapping rule to obtain a training behavior image; extracting a feature vector of the training behavior image to obtain a training feature vector, wherein the training feature vector is an L×1 matrix; sorting the plurality of training feature vectors corresponding to the network data in time order to obtain a training feature vector queue; taking values from and splicing the training feature vector queue in a sliding-window manner to obtain a plurality of training samples, wherein each training sample comprises a spliced vector and a prediction vector, the spliced vector is an L×T matrix formed by the T training feature vectors in the sliding window, and the prediction vector is the training feature vector adjacent to and behind the sliding window; establishing an initial behavior prediction model, wherein the initial behavior prediction model comprises an input layer, an intermediate layer and an output layer; and taking the spliced vector in each training sample as the input of the input layer of the initial behavior prediction model and the prediction vector in the training sample as the output of the output layer of the initial behavior prediction model, and training the initial behavior prediction model to obtain the behavior prediction model.
Specifically, before the behavior prediction model is used for prediction, it is obtained by training. During training, the network data of the target subject is acquired and the behavior data groups in the network data are extracted to obtain a plurality of behavior data groups. After each behavior data group is mapped into a behavior image according to the mapping rule, feature extraction is performed to obtain a plurality of feature vectors, which are defined as training feature vectors; each training feature vector is an L-dimensional column vector, that is, an L×1 matrix. The training feature vectors are sorted in time order to form a queue, and values are taken from the queue and spliced in a sliding-window manner: the T training feature vectors within the sliding window are extracted as the input sample of a training sample, and the single training feature vector adjacent to and behind the sliding window is taken as the output sample of the training sample. The initial behavior prediction model is then trained on these samples to obtain the final behavior prediction model.
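Below is a minimal NumPy sketch of the sliding-window sample construction just described; the feature dimension 128, the queue length 40 and the window length T = 31 are made-up example values, not figures from the patent.

```python
import numpy as np

def build_training_samples(feature_queue, T):
    """Slide a window of length T over a time-ordered queue of L x 1 training
    feature vectors. Each sample pairs an L x T spliced matrix (the window)
    with the L x 1 vector immediately following the window as prediction target."""
    samples = []
    for start in range(len(feature_queue) - T):
        window = feature_queue[start:start + T]          # T consecutive vectors
        spliced = np.concatenate(window, axis=1)         # shape (L, T)
        target = feature_queue[start + T]                # shape (L, 1)
        samples.append((spliced, target))
    return samples

# Example: 40 unit time windows of 128-dimensional feature vectors, T = 31
queue = [np.random.rand(128, 1) for _ in range(40)]
samples = build_training_samples(queue, T=31)            # 9 (spliced, target) pairs
```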
Optionally, in an embodiment, the step of comparing the feature vector to be detected and the predicted feature vector to determine whether the target subject network behavior in the unit time window to be detected is abnormal includes: calculating the root mean square error of the feature vector to be detected and the predicted feature vector; judging whether the root mean square error is larger than a preset error threshold value; and when the root mean square error is larger than a preset error threshold, determining that the network behavior of the target subject in the unit time window to be detected is abnormal.
Specifically, the deviation between the feature vector to be detected and the predicted feature vector is measured by the root mean square error. When the root mean square error is larger than the preset error threshold, the feature vector to be detected differs significantly from the predicted feature vector, and it is determined that the network behavior of the target subject in the unit time window to be detected is abnormal.
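A minimal sketch of this comparison, assuming plain NumPy arrays for the two feature vectors; the threshold value is application-specific and not taken from the patent.

```python
import numpy as np

def is_abnormal(vector_to_detect, predicted_vector, error_threshold):
    """Root-mean-square error between the real and predicted feature vectors;
    the behavior is flagged abnormal when the RMSE exceeds the preset threshold."""
    rmse = np.sqrt(np.mean((np.asarray(vector_to_detect) -
                            np.asarray(predicted_vector)) ** 2))
    return rmse > error_threshold
```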
Optionally, in an embodiment, the network behavior detection method further includes: generating an alarm when the network behavior of the target subject is abnormal in r of s consecutive unit time windows to be detected, wherein r is less than or equal to s.
Specifically, in this embodiment, r and s may be preset values, and r is the tolerable number of abnormal observation unit time windows. If the service requires an alarm as soon as an anomaly is detected in a single unit time window, then r = 1; if, although an anomaly is detected in a single unit time window, it is acceptable, given normal data fluctuations and service needs, not to alarm immediately but to continue observing over a plurality of unit time windows, then r > 1. In this embodiment, the values of r and s are set according to actual needs, which avoids false alarms in scenarios with large fluctuations of network data, while an alarm is generated when abnormal behavior occurs multiple times in consecutive unit time windows to be detected, that is, when the probability that the target subject exhibits abnormal behavior is high.
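A toy sketch of the r-of-s alarm rule described above; representing the per-window detection results as a list of boolean flags is an assumption made for the example.

```python
def should_alarm(abnormal_flags, r, s):
    """Generate an alarm when at least r of the last s consecutive unit time
    windows to be detected are abnormal (r <= s). r = 1 alarms on any single
    abnormal window; r > 1 tolerates normal data fluctuation."""
    recent = abnormal_flags[-s:]
    return sum(recent) >= r

# e.g. should_alarm([False, True, False, True, True], r=3, s=5) -> True
```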
Optionally, in an embodiment, the behavior data includes a behavior peer, a behavior attribute, and a behavior time, the behavior attribute is an attribute of a network behavior generated between the target subject and the behavior peer, and the step of mapping the behavior data set into the image according to a preset mapping rule includes: for each piece of behavior data, mapping a behavior opposite end into a position coordinate of a point in a preset template image, mapping a behavior attribute into a morphological attribute of the point, and displaying the point of the morphological attribute at the position coordinate to obtain an image point corresponding to the behavior data; and arranging a connecting line between the association points, wherein the image points corresponding to the two pieces of behavior data with the behavior time satisfying the preset association relationship are the association points.
Specifically, the behavior peer is the receiving end of a network behavior initiated by the target subject, and the behavior peer contained in the behavior data may specifically be information that uniquely identifies the receiving end, such as an identifier, a serial number or an address of the receiving end; the behavior attribute is an attribute of the network behavior, such as the connection mode or the data transmission protocol; and the behavior time is the time at which the network behavior occurred. Mapping the behavior data group into an image according to a preset mapping rule means imaging the network behavior. The mapping rule includes a mapping relation established between the behavior peer (in the present application, the information identifying the behavior peer) and the position coordinates of points in a preset template image; through this mapping relation, the behavior peer of each piece of behavior data is mapped to the position coordinates of a point in the preset template image. It should be noted that the position coordinates here may be data marking a point position in a coordinate system, for example the horizontal and vertical coordinates in a rectangular coordinate system or the polar angle and polar radius in a polar coordinate system; a point position may also be identified by a serial number or the like, for example the serial number of a pixel in the image. Any data that can identify a point position in the image belongs to the position coordinates of the present application, the purpose being to represent different behavior peers by points at different positions in the preset template image.
The mapping rule further includes a mapping relation established between the behavior attribute and the morphological attributes of points in the preset template image; through this mapping relation, the behavior attribute of each piece of behavior data is mapped to a morphological attribute of a point in the preset template image. It should be noted that the behavior attribute may be a single attribute of the network behavior, such as the connection mode of a network connection, or a combination of multiple attributes, such as the connection mode together with the data transmission protocol. The morphological attribute of a point may be the size, shape and/or color of the point, or other parameters of the point; any characteristic that can distinguish points in the image belongs to the morphological attributes of points in the present application, the purpose being to represent different behavior attributes by different morphological attributes of points in the preset template image.
The behavior data group includes a plurality of pieces of behavior data, each piece corresponding to one network behavior, so the behavior data group represents the data of a plurality of network behaviors initiated by the target subject within a period of time. An association relationship based on behavior time can be preset according to actual needs, and in the behavior data group, the image points corresponding to two pieces of behavior data whose behavior times satisfy the preset association relationship are defined as mutually associated points; for example, the image points corresponding to two pieces of behavior data whose interval satisfies a preset duration are mutually associated points. The association relationship is imaged by displaying a connecting line between the associated points, that is, network behaviors satisfying the association relationship are reflected by image information. For the behavior data group, each piece of behavior data is mapped onto the preset template image and, at the same time, connecting lines are set between associated points, so that the association relationships of the network behaviors can be embodied in the image.
In the network behavior detection method provided in this embodiment, the behavior data includes the behavior peer of the network behavior and the behavior attribute of the network behavior. The behavior peer is mapped to the position coordinates of a point in the preset template image, the behavior attribute is mapped to a morphological attribute of the point, and a point with that morphological attribute is displayed at the position coordinates to obtain the image point corresponding to the behavior data. The behavior data further includes the behavior time, and the association relationship between behavior times is embodied by the connecting lines set between associated points, so that the imaging of the network behavior is achieved.
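The following Pillow-based sketch illustrates this imaging step under simplifying assumptions: behavior records are dictionaries with 'peer', 'attribute' and 'time' keys, peer_to_xy is the first mapping relation from peer identifiers to point coordinates, and attr_to_rgb maps a behavior attribute to a point colour; the record format, point radius and line colour are all illustrative choices, not details specified by the patent.

```python
from PIL import Image, ImageDraw

def render_behavior_image(behavior_data, peer_to_xy, attr_to_rgb, size=(256, 256)):
    """Map a behavior data group to a behavior image: peers become point
    positions, attributes become point colours, and records adjacent in
    behavior time are joined by a connecting line."""
    img = Image.new("RGB", size, "white")                 # preset template image
    draw = ImageDraw.Draw(img)
    records = sorted(behavior_data, key=lambda rec: rec["time"])

    # connecting lines between associated (time-adjacent) image points
    for prev, cur in zip(records, records[1:]):
        draw.line([peer_to_xy[prev["peer"]], peer_to_xy[cur["peer"]]], fill=(0, 0, 0))

    # image points: position from the behavior peer, colour from the attribute
    for rec in records:
        x, y = peer_to_xy[rec["peer"]]
        draw.ellipse([x - 3, y - 3, x + 3, y + 3], fill=attr_to_rgb(rec["attribute"]))
    return img
```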
Optionally, in an embodiment, the association point is an image point corresponding to two behavior data with adjacent behavior time.
With the network behavior detection method provided in this embodiment, associated points are defined as the image points corresponding to two pieces of behavior data that are adjacent in behavior time, so the connecting line between associated points can indicate which network behaviors are adjacent. On the one hand, this increases the amount of network behavior information carried by the image; on the other hand, some security threats have specific characteristics between adjacent network behaviors, so detecting network behavior based on images that represent adjacency enables accurate detection of this type of security threat.
Optionally, in an embodiment, the connection line is a vector, and a direction of the vector represents a sequence of behavior time corresponding to the association point.
With the network behavior detection method provided in this embodiment, for the image points corresponding to two pieces of behavior data adjacent in behavior time, not only is a connecting line displayed between the two image points (that is, the associated points), but the connecting line is also set as a line with a direction, that is, a vector, and the order of the behavior times is represented by the direction of the vector. Thus, when the image represents a plurality of network behaviors, the path of the network behaviors initiated by the initiating end, that is, the network behavior chain and the sequential relationship of the series of behaviors, can be represented by the image, so that the context of normal behaviors can be learned and predicted based on images of normal behaviors and used as a reference for comparison.
Optionally, in an embodiment, the step of mapping the behavior data set to an image according to a preset mapping rule further includes: counting the number of the same associated points corresponding to the behavior data group; mapping the number to a morphological attribute of the line, the step of setting the connection line between the associated points comprising: and connecting lines are arranged between the associated points according to the shape attribute of the lines.
Specifically, the two pieces of behavior data represented by a pair of associated points are taken as one behavior data unit. When the behavior data group corresponds to the same associated points more than once, that is, when the behavior data group contains identical behavior data units, the network behavior detection method provided in this embodiment counts, for a given pair of associated points, the number of behavior data units represented by that pair in the behavior data group; if the behavior data group contains N such behavior data units, the number for that pair of associated points is N. A mapping relation between the number and the morphological attribute of the line is set in advance; after the number for a pair of associated points is determined, it can be mapped to the morphological attribute of the line based on this mapping relation, and the connecting line displayed between the associated points is then set according to the mapped morphological attribute. The morphological attribute of a line may be the shape, thickness and/or color of the line, or other parameters of the line; any characteristic that can distinguish lines in the image belongs to the morphological attributes of lines in the present application, the purpose being to represent the number of a pair of associated points by different morphological attributes of lines in the preset template image.
With the network behavior detection method provided in this embodiment, the number of a pair of associated points is mapped to the morphological attribute of the connecting line between them, so the connecting line can represent how many times the associated network behaviors occurred while also indicating which network behaviors satisfy the association relationship, which increases the amount of network behavior information carried by the image.
Optionally, in an embodiment, the step of mapping the behavior opposite end to the position coordinates of a point in the preset template image includes: presetting position coordinates of a plurality of points in the preset template image; allocating end identifiers to the preset position coordinates by sequential allocation, random allocation or farthest-distance allocation, and establishing a one-to-one first mapping relation between the position coordinates and the end identifiers; and determining the position coordinates corresponding to the behavior opposite end according to the end identifier of the behavior opposite end and the first mapping relation.
Specifically, when the position coordinates of the plurality of points are preset in the preset template image, the points may be set randomly or according to a certain rule. When the mapping relation between points in the preset template image and behavior peers is established, the position coordinates of the points may be allocated to the behavior peers sequentially: the behavior peers are sorted, and then the points are allocated to them one by one in a certain direction. Alternatively, each point may be randomly assigned to a behavior peer. Or, after a point is assigned to a behavior peer, the point farthest from it is calculated and assigned to the next behavior peer, and so on, until every behavior peer corresponds to a point. This forms the one-to-one first mapping relation between position coordinates and behavior peers, and when a given behavior peer is to be mapped to the position coordinates of a point in the preset template image, the position coordinates corresponding to that behavior peer are looked up in the first mapping relation.
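A sketch of the three allocation strategies named above (sequential, random, farthest distance), assuming the preset point coordinates are given as (x, y) tuples; the function and parameter names are hypothetical and the code is illustrative only.

```python
import random

def allocate_end_identifiers(coords, end_ids, strategy="sequential"):
    """Build the first mapping relation from behavior-peer end identifiers to
    preset point coordinates using one of three allocation strategies."""
    if strategy == "sequential":
        chosen = coords[:len(end_ids)]
    elif strategy == "random":
        chosen = random.sample(coords, len(end_ids))
    elif strategy == "farthest":
        chosen, remaining = [coords[0]], list(coords[1:])
        while len(chosen) < len(end_ids):
            last = chosen[-1]                  # point assigned to the previous peer
            far = max(remaining, key=lambda p: (p[0] - last[0]) ** 2
                                               + (p[1] - last[1]) ** 2)
            chosen.append(far)                 # farthest point goes to the next peer
            remaining.remove(far)
    else:
        raise ValueError(strategy)
    return dict(zip(end_ids, chosen))          # end identifier -> position coordinates
```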
Optionally, in an embodiment, the preset template image is a picture with a plurality of pixels, a point includes a plurality of pixels, the position coordinates are determined according to the position of the pixel, and the morphological attribute of the point is determined according to the RGB values of the pixel.
Specifically, the position coordinates of a point may be the position coordinates of one pixel included in the point, or the serial number of one pixel, or the like. The form attribute of the point is the color of the point, which is specifically determined according to the RGB values of its pixels. With the network behavior detection method provided by this embodiment, both the position coordinates and the form attribute of a point can be determined from the pixels of the picture.
Optionally, in an embodiment, the RGB values of the pixels of the same point are the same, so that the color of the whole point is uniform. This is beneficial to image recognition and image processing, and therefore to the recognition and processing of network behaviors; in addition, when the mapping relationship between the behavior attribute and the form attribute of the point is set, the form attribute of the point is simplified to a single RGB value, which keeps the mapping relationship simple.
Optionally, in an embodiment, different points include the same number of pixels, that is, different points have the same size, which is beneficial to image recognition and image processing, and therefore to the recognition and processing of network behaviors.
Optionally, in an embodiment, the points are uniformly arranged on the picture, that is, the distances between adjacent points are equal.
Optionally, in an embodiment, the preset template image is a picture with n × n pixels, where n = (2k − 1) × m, a dot includes m pixels, adjacent dots are spaced by m pixels, the RGB values of the pixels of the same dot are the same, and the morphological attribute of a dot is the RGB value of its pixels. The step of mapping the behavior attribute to the morphological attribute of the dot includes: determining the number i of the behavior attribute; determining the RGB value range of the pixel, wherein the R value range of the pixel is 0 to r, the G value range is 0 to g, and the B value range is 0 to b; calculating the hexadecimal number x corresponding to the number i by the following formula:

x = HEX(⌊(r × g × b − 1) / c⌋ × i)

where ⌊ ⌋ represents rounding down, c represents the number of categories of behavior attributes, and HEX() represents the conversion of a decimal number into a hexadecimal number; and converting every two digits of the hexadecimal number x into a decimal number to obtain the RGB values of the pixels of the dot.
Specifically, the behavior attributes of different types in all the behavior data are numbered, and the RGB value range of the pixel is determined, where the RGB value range is the range of RGB values that can be displayed on the preset template image. Then r × g × b is the number of colors that can be displayed on the preset template image; after removing the background color of the preset template image, r × g × b − 1 is the number of colors available for the displayed dots, namely the number of categories of behavior attributes that can be represented on the preset template image, while c is the number of categories of the actual behavior attributes. The hexadecimal number corresponding to the behavior attribute with number i is obtained by the formula x = HEX(⌊(r × g × b − 1) / c⌋ × i), and every two digits of this hexadecimal number are converted into a decimal number, thus obtaining the RGB values of the pixels of the dot and realizing the mapping of the behavior attribute to the form attribute of the dot.
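A minimal Python sketch of this attribute-to-color mapping is given below, assuming each channel takes 256 values (so r × g × b − 1 = 16777215); the function name is illustrative only.

```python
def attribute_to_rgb(i, c, r=256, g=256, b=256):
    """Map behavior-attribute number i (1..c) to an RGB triple.

    c is the number of behavior-attribute categories; r*g*b - 1 is the
    number of dot colors available after reserving the background color.
    """
    step = (r * g * b - 1) // c          # floor((r*g*b - 1) / c)
    x = format(step * i, "06x")          # hexadecimal number, zero-padded to 6 digits
    # every two hexadecimal digits become one decimal channel value
    return tuple(int(x[k:k + 2], 16) for k in (0, 2, 4))

# example: 50 attribute categories, attribute number 3 -> (15, 92, 40)
print(attribute_to_rgb(3, 50))
```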
Optionally, in an embodiment, the morphological property of the line is an RGB value of a pixel of the line, and the step of mapping the number to the morphological property of the line includes: converting the number from a decimal number to a hexadecimal number; each two digits of the hexadecimal number are converted into decimal numbers to obtain the RGB values of the pixels of the line.
Specifically, when the number of association points (that is, the number of behavior data units) is mapped to the form attribute of the line, the number is mapped to the RGB value of the line, that is, the line color, so the form attribute of the dot and the form attribute of the line are both RGB values of pixels.
Example two
A second embodiment of the present invention provides a preferred network behavior detection method; technical features and technical effects that are the same as those in the first embodiment can be referred to mutually. In this embodiment, the network behavior detection method is implemented by an abnormal behavior detection system. Specifically, fig. 2 is a work flow chart of the abnormal behavior detection system according to the second embodiment of the present invention. As shown in fig. 2, the abnormal behavior detection system of this embodiment includes a data processing module, a behavior representation module, a feature extractor training module, a feature extraction module, a normal behavior training module, a normal behavior prediction module, and an abnormal behavior prediction module.
The input of the data processing module is raw network data, which may be raw network traffic or log data, and all data including network behavior from a source (initiator) to a destination (receiver) are within the technical scope of the present invention. The source is the initiator of the network behavior, for example: source IP, server account, employee number, mailbox, etc. The destination is the recipient of the action, for example: destination IP, host name, database, service system, mailbox, etc. Network behavior includes, but is not limited to, actions involving data interactions such as connecting, logging on, querying, writing, mailing, and the like.
When the data processing module processes the original network data, it executes the step of acquiring behavior data in the network behavior detection method; the main steps include data extraction, statistics, and standardization. First, the behavior sequence of each source is extracted from the original network data. The behavior sequence is also called a behavior data group, and the behavior data in the behavior data group may include: date, time, source, destination, and possibly additional behavior attributes such as login, protocol, port, and the like. There may be one behavior attribute or several. In this embodiment, taking the network behavior of a TCP connection as an example, the source is a source IP, the destination is a destination IP, and the behavior attribute is the connection protocol; different destinations and behavior attributes may be defined according to the requirements of an actual scene. Taking a certain source (e.g., client IP 10.70.1.11) as an example, the extracted data and the generated sequence are shown in table 1:
table 1 data extraction examples
[Table 1 appears as an image in the original publication; it lists, for the example source 10.70.1.11, the extracted records with their date, time, destination IP and protocol, and the behavior sequence generated from them.]
With regard to table 1 above, it means that the client with IP 10.70.1.11 accesses destination IP 10.11.11.5 in the HTTP manner at 2019-01-01 00:08:00, accesses destination IP 10.11.11.6 in the SSH manner at 2019-01-01 00:09:30, and so on.
Subsequently, the data in table 1 are counted and merged. For every two adjacent network behaviors, the destination of the earlier behavior is marked as the previous destination and the destination of the later behavior as the next destination, according to the time order, and their respective behavior attributes are recorded. The number of occurrences of the quadruple (previous destination, next destination, previous attribute, next attribute) within the observation time (e.g., 1 day) is also recorded. Note that, for convenience of expression and practical effect, the minimum unit of the observation time range is taken as 1 day hereinafter, but in practical use this value may be greater or less than 1 day.
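The pairing and counting step can be sketched as follows; the record layout mirrors Table 1, and the field order is an assumption made for illustration.

```python
from collections import Counter

# extracted records of one source within one observation day, ordered by time:
# (date, time, destination, behavior attribute), as in Table 1
records = [
    ("2019-01-01", "00:08:00", "10.11.11.5", "HTTP"),
    ("2019-01-01", "00:09:30", "10.11.11.6", "SSH"),
]

def count_quadruples(records):
    """Count (previous destination, next destination, previous attribute,
    next attribute) quadruples over every two adjacent behaviors."""
    counts = Counter()
    for prev, nxt in zip(records, records[1:]):
        counts[(prev[2], nxt[2], prev[3], nxt[3])] += 1
    return counts

print(count_quadruples(records))
# Counter({('10.11.11.5', '10.11.11.6', 'HTTP', 'SSH'): 1})
```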
The output of the data processing module is processed data, and the format is shown in table 2:
table 2 data statistics and processed data samples
[Table 2 appears as an image in the original publication; each row gives a quadruple (previous destination, next destination, previous attribute, next attribute) and the number of times it occurs within the observation time.]
The input of the behavior representation module is the output result of the data processing module, as in the example shown in table 2 above. When the behavior representation module performs behavior representation, the method specifically comprises the following steps:
A picture of n × n pixels (n = (2k − 1) × 8, k being a positive integer) is defined as the preset template image, and its background color is RGB (255, 255, 255). Each destination is represented by a dot with a diameter of 8 pixels, so this method can represent k × k destinations, as shown in fig. 3. The correspondence between dots and destinations can be set in different ways, such as sequential, random or farthest-distance allocation, or other allocation obtained by mathematical calculation; sequential allocation includes allocation in left-to-right or top-to-bottom order, in serpentine order, in zigzag order, or the like. As shown in fig. 3, the behavior attribute is represented by the color of a dot, because the available R, G, B values can represent at most 256 × 256 × 256 − 1 = 16777215 different attributes, excluding the picture background color RGB (255, 255, 255). The behavior attributes are numbered and mapped to dot colors by taking

x = HEX(⌊16777215 / c⌋ × i)

where ⌊ ⌋ represents rounding down, c represents the number of categories of the behavior attribute, i represents the number of the current behavior attribute, and HEX() represents the conversion of a decimal number into a hexadecimal number. Every two digits of the hexadecimal number x are converted into a decimal number, giving the R, G and B values respectively. Examples of behavior attribute numbering and color assignment are shown in table 3 (assuming that the number of attribute categories is 50, that is, the samples contain 50 different protocols in total).
Table 3 numbering attributes and assigning color examples
[Table 3 appears as an image in the original publication; it numbers the 50 behavior attributes (protocols) and lists the dot color assigned to each by the formula above.]
The jump relationship (i.e. the association relationship in the above) between the two destinations is represented by a connecting line between two dots (i.e. association points), and the RGB value of the line represents the number of occurrences (i.e. the number of association points). The decimal number of occurrences is converted into a hexadecimal number, and each two digits of the hexadecimal number are converted into decimal values which are R, G, B respectively. The mapping relationship between the number of times and the color is shown in table 4. If the number of times is greater than 16777214, all the times are represented by RGB (255, 255, 254).
TABLE 4 mapping relationship between number of occurrences and line color
Number of occurrences    Hexadecimal    Line color (RGB)
1 1 (0,0,1)
2 2 (0,0,2)
3 3 (0,0,3)
……
10000 2710 (0,39,16)
10001 2711 (0,39,17)
10002 2712 (0,39,18)
……
16777214 FFFFFE (255,255,254)
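The count-to-color mapping of Table 4 can be reproduced with a short Python sketch (illustrative only; the function name is ours):

```python
def count_to_rgb(count, cap=16777214):
    """Map an occurrence count to the RGB value of the connecting line."""
    count = min(count, cap)              # counts above the cap share one color
    x = format(count, "06x")             # decimal -> 6-digit hexadecimal
    return tuple(int(x[k:k + 2], 16) for k in (0, 2, 4))

for n in (1, 2, 3, 10000, 10001, 10002, 16777214):
    print(n, count_to_rgb(n))            # e.g. 10000 -> (0, 39, 16), as in Table 4
```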
According to the above method steps, the behavior of each source (e.g. IP or human) at each observation time range (e.g. 1 day) can be represented by a picture. Still taking the behavior data of table 1 and table 2 as an example, the converted effect graph is shown in fig. 4, and it should be noted that the IP information such as 10.11.11.5 and the grid lines in fig. 4 are only for convenience of description, and the IP information and the grid lines themselves are not included in the actual graph.
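A minimal rendering sketch is given below, using the Pillow library as a stand-in drawing tool (the patent does not prescribe any particular library). It assumes k = 3 and m = 8, so the template is 40 × 40 pixels, and reuses the dot and line colors computed in the earlier sketches; the vector (direction) form of the connecting line is omitted for brevity.

```python
from PIL import Image, ImageDraw

K, M = 3, 8                        # k x k destination dots, dot diameter m pixels
N = (2 * K - 1) * M                # the template image is n x n pixels (here 40 x 40)

def dot_box(idx):
    """Bounding box of the idx-th dot (row-major), adjacent dots spaced m pixels apart."""
    row, col = divmod(idx, K)
    x0, y0 = 2 * col * M, 2 * row * M
    return (x0, y0, x0 + M - 1, y0 + M - 1)

def center(box):
    x0, y0, x1, y1 = box
    return ((x0 + x1) // 2, (y0 + y1) // 2)

def render_behavior_image(dots, lines):
    """dots: {dot index: RGB}; lines: {(dot index, dot index): RGB}."""
    img = Image.new("RGB", (N, N), (255, 255, 255))     # white background
    draw = ImageDraw.Draw(img)
    for (a, b), color in lines.items():                 # connecting lines first
        draw.line([center(dot_box(a)), center(dot_box(b))], fill=color)
    for idx, color in dots.items():                     # then the destination dots
        draw.ellipse(dot_box(idx), fill=color)
    return img

# two destinations whose protocol colors come from attribute_to_rgb(3, 50) and
# attribute_to_rgb(6, 50); the line color (0, 0, 1) encodes one occurrence
img = render_behavior_image({0: (15, 92, 40), 4: (30, 184, 80)}, {(0, 4): (0, 0, 1)})
img.save("behavior_image_example.png")
```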
The behavior of each source (e.g. IP or person) at multiple observation time ranges (e.g. multiple days) can be represented as a set of multiple pictures, and the specific effect graph is shown in fig. 5.
After the behavior image data are obtained by the above method, as shown in fig. 2, the behavior image data are taken as training behavior images, and the initial self-encoder is trained by the feature extractor training module; optionally, the initial self-encoder is a CNN self-encoder. The input image and the prediction target of the CNN self-encoder are the same picture. The intermediate convolutional layers serve as the encoding layer, which is equivalent to performing automatic compression encoding; the compression result is then subjected to deconvolution or upsampling (equivalent to an inverse encoding layer), and the output layer finally restores a result of the same size as the input picture of the input layer. After the CNN self-encoder is trained, the target self-encoder is obtained. Behavior image data are then taken as verification behavior images for testing: if the output-layer result is very similar to the input-layer image, the target self-encoder is considered effective, its intermediate compression encoding layer can be used as a compressed encoded representation of the input image, and a feature extractor capable of extracting feature vectors is obtained.
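The description above fixes the overall structure (convolutional encoding layer, deconvolution or upsampling, output restored to the input size) but no concrete layer sizes; the following PyTorch sketch is one possible instantiation, with all layer sizes and the 40 × 40 input being illustrative assumptions.

```python
import torch
from torch import nn

L = 64  # length of the extracted feature vector (the compressed encoding)

class BehaviorAutoencoder(nn.Module):
    """A CNN self-encoder: the same picture is both input and prediction target."""
    def __init__(self, img_size=40, latent_dim=L):
        super().__init__()
        self.encoder_cnn = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),   # 40 -> 20
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),  # 20 -> 10
        )
        side = img_size // 4
        self.to_latent = nn.Linear(32 * side * side, latent_dim)   # encoding-layer output
        self.from_latent = nn.Linear(latent_dim, 32 * side * side)
        self.decoder_cnn = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 3, stride=2, padding=1, output_padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 3, 3, stride=2, padding=1, output_padding=1), nn.Sigmoid(),
        )
        self.side = side

    def encode(self, x):
        h = self.encoder_cnn(x).flatten(1)
        return self.to_latent(h)                       # one L-dimensional vector per image

    def forward(self, x):
        z = self.encode(x)
        h = self.from_latent(z).view(-1, 32, self.side, self.side)
        return self.decoder_cnn(h)

# training loop sketch: the input image is also the reconstruction target
model = BehaviorAutoencoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
images = torch.rand(8, 3, 40, 40)      # stand-in for a batch of behavior images
for _ in range(2):
    recon = model(images)
    loss = nn.functional.mse_loss(recon, images)
    opt.zero_grad()
    loss.backward()
    opt.step()
```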
As described above, after the target self-encoder is trained using a large amount of behavior image data, a behavior image is input to the target self-encoder; after the computation of the input layer and the encoding layer, the compressed encoding result output by the encoding layer is taken out as the extracted feature vector. The compressed encoding result is defined as a column vector of length L (an L × 1 matrix), that is, the feature vector extracted by the target self-encoder is an L × 1 column vector.
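Feature extraction then amounts to running only the input and encoding layers of the trained target self-encoder and reading out the encoding-layer output; a minimal stand-in illustration (the layer sizes are assumptions matching the sketch above):

```python
import torch
from torch import nn

L = 64
# stand-in for the input + encoding layers of a trained target self-encoder
encoder = nn.Sequential(
    nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
    nn.Flatten(),
    nn.Linear(32 * 10 * 10, L),
)

behavior_image = torch.rand(1, 3, 40, 40)      # one behavior image as a tensor
with torch.no_grad():
    feature_vector = encoder(behavior_image)   # shape (1, L): the L x 1 column vector
print(feature_vector.shape)
```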
Before the network behavior of the target subject needs to be detected, the network data of the target subject are obtained; after the behavior image data are obtained through the data processing module and the behavior representation module, the feature vector corresponding to each behavior image is obtained through the target self-encoder, and a behavior prediction model is trained. Specifically, the column vectors of length L within a window are spliced in a sliding-window manner, where the size of the sliding window is T. Each time, a matrix of size L × T is used to predict the (T+1)-th L × 1 column vector. As shown in fig. 6, the feature vectors of days 1 to T (in this embodiment, the duration of a unit time window is 1 day) are concatenated with the goal of predicting the feature vector of day T+1, forming the 1st training sample; the feature vectors of days 2 to T+1 are concatenated with the goal of predicting the feature vector of day T+2 (the filled vector in fig. 6), forming the 2nd training sample; and so on, accumulating enough samples for training the normal behavior model. Optionally, the model is trained using deep learning algorithms such as RNN/LSTM; after sufficient training, the model has the ability to memorize T days of normal behavior and predict the behavior of day T+1. The normal behavior training module outputs the model with this ability, i.e., the normal behavior model, which is the behavior prediction model.
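One possible realization of the sliding-window splicing and the LSTM predictor is sketched below; the window size T, the hidden size, and the use of a (T, L) sequence in place of the L × T matrix are illustrative choices, not requirements of the patent text.

```python
import torch
from torch import nn

L, T = 64, 7

def make_training_samples(daily_vectors):
    """daily_vectors: tensor of shape (num_days, L), ordered by day.
    Returns inputs of shape (num_samples, T, L) and targets of shape (num_samples, L)."""
    xs, ys = [], []
    for start in range(daily_vectors.shape[0] - T):
        xs.append(daily_vectors[start:start + T])      # T spliced feature vectors
        ys.append(daily_vectors[start + T])            # the (T+1)-th vector to predict
    return torch.stack(xs), torch.stack(ys)

class NormalBehaviorModel(nn.Module):
    """LSTM that memorizes T days of behavior and predicts day T+1."""
    def __init__(self, dim=L, hidden=128):
        super().__init__()
        self.lstm = nn.LSTM(dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, dim)

    def forward(self, x):                  # x: (batch, T, L)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])       # predicted feature vector for day T+1

daily = torch.rand(60, L)                  # stand-in for 60 days of extracted features
inputs, targets = make_training_samples(daily)
model = NormalBehaviorModel()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(2):
    loss = nn.functional.mse_loss(model(inputs), targets)
    opt.zero_grad()
    loss.backward()
    opt.step()
```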
After the training of the normal behavior model is finished, it can be applied to the detection of actual network behavior; the time of the data to be predicted is usually later than that of the data used for training. As shown in fig. 7, in the normal behavior prediction process, the feature corresponding to the real behavior of each day is denoted X_i, and the day-(T+1) result predicted from the X_i of the previous T days is denoted Y. For example, (X_{-T+1}, X_{-T+2}, …, X_{-1}, X_0) is used to predict Y_1, (X_{-T+2}, X_{-T+3}, …, X_0, X_1) is used to predict Y_2, and so on; each prediction result Y_i is output by the normal behavior prediction module.
In the abnormal behavior prediction module, whether abnormal behavior occurs on a given day is detected by comparing the behavior prediction results (Y_1, Y_2, …, Y_m) with the real data (X_1, X_2, …, X_m), where m is the tolerable number of abnormal observations. If an alarm should be raised as soon as an abnormality is detected on the current day, then m = 1, and whether to alarm is determined according to the output abnormality degree. If an abnormality detected on the current day should not be alarmed immediately (for example, because it may merely be a fluctuation of normal data, or because subsequent behaviors need to be examined before a decision owing to service requirements), then m > 1, and whether to alarm is determined according to the abnormality degrees calculated over the plurality of days.
The degree of behavioral abnormality is calculated as follows: the degree of behavioral abnormality on day i is obtained by calculating the root mean square error of X_i and Y_i:

S_i = √( (1/L) Σ_{j=1..L} (X_ij − Y_ij)² )

where X_i = (X_i1, X_i2, …, X_ij, …, X_iL) and Y_i = (Y_i1, Y_i2, …, Y_ij, …, Y_iL).
When m = 1, an alarm is raised if S_i is greater than the preset threshold. When m > 1, an alarm is raised if, within the m days, there are r days whose S_i is greater than the preset threshold (r ≤ m).
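The abnormality degree and the alarm rule can be written out directly from the formula above (a sketch; the threshold, m and r are configuration values chosen by the operator):

```python
import math

def abnormality_degree(x, y):
    """Root mean square error between the real feature vector x and the prediction y."""
    assert len(x) == len(y)
    return math.sqrt(sum((xj - yj) ** 2 for xj, yj in zip(x, y)) / len(x))

def should_alarm(real_days, predicted_days, threshold, r=1):
    """real_days / predicted_days: lists of m feature vectors (X_1..X_m, Y_1..Y_m).
    Alarm if at least r of the m daily abnormality degrees exceed the threshold."""
    scores = [abnormality_degree(x, y) for x, y in zip(real_days, predicted_days)]
    return sum(s > threshold for s in scores) >= r

# m = 1: alarm as soon as the current day's degree exceeds the threshold
print(should_alarm([[0.2, 0.9]], [[0.1, 0.1]], threshold=0.5, r=1))
```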
EXAMPLE III
Corresponding to the first embodiment, a third embodiment of the present invention provides a device for detecting a network behavior, and reference is made to the above for related technical features and technical effects, which are not described herein again. Fig. 8 is a block diagram of a network behavior detection apparatus according to a third embodiment of the present invention, and as shown in fig. 8, the apparatus includes: a first obtaining module 301, a second obtaining module 302, a mapping module 303, an extraction module 304, a prediction module 305, and a comparison module 306.
The first obtaining module 301 is configured to obtain behavior data of a target subject network behavior in a unit time window to be detected, so as to obtain a behavior data set to be detected; the second obtaining module 302 is configured to obtain behavior data of a network behavior of a target subject in a historical unit time window to obtain a historical behavior data set; the mapping module 303 is configured to map the behavior data set to be detected into a behavior image according to a preset mapping rule to obtain an image to be detected, and map the historical behavior data set into the behavior image according to the mapping rule to obtain a historical image; the extraction module 304 is configured to extract a feature vector of an image to be detected to obtain a feature vector to be detected, and extract a feature vector of a historical image to obtain a historical feature vector; the prediction module 305 is configured to input a historical feature vector group to a preset behavior prediction model to obtain a predicted feature vector, where the historical feature vector group is a historical feature vector corresponding to a plurality of continuous historical unit time windows; the comparison module 306 is configured to compare the to-be-detected feature vector with the predicted feature vector to determine whether the target subject network behavior in the unit time window to be detected is abnormal.
Optionally, in an embodiment, when the extracting module 304 extracts the feature vector of the behavior image, the specifically performed steps include: establishing an initial self-encoder, wherein the initial self-encoder comprises an input layer, an encoding layer, an anti-encoding layer and an output layer; acquiring a training behavior image, respectively taking the training behavior image as the input of an input layer and the output of an output layer of an initial self-encoder, and training the initial self-encoder to obtain a target self-encoder; acquiring a verification behavior image, and taking the verification behavior image as the input of an input layer of a target self-encoder to obtain the output of an output layer of the target self-encoder; comparing the verification behavior image with the output of the output layer of the target self-encoder to judge whether the target self-encoder meets the requirements or not; and when the target self-encoder meets the requirements, taking the behavior image as the input of the input layer of the target self-encoder, and acquiring the output of the coding layer of the target self-encoder to obtain the characteristic vector of the behavior image.
Optionally, in an embodiment, the apparatus for detecting network behavior further includes: a training module, configured to execute the following steps before the step of inputting the historical feature vectors corresponding to a plurality of continuous historical unit time windows into a preset behavior prediction model to obtain the predicted feature vector: acquiring network data of the target subject; extracting a behavior data group from the network data, wherein the behavior data group comprises behavior data of network behaviors of the target subject in a unit time window; mapping the behavior data group into a behavior image according to the mapping rule to obtain a training behavior image; extracting a feature vector of the training behavior image to obtain a training feature vector, wherein the training feature vector is an L × 1 matrix; sequencing a plurality of training feature vectors corresponding to the network data according to the time sequence to obtain a training feature vector queue; taking values from and splicing the training feature vector queue in a sliding-window manner to obtain a plurality of training samples, wherein a training sample comprises a spliced vector and a prediction vector, the spliced vector is an L × T matrix formed by the T training feature vectors in the sliding window, and the prediction vector is the training feature vector adjacent to and behind the sliding window; establishing an initial behavior prediction model, wherein the initial behavior prediction model comprises an input layer, an intermediate layer and an output layer; and taking the spliced vector in the training sample as the input of the input layer of the initial behavior prediction model, taking the prediction vector in the training sample as the output of the output layer of the initial behavior prediction model, and training the initial behavior prediction model to obtain the behavior prediction model.
Optionally, in an embodiment, when the comparison module 306 compares the to-be-detected feature vector with the predicted feature vector to determine whether the target subject network behavior in the to-be-detected unit time window is abnormal, the specific steps executed include: calculating the root mean square error of the feature vector to be detected and the predicted feature vector; judging whether the root mean square error is larger than a preset error threshold value; and when the root mean square error is larger than a preset error threshold, determining that the network behavior of the target subject in the unit time window to be detected is abnormal.
Optionally, in an embodiment, the apparatus for detecting network behavior further includes: an alarm module, configured to generate an alarm when the target subject network behavior is abnormal in r unit time windows to be detected among s continuous unit time windows to be detected, wherein r is less than or equal to s.
Optionally, in an embodiment, the behavior data includes a behavior peer, a behavior attribute, and a behavior time, the behavior attribute is an attribute of a network behavior generated between the target subject and the behavior peer, and when the mapping module 303 maps the behavior data set to an image according to a preset mapping rule, the specifically executed step includes: for each piece of behavior data, mapping a behavior opposite end into a position coordinate of a point in a preset template image, mapping a behavior attribute into a morphological attribute of the point, and displaying the point of the morphological attribute at the position coordinate to obtain an image point corresponding to the behavior data; and arranging a connecting line between the association points, wherein the image points corresponding to the two pieces of behavior data with the behavior time satisfying the preset association relationship are the association points.
Optionally, in an embodiment, the association point is an image point corresponding to two behavior data whose behaviors are adjacent in time; the connecting line is a vector, and the direction of the vector represents the sequence of the behavior time corresponding to the associated point.
Optionally, in an embodiment, when the mapping module 303 maps the behavior data set into an image according to a preset mapping rule, the specifically executed steps further include: counting the number of the same associated points corresponding to the behavior data group; and mapping the number to the form attribute of the line, wherein when the connecting line is arranged between the associated points, the connecting line is arranged between the associated points according to the form attribute of the line.
Optionally, in an embodiment, when the mapping module 303 maps the behavior opposite end to the position coordinates of a point in the preset template image, the specifically executed steps include: presetting position coordinates of a plurality of points in the preset template image; allocating an end identifier to the position coordinates of the plurality of preset points through sequential allocation, random allocation or farthest-distance allocation, and establishing a one-to-one first mapping relationship between the position coordinates and the end identifiers; and determining the position coordinates corresponding to the behavior opposite end according to the end identifier of the behavior opposite end and the first mapping relationship.
Optionally, in an embodiment, the preset template image is a picture with n × n pixels, where n = (2k − 1) × m, a dot includes m pixels, m pixels are spaced between adjacent dots, the RGB values of the pixels of the same dot are the same, and the morphological attribute of a dot is the RGB value of its pixels; when the mapping module 303 maps the behavior attribute to the morphological attribute of the dot, the specifically executed steps include: determining the number i of the behavior attribute; determining the RGB value range of the pixel, wherein the R value range of the pixel is 0 to r, the G value range is 0 to g, and the B value range is 0 to b; calculating the hexadecimal number x corresponding to the number i by the following formula:

x = HEX(⌊(r × g × b − 1) / c⌋ × i)

where ⌊ ⌋ represents rounding down, c represents the number of categories of behavior attributes, and HEX() represents the conversion of a decimal number into a hexadecimal number; and converting every two digits of the hexadecimal number x into a decimal number to obtain the RGB values of the pixels of the dot.
Optionally, in an embodiment, the shape attribute of the line is an RGB value of a pixel of the line, and the mapping module 303 specifically performs the following steps when mapping the number to the shape attribute of the line: converting the number from a decimal number to a hexadecimal number; each two digits of the hexadecimal number are converted into decimal numbers to obtain the RGB values of the pixels of the line.
Example four
The fourth embodiment further provides a computer device capable of executing programs, such as a smart phone, a tablet computer, a notebook computer, a desktop computer, a rack server, a blade server or a tower server (including an independent server or a server cluster composed of a plurality of servers). As shown in fig. 9, the computer device 01 of this embodiment at least includes, but is not limited to, a memory 011 and a processor 012 that are communicatively connected to each other via a system bus. It is noted that fig. 9 only shows the computer device 01 with the memory 011 and the processor 012, but it should be understood that not all of the shown components are required to be implemented, and more or fewer components may be implemented instead.
In this embodiment, the memory 011 (i.e., a readable storage medium) includes a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. In some embodiments, the storage 011 can be an internal storage unit of the computer device 01, such as a hard disk or a memory of the computer device 01. In other embodiments, the memory 011 can also be an external storage device of the computer device 01, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a flash Card (FlashCard), etc. provided on the computer device 01. Of course, the memory 011 can also include both internal and external memory units of the computer device 01. In this embodiment, the memory 011 is generally used for storing an operating system installed in the computer device 01 and various application software, such as program codes of the network behavior detection apparatus in the third embodiment. Further, the memory 011 can also be used to temporarily store various kinds of data that have been output or are to be output.
The processor 012 may be a Central Processing Unit (CPU), a controller, a microcontroller, a microprocessor, or other data Processing chip in some embodiments. The processor 012 is generally used to control the overall operation of the computer device 01. In this embodiment, the processor 012 is configured to run a program code stored in the memory 011 or process data, such as a method of detecting a network behavior.
EXAMPLE five
The fifth embodiment further provides a computer-readable storage medium, such as a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, a server, an App application store, etc., on which a computer program is stored, which when executed by a processor implements corresponding functions. The computer-readable storage medium of this embodiment is used for storing a network behavior detection apparatus, and when executed by a processor, implements the network behavior detection method of the first embodiment.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. A method for detecting network behavior, comprising:
acquiring behavior data of the network behavior of a target main body in a unit time window to be detected to obtain a behavior data group to be detected;
acquiring behavior data of the network behavior of the target subject in a historical unit time window to obtain a historical behavior data set;
mapping the behavior data group to be detected into a behavior image according to a preset mapping rule to obtain an image to be detected, and mapping the historical behavior data group into a behavior image according to the mapping rule to obtain a historical image;
extracting the characteristic vector of the image to be detected to obtain a characteristic vector to be detected, and extracting the characteristic vector of the historical image to obtain a historical characteristic vector;
inputting a historical feature vector group into a preset behavior prediction model to obtain a prediction feature vector, wherein the historical feature vector group is a plurality of continuous historical feature vectors corresponding to the historical unit time window;
and comparing the to-be-detected feature vector with the predicted feature vector to determine whether the target subject network behavior is abnormal in the to-be-detected unit time window.
2. The method for detecting network behavior according to claim 1, wherein the step of extracting the feature vector of the behavior image comprises:
establishing an initial self-encoder, wherein the initial self-encoder comprises an input layer, an encoding layer, an anti-encoding layer and an output layer;
acquiring a training behavior image, respectively taking the training behavior image as the input layer and the output layer of the initial self-encoder, and training the initial self-encoder to obtain a target self-encoder;
acquiring a verification behavior image, and taking the verification behavior image as the input of an input layer of the target self-encoder to obtain the output of an output layer of the target self-encoder;
comparing the verification behavior image with the output of an output layer of the target self-encoder to judge whether the target self-encoder meets the requirement;
and when the target self-encoder meets the requirements, taking the behavior image as the input of the input layer of the target self-encoder, and acquiring the output of the coding layer of the target self-encoder to obtain the characteristic vector of the behavior image.
3. The method according to claim 2, wherein before the step of inputting the historical feature vector set to the preset behavior prediction model to obtain the predicted feature vector, the method further comprises:
acquiring network data of the target subject;
extracting a behavior data group in the network data, wherein the behavior data group comprises the behavior data of the network behavior of the target subject in a unit time window;
mapping the behavior data set into a behavior image according to the mapping rule to obtain a training behavior image;
extracting a feature vector of the training behavior image to obtain a training feature vector, wherein the training feature vector is a matrix of L x 1;
sequencing a plurality of training feature vectors corresponding to the network data according to a time sequence to obtain a training feature vector queue;
adopting a sliding window mode to carry out value taking and splicing on the training feature vector queue to obtain a plurality of training samples, wherein the training samples comprise splicing vectors and prediction vectors, the splicing vectors comprise an L × T matrix formed by the T training feature vectors in the sliding window, and the prediction vectors comprise the training feature vectors adjacent to and behind the sliding window;
establishing an initial behavior prediction model, wherein the initial behavior prediction model comprises an input layer, an intermediate layer and an output layer;
and taking the splicing vector in the training sample as the input of an input layer of the initial behavior prediction model, taking the prediction vector in the training sample as the output of an output layer of the initial behavior prediction model, and training the initial behavior prediction model to obtain the behavior prediction model.
4. The method for detecting network behavior according to claim 1, wherein the step of comparing the to-be-detected feature vector with the predicted feature vector to determine whether the target subject network behavior is abnormal within the to-be-detected unit time window comprises:
calculating the root mean square error of the feature vector to be detected and the predicted feature vector;
judging whether the root mean square error is larger than a preset error threshold value;
when the root mean square error is larger than the preset error threshold, determining that the target subject network behavior is abnormal in the unit time window to be detected; and
and generating an alarm when the target subject network behavior is abnormal in r unit time windows to be detected among s continuous unit time windows to be detected, wherein r is less than or equal to s.
5. The method according to claim 1, wherein the behavior data includes a behavior peer, a behavior attribute, and a behavior time, the behavior attribute is an attribute of the network behavior generated between the target subject and the behavior peer, and the step of mapping the behavior data set to an image according to a preset mapping rule includes:
for each piece of behavior data, mapping the behavior opposite end to a position coordinate of a point in a preset template image, mapping the behavior attribute to a form attribute of the point, and displaying the point of the form attribute at the position coordinate to obtain an image point corresponding to the behavior data;
and setting a connecting line between the association points, wherein the image points corresponding to the two pieces of behavior data with the behavior time satisfying a preset association relationship are the association points.
6. The method for detecting network behavior according to claim 5,
the associated points are the image points corresponding to two adjacent behavior data of the behavior time, the connecting line is a vector, and the direction of the vector represents the sequence of the behavior time corresponding to the associated points;
the step of mapping the behavior opposite end to the position coordinate of a point in the preset template image comprises the following steps: presetting position coordinates of a plurality of points in the preset template image; allocating an end identifier to the position coordinates of the preset points through sequential allocation, random allocation or farthest distance allocation, and establishing a one-to-one corresponding first mapping relation between the position coordinates and the end identifier; determining the position coordinate corresponding to the behavior opposite end according to the end identifier of the behavior opposite end and the first mapping relation;
the step of mapping the behavior data set into an image according to a preset mapping rule further comprises: counting the number of the same associated points corresponding to the behavior data group; mapping the number to a morphological attribute of the line; the step of arranging connecting lines between the associated points comprises: setting the connecting lines between the associated points according to the form attributes of the lines;
the morphological property of the line is an RGB value of the pixels of the line, and the step of mapping the number to the morphological property of the line comprises: converting the number from a decimal number to a hexadecimal number; and converting every two digits of the hexadecimal number into a decimal number to obtain the RGB value of the pixels of the line.
7. The method according to claim 5, wherein the preset template image is a picture with n × n pixels, where n = (2k − 1) × m, the dots include m pixels, m pixels are spaced between adjacent dots, RGB values of the pixels of a same dot are the same, and the morphological attribute of a dot is an RGB value of a pixel of the dot, and the step of mapping the behavior attribute to the morphological attribute of a dot includes:
determining the number i of the behavior attribute;
determining the RGB value range of the pixel, wherein the R value range of the pixel is 0-R, the G value range of the pixel is 0-G, and the B value range of the pixel is 0-B;
calculating the hexadecimal number x corresponding to the number i by adopting the following formula:

x = HEX(⌊(r × g × b − 1) / c⌋ × i)

wherein ⌊ ⌋ represents rounding down, c represents the number of categories of the behavior attribute, and HEX() represents the conversion of a decimal number into a hexadecimal number; and
and converting every two digits of the hexadecimal number x into decimal numbers to obtain the RGB values of the pixels of the points.
8. An apparatus for detecting network behavior, comprising:
the first acquisition module is used for acquiring behavior data of the network behavior of the target main body in the unit time window to be detected to obtain a behavior data group to be detected;
the second acquisition module is used for acquiring the behavior data of the network behavior of the target subject in a historical unit time window to obtain a historical behavior data set;
the mapping module is used for mapping the behavior data group to be detected into a behavior image according to a preset mapping rule to obtain an image to be detected, and mapping the historical behavior data group into a behavior image according to the mapping rule to obtain a historical image;
the extraction module is used for extracting the characteristic vector of the image to be detected to obtain the characteristic vector to be detected, and extracting the characteristic vector of the historical image to obtain the historical characteristic vector;
the prediction module is used for inputting a historical feature vector group into a preset behavior prediction model to obtain a prediction feature vector, wherein the historical feature vector group is a plurality of continuous historical feature vectors corresponding to the historical unit time window;
and the comparison module is used for comparing the to-be-detected feature vector with the predicted feature vector to determine whether the target subject network behavior is abnormal in the to-be-detected unit time window.
9. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the steps of the method of any of claims 1 to 7 are implemented by the processor when executing the computer program.
10. A computer-readable storage medium having stored thereon a computer program, characterized in that: the computer program when executed by a processor implements the steps of the method of any one of claims 1 to 7.
CN201911418725.1A 2019-12-31 2019-12-31 Network behavior detection method and device, computer equipment and storage medium Active CN111131314B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911418725.1A CN111131314B (en) 2019-12-31 2019-12-31 Network behavior detection method and device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911418725.1A CN111131314B (en) 2019-12-31 2019-12-31 Network behavior detection method and device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111131314A true CN111131314A (en) 2020-05-08
CN111131314B CN111131314B (en) 2022-04-12

Family

ID=70507110

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911418725.1A Active CN111131314B (en) 2019-12-31 2019-12-31 Network behavior detection method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111131314B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10015182B1 (en) * 2016-06-30 2018-07-03 Symantec Corporation Systems and methods for protecting computing resources
CN108985361A (en) * 2018-07-02 2018-12-11 北京金睛云华科技有限公司 A kind of malicious traffic stream detection implementation method and device based on deep learning
CN108965055A (en) * 2018-07-17 2018-12-07 成都力鸣信息技术有限公司 A kind of network flow abnormal detecting method taking a method based on historical time
CN110138763A (en) * 2019-05-09 2019-08-16 中国科学院信息工程研究所 A kind of inside threat detection system and method based on dynamic web browsing behavior
CN110336838A (en) * 2019-08-07 2019-10-15 腾讯科技(武汉)有限公司 Account method for detecting abnormality, device, terminal and storage medium
CN110581856A (en) * 2019-09-17 2019-12-17 武汉思普崚技术有限公司 malicious code detection method and system

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112073396A (en) * 2020-08-27 2020-12-11 北京天融信网络安全技术有限公司 Method and device for detecting transverse movement attack behavior of intranet
TWI824261B (en) * 2020-09-29 2023-12-01 日商樂天集團股份有限公司 Abnormality determination system, abnormality determination method and information storage media
CN113568819A (en) * 2021-01-31 2021-10-29 腾讯科技(深圳)有限公司 Abnormal data detection method and device, computer readable medium and electronic equipment
CN113568819B (en) * 2021-01-31 2024-04-16 腾讯科技(深圳)有限公司 Abnormal data detection method, device, computer readable medium and electronic equipment
CN112839059A (en) * 2021-02-22 2021-05-25 北京六方云信息技术有限公司 WEB intrusion detection processing method and device and electronic equipment
CN113627754A (en) * 2021-07-27 2021-11-09 北京达佳互联信息技术有限公司 Operation control method and device for index detection, electronic equipment and storage medium
CN114612887A (en) * 2021-09-01 2022-06-10 腾讯科技(深圳)有限公司 Bill abnormity detection method, device, equipment and computer readable storage medium
CN115604016A (en) * 2022-10-31 2023-01-13 北京安帝科技有限公司(Cn) Industrial control abnormal behavior monitoring method and system of behavior characteristic chain model

Also Published As

Publication number Publication date
CN111131314B (en) 2022-04-12

Similar Documents

Publication Publication Date Title
CN111131314B (en) Network behavior detection method and device, computer equipment and storage medium
CN111107107B (en) Network behavior detection method and device, computer equipment and storage medium
EP3396625A1 (en) Image tampering detection method and system, electronic apparatus and storage medium
CN111783875A (en) Abnormal user detection method, device, equipment and medium based on cluster analysis
CN111475797A (en) Method, device and equipment for generating confrontation image and readable storage medium
CN109376689B (en) Crowd analysis method and device
CN111177469A (en) Face retrieval method and face retrieval device
CN113919513A (en) Method and device for aggregating security of federated learning and electronic equipment
CN113127864B (en) Feature code extraction method, device, computer equipment and readable storage medium
CN110955891A (en) File detection method, device and system and data processing method
CN112487929A (en) Image recognition method, device and equipment of children picture book and storage medium
CN111131322B (en) Network behavior detection method and device, computer equipment and storage medium
CN112995155B (en) Financial abnormal message identification method and device
CN111814776B (en) Image processing method, device, server and storage medium
CN111062448B (en) Equipment type recognition model training method, equipment type recognition method and device
CN117391585A (en) Warehouse information management method and system of industrial Internet
CN115934484B (en) Diffusion model data enhancement-based anomaly detection method, storage medium and apparatus
CN111091194A (en) Operation system identification method based on CAVWB _ KL algorithm
CN116232696A (en) Encryption traffic classification method based on deep neural network
CN115797291A (en) Circuit terminal identification method and device, computer equipment and storage medium
CN114998665A (en) Image category identification method and device, electronic equipment and storage medium
CN115019057A (en) Image feature extraction model determining method and device and image identification method and device
CN111797922B (en) Text image classification method and device
CN111159486B (en) Processing method and device of network data, computer equipment and storage medium
CN113127863A (en) Malicious code detection method and device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: Room 332, 3 / F, Building 102, 28 xinjiekouwei street, Xicheng District, Beijing 100088

Patentee after: QAX Technology Group Inc.

Patentee after: Qianxin Wangshen information technology (Beijing) Co.,Ltd.

Address before: Room 332, 3 / F, Building 102, 28 xinjiekouwei street, Xicheng District, Beijing 100088

Patentee before: QAX Technology Group Inc.

Patentee before: LEGENDSEC INFORMATION TECHNOLOGY (BEIJING) Inc.

CP01 Change in the name or title of a patent holder