WO2016208159A1 - Information processing apparatus, information processing system, information processing method, and storage medium - Google Patents
Information processing apparatus, information processing system, information processing method, and storage medium
- Publication number
- WO2016208159A1 (PCT/JP2016/002898)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- alert
- alert information
- dissimilarity
- information processing
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L63/00—Network architectures or network communication protocols for network security
- H04L63/14—Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic
- H04L63/1408—Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic by monitoring network traffic
- H04L63/1416—Event detection, e.g. attack signature detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/30—Monitoring
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/30—Monitoring
- G06F11/32—Monitoring with visual or acoustical indication of the functioning of the machine
- G06F11/324—Display of status information
- G06F11/327—Alarm or error message display
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/50—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
- G06F21/55—Detecting local intrusion or implementing counter-measures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
- G06N20/10—Machine learning using kernel methods, e.g. support vector machines [SVM]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
- G06N20/20—Ensemble learning
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L63/00—Network architectures or network communication protocols for network security
- H04L63/02—Network architectures or network communication protocols for network security for separating internal from external traffic, e.g. firewalls
- H04L63/0227—Filtering policies
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L63/00—Network architectures or network communication protocols for network security
- H04L63/14—Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic
- H04L63/1408—Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic by monitoring network traffic
- H04L63/1425—Traffic logging, e.g. anomaly detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2201/00—Indexing scheme relating to error detection, to error correction, and to monitoring
- G06F2201/81—Threshold
Definitions
- the present invention relates to information communication, and more particularly, to an information processing apparatus, an information processing system, an information processing method, and a storage medium that monitor communication.
- a device that monitors a network uses a firewall or an IDS (Intrusion Detection System) to monitor communication and, when necessary, to block communication in order to prevent attacks on, or unauthorized intrusions into, the monitored network. In recent years, targeted attacks aimed at stealing intellectual property or confidential information have increased, so the demand for network cybersecurity is growing. As a technique for realizing cybersecurity, monitoring a network and responding to incidents using an SOC (Security Operation Center) has become common.
- a device that monitors a network classifies alerts detected by a monitoring device that implements a function such as a firewall or IDS based on the degree of risk (see, for example, Non-Patent Document 1).
- An alert is a message that calls attention.
- the alert is a message notifying that an abnormality in the network has been detected.
- the alert that the monitoring device creates based on the detection includes an alert that does not need to be reported as an incident.
- the alert detected by the monitoring device includes an alert with low risk or an alert that is a false detection.
- an operator who performs network monitoring work classifies alerts and sets their risk level by referring to the following information: information included in the detected alerts, or external information. The device that monitors the network then reports a detected alert as an incident, as needed, based on the risk level set by the operator.
- information referred to in the above classification includes, for example, the detection rule that detected the alert, the IP (Internet Protocol) address and port number of the transmission source host, and the IP address and port number of the transmission destination host.
- information referred to in the above classification also includes, for example, the importance that the security vendor assigns to the detection rule used to detect the alert, and information contained in the communication (for example, the packet) that caused the detection.
- the system described in Non-Patent Document 1 classifies alerts. More specifically, it applies machine learning to alert information and to aggregated information derived from the alert information. In agent mode, the system processes alerts whose machine-learning classification result has high reliability without presenting them to the operator, thereby improving the efficiency of the operator's work.
- in the system described in Non-Patent Document 1, there is a risk of false negatives, in which an alert that is a positive detection is judged to be a false-detection alert. Therefore, the system randomly extracts some alerts from those judged, based on the machine-learning classification result, not to need presentation to the operator, and presents them anyway. Through this operation, the system suppresses false negatives in the classification.
- Tadeusz Pietraszek, "Using Adaptive Alert Classification to Reduce False Positives in Intrusion Detection", Recent Advances in Intrusion Detection (7th International Symposium, RAID 2004, Sophia Antipolis, France), 2004
- an alert that is highly similar to the alerts included in the learning samples used for machine learning is highly likely to have characteristics similar to those learning alerts. That is, a false negative is unlikely to occur for an alert with high similarity to the learning alerts.
- conversely, an alert with low similarity to the learning alerts is likely to have characteristics different from the learning alerts. Therefore, compared with an alert with high similarity to the learning alerts, an alert with low similarity has a higher possibility of producing a false negative.
- accordingly, the alert that should be presented to the operator is an alert with low similarity to the learning alerts.
- however, the alert extraction in the technique described in Non-Patent Document 1 is random, and random extraction has no relation to the machine learning.
- that is, the technique described in Non-Patent Document 1 extracts the alerts to present to the operator without considering their similarity to the learning alerts, and may therefore present inappropriate alerts to the operator.
- Non-Patent Document 1 thus has the problem that appropriate alerts cannot be presented to the operator.
- An object of the present invention is to provide an information processing apparatus, an information processing system, an information processing method, and a storage medium that solve the above-described problems and create alert information that can be appropriately presented to an operator.
- An information processing apparatus according to the present invention includes: a dissimilarity calculating means for calculating a dissimilarity, which is a distance between first alert information, including a first alert that has already been received and information related to the first alert, and second alert information, including a newly received second alert and information related to the second alert; and a machine learning means for generating, by applying machine learning to the first alert information, a classifier that determines a classification result, which is a determination result of the classification related to detection, and for applying the classifier to the second alert information to determine its classification result. When the determination result is false detection, indicating erroneous detection, and the dissimilarity is less than a predetermined threshold, the machine learning means sets the determination result as the classification result of the second alert information and sets the information indicating the presentation of the second alert information to information indicating that presentation is unnecessary. When the determination result is positive detection, indicating correct detection, or when the determination result is false detection and the dissimilarity is equal to or greater than the predetermined threshold, the machine learning means sets the information indicating the presentation of the second alert information to information indicating that presentation is necessary.
- An information processing system according to the present invention includes the above information processing apparatus, an alert display means for receiving alert information from the information processing apparatus and displaying the classification result, and an input means for receiving the operator's input for the classification result in the displayed alert information and transmitting it to the information processing apparatus.
- An information processing method according to the present invention calculates a dissimilarity, which is a distance between first alert information, including a first alert that has already been received and information related to the first alert, and second alert information, including a newly received second alert and information related to the second alert; generates, by applying machine learning to the first alert information, a classifier that determines a classification result; and applies the classifier to the second alert information to determine its classification result. When the determination result is false detection, indicating erroneous detection, and the dissimilarity is less than a predetermined threshold, the method sets the determination result as the classification result of the second alert information and sets the information indicating the presentation of the second alert information to information indicating that presentation is unnecessary. When the determination result is positive detection, indicating correct detection, or when the determination result is false detection and the dissimilarity is equal to or greater than the predetermined threshold, the method sets the information indicating the presentation of the second alert information to information indicating that presentation is necessary.
- A storage medium according to the present invention stores a program that causes a computer to execute: a process of calculating a dissimilarity, which is a distance between first alert information, including a first alert that has already been received and information related to the first alert, and second alert information, including a newly received second alert and information related to the second alert; a process of generating, by applying machine learning to the first alert information, a classifier that determines a classification result, which is a determination result of the classification related to detection; a process of applying the classifier to the second alert information to determine its classification result; a process of setting, when the determination result is false detection, indicating erroneous detection, and the dissimilarity is less than a predetermined threshold, the determination result as the classification result of the second alert information, and setting the information indicating the presentation of the second alert information to information indicating that presentation is unnecessary; and a process of setting, when the determination result is positive detection, indicating correct detection, or when the determination result is false detection and the dissimilarity is equal to or greater than the predetermined threshold, the information indicating the presentation of the second alert information to information indicating that presentation is necessary.
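The determination rule described in the apparatus, method, and storage-medium paragraphs above can be sketched as follows. This is a minimal illustration, not the patented implementation: the function name `decide_presentation`, the dictionary representation, and the threshold value are all assumptions introduced here for clarity.

```python
# Hypothetical sketch of the determination rule in the claims above.
# THRESHOLD stands in for the "predetermined threshold"; its value is assumed.
THRESHOLD = 0.5

def decide_presentation(determination: str, dissimilarity: float) -> dict:
    """Decide the classification result and presentation flag for the
    second (newly received) alert information.

    determination: "false" (false detection) or "correct" (positive
    detection), as produced by the classifier.
    """
    if determination == "false" and dissimilarity < THRESHOLD:
        # Confident false detection: record it as set by the apparatus
        # itself ("/automatic") and mark presentation as unnecessary (0).
        return {"classification": "false detection/automatic", "presentation": 0}
    # Positive detection, or a false detection too dissimilar to the
    # learning alerts to trust: presentation to the operator is necessary (1).
    return {"classification": "", "presentation": 1}
```

Note that only the confident false-detection branch writes a classification result; in the other branches the result is left for the operator to set via the presentation device.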
- FIG. 1 is a block diagram showing an example of the configuration of an information processing system including an information processing apparatus according to the first embodiment of the present invention.
- FIG. 2 is a block diagram illustrating an example of the configuration of the information processing apparatus according to the first embodiment.
- FIG. 3 is a block diagram illustrating an example of the configuration of the presentation device according to the first embodiment.
- FIG. 4 is a flowchart illustrating an example of an operation of machine learning in the machine learning unit according to the first embodiment.
- FIG. 5 is a flowchart illustrating an example of the dissimilarity calculation operation in the dissimilarity calculation unit according to the first embodiment.
- FIG. 6 is a flowchart illustrating an example of an operation for updating alert information in the determination unit according to the first embodiment.
- FIG. 7 is a sequence diagram illustrating an example of an operation for setting a classification result in alert information in the information processing system according to the first embodiment.
- FIG. 8 is a diagram illustrating an example of alert information used in the description of the first embodiment.
- FIG. 9 is a block diagram illustrating an example of a configuration of an information processing system including the information processing apparatus according to the second embodiment.
- FIG. 10 is a block diagram illustrating an example of the configuration of the information processing apparatus according to the second embodiment.
- FIG. 11 is a flowchart illustrating an example of the operation of the re-determination unit according to the second embodiment.
- FIG. 12 is a block diagram illustrating an example of a configuration of an information processing system including an information processing apparatus according to the third embodiment.
- FIG. 13 is a block diagram illustrating an example of the configuration of the information processing apparatus according to the third embodiment.
- FIG. 14 is a flowchart illustrating an example of the operation of the dissimilarity calculation unit according to the third embodiment.
- FIG. 15 is a block diagram illustrating an exemplary configuration of an information processing apparatus according to a modification.
- FIG. 16 is a block diagram illustrating an exemplary configuration of an information processing apparatus according to a modification.
- Alert information is information including an alert (the alert includes information related to the communication that caused the alert) and information related to the alert (for example, an alert determination result).
- FIG. 8 is a diagram showing an example of alert information 500 used in the following description. In FIG. 8, each row indicates each alert information 500.
- the alert information 500 includes a detection time 501, a transmission source IP (Internet Protocol) address 502, a transmission destination IP address 503, a detection rule identifier 504, a presentation flag 505, a classification result 506, and communication information 507.
- the detection time 501, the transmission source IP address 502, the transmission destination IP address 503, the detection rule identifier 504, and the communication information 507 are information relating to the communication that caused the alert. That is, the monitoring device that monitors the network to be monitored transmits these pieces of information as an alert to the information processing device according to each embodiment.
- the alert may include other information.
- the alert may include source and destination port numbers.
- the alert may not include a part of the information.
- presentation flag 505 and the classification result 506 are information related to the alert.
- the detection time 501 is the time when the monitoring device detects an alert.
- the transmission source IP address 502 is an address of a transmission source device (for example, a transmission source host) in communication that caused the alert.
- the transmission destination IP address 503 is an address of a transmission destination device (for example, transmission destination host) in the communication that caused the alert.
- the detection rule identifier 504 is an identifier for identifying a rule used when the monitoring device detects an alert.
- Communication information 507 is information communicated in the communication that caused the alert.
- the communication information 507 is, for example, a byte string included as the payload of a packet used in the communication, or a character string included in an HTTP (HyperText Transfer Protocol) request or HTTP response.
- the presentation flag 505 is information indicating whether or not the alert information 500 needs to be presented to the operator in the presentation device.
- the machine learning unit of the information processing apparatus sets the value of the presentation flag 505.
- the value of the presentation flag 505 is not particularly limited. In the following description, the value "1" in the presentation flag 505 indicates that presentation to the operator is necessary, and the value "0" indicates that presentation to the operator is unnecessary. That is, the machine learning unit of each information processing apparatus sets the presentation flag 505 to "1 (required)" when presentation to the operator is necessary, and to "0 (unnecessary)" when it is not. The initial value of the presentation flag 505 is "1 (required)".
- the classification result 506 is information classifying the detection of the alert. That is, the classification result 506 indicates whether the alert information 500 was detected correctly or erroneously. The classification result 506 may further include information indicating the subject that performed the classification. For example, a result classified by the information processing apparatus described below may include information indicating this (hereinafter denoted "/automatic").
- one of the values indicating "correct detection", "false detection", "false detection/automatic", or "blank (not set)" is set in the classification result 506.
- the classification result 506 may hold the above value as a numerical value, or may hold a value other than a numerical value (for example, a character string).
- the initial value of the classification result 506 is “blank”.
- the information processing apparatus may include processing that makes another party's determination unnecessary for "correct detection".
- in that case, the classification result 506 also includes information corresponding to "correct detection/automatic".
- the information included in the alert information 500 need not be limited to the above information.
- the alert information 500 may include other information.
- the alert information 500 may not include some information as long as the operation described below can be realized.
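As an illustration only, the fields of the alert information 500 described above could be modeled as follows. The class and field names are hypothetical (introduced here, not taken from the patent); the default values follow the stated initial values of the presentation flag 505 and classification result 506.

```python
# Illustrative model of the alert information 500 fields (see Fig. 8).
from dataclasses import dataclass

@dataclass
class AlertInfo:
    detection_time: str         # 501: time the monitoring device detected the alert
    source_ip: str              # 502: source address of the causing communication
    destination_ip: str         # 503: destination address
    detection_rule_id: str      # 504: identifier of the rule used for detection
    communication_info: str     # 507: payload bytes / HTTP request or response
    presentation_flag: int = 1  # 505: initial value is "1 (required)"
    classification_result: str = ""  # 506: initial value is "blank"
```

A monitoring device would fill fields 501-504 and 507 when sending an alert; fields 505 and 506 are the information related to the alert that the information processing apparatus manages.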
- FIG. 1 is a block diagram illustrating an example of a configuration of an information processing system 10 including an information processing apparatus 20 according to the first embodiment of the present invention.
- the information processing system 10 includes an information processing device 20, a presentation device 30, a network 40, and a monitoring device 50.
- the network 40 is a monitoring target in the present embodiment, that is, in the information processing system 10.
- the network 40 according to the present embodiment is not particularly limited.
- the network 40 may be an in-house network limited to use within a predetermined range.
- the network 40 is not limited to a physical network, and may be a logical network such as a VLAN (Virtual Local Area Network). Therefore, detailed description of the network 40 is omitted.
- VLAN Virtual Local Area Network
- the monitoring device 50 monitors communication information 507 in the network 40.
- the monitoring device 50 detects a communication abnormality occurring in the network 40 based on a predetermined detection rule.
- the communication abnormality occurring in the network 40 is, for example, an abnormality that occurs due to an external attack on the network 40 or the operation of the network 40.
- the monitoring apparatus 50 notifies the information processing apparatus 20 of a detection result (detected abnormality) as an alert.
- the monitoring device 50 according to the present embodiment is not particularly limited, and may be any commonly used device that monitors the network 40. Therefore, detailed description of the monitoring device 50 is omitted.
- the presentation device 30 presents information (for example, alert information 500) analyzed by the information processing device 20 to an operator of the information processing system 10.
- the presentation device 30 transmits an instruction from the operator to the information processing device 20.
- the instruction is, for example, a determination result of the alert information 500 (for example, a value set in the classification result 506).
- FIG. 3 is a block diagram showing an example of the configuration of the presentation device 30 according to the first embodiment.
- the presentation device 30 includes an alert display unit 31 and an input unit 32.
- the alert display unit 31 receives information (for example, alert information 500) stored in the information processing apparatus 20 from the information processing apparatus 20. Then, the alert display unit 31 displays the received information (for example, alert information 500).
- the information displayed by the alert display unit 31 is information that allows the operator to input an instruction to the input unit 32. Therefore, the alert display unit 31 may display other information in addition to the information received from the information processing device 20. Conversely, the alert display unit 31 need not display information unnecessary for the operator's judgment. Further, the alert display unit 31 may select the information to display based on information included in the information received from the information processing device 20 (for example, the presentation flag 505, the classification result 506, or the detection time 501).
- the input unit 32 receives an input (instruction) from the operator and transmits it to the information processing apparatus 20.
- the operator input is information instructing to change information (for example, the classification result 506) included in the alert information 500.
- the input unit 32 receives an input (correct detection or false detection) of the classification result 506, which is the operator's determination result for the alert information 500 displayed by the alert display unit 31.
- the input unit 32 transmits the received operator input (instruction) to the information processing apparatus 20.
- the information processing apparatus 20 stores information (instructions) received from the presentation apparatus 30 in the alert information 500.
- when the information processing apparatus 20 receives the classification result 506 from the presentation apparatus 30, it stores the received value in the classification result 506 of the alert information storage unit 23 described later.
- FIG. 2 is a block diagram illustrating an example of the configuration of the information processing apparatus 20 according to the first embodiment.
- the information processing apparatus 20 includes a machine learning unit 21, a dissimilarity calculation unit 22, an alert information storage unit 23, and a determination unit 26.
- the determination unit 26 receives the alert detected by the monitoring device 50 and stores the received alert and information related to the alert (that is, “alert information 500”) in the alert information storage unit 23.
- in the following description, the alert information 500 corresponding to an alert newly received from the monitoring device 50 (the latest alert) is distinguished from the other alert information 500 and referred to as alert information (A).
- the alert information 500 excluding the alert information (A) may be referred to as first alert information.
- the alert information (A) may be referred to as second alert information.
- when the determination unit 26 receives a new alert, it notifies each component, as needed, that the alert has been received.
- the determination unit 26 updates the value of the alert information 500 based on the determination result in the machine learning unit 21 described later and the dissimilarity calculated by the dissimilarity calculation unit 22.
- the determination unit 26 receives a request from the presentation device 30 and transmits information corresponding to the request to the presentation device 30. In addition, the determination unit 26 receives information from the presentation device 30 and stores the received information in the alert information storage unit 23.
- the machine learning unit 21 performs machine learning on the classification result 506 of the alert information 500. More specifically, the machine learning unit 21 executes machine learning using, as learning samples, the alert information 500 stored in the alert information storage unit 23 whose classification result 506 is not blank (not set). That is, the machine learning unit 21 learns the classification result 506 using the classification results 506 that have already been set. Further, when a classification result 506 stores information indicating classification performed by the apparatus itself ("/automatic"), the machine learning unit 21 may exclude that alert information from the learning samples. Based on this machine learning, the machine learning unit 21 determines whether alert information 500 is a positive detection or a false detection.
- the determination unit 26 stores the determination result of the machine learning unit 21 in the classification result 506 of the alert information 500 stored in the alert information storage unit 23.
- when the classification is performed by the apparatus itself, the determination unit 26 saves, together with the classification result 506, information indicating that the classification was performed by the apparatus itself ("/automatic").
- for example, for a false detection set by the apparatus itself, the information processing apparatus 20 stores the classification result 506 "false detection/automatic".
- the dissimilarity calculation unit 22 calculates the dissimilarity between the alert information (A) newly received by the information processing apparatus 20 and some or all of the alert information 500 stored in the alert information storage unit 23.
- the calculation of the dissimilarity in the dissimilarity calculating unit 22 will be described in detail later.
- the alert information storage unit 23 stores alert information 500.
- FIG. 4 is a flowchart illustrating an example of an operation of machine learning in the machine learning unit 21 according to the first embodiment.
- the machine learning unit 21 first extracts the alert information 500 used as a learning sample from the alert information 500 stored in the alert information storage unit 23 (step S101).
- the alert information 500 used as learning samples is preferably alert information 500 whose classification result 506 was set by someone other than the information processing apparatus 20.
- in the description of the present embodiment, the information processing apparatus 20 uses, as learning samples, the alert information 500 whose classification result 506 was set based on input received from the operator. Therefore, the learning samples in this description are the alert information 500 whose classification result 506 is "correct detection" or "false detection".
- the machine learning unit 21 calculates a feature vector based on the learning sample (step S102).
- the “feature vector” is a vector having information for comparing dissimilarities as elements.
- the feature vector is usually a multidimensional vector calculated from a learning sample. One feature vector is calculated for one learning sample.
- the machine learning unit 21 may use a general feature vector calculation method as a feature vector calculation method used in the present embodiment.
- for example, the machine learning unit 21 may use a feature vector calculated, as described in Non-Patent Document 1, from information included in the alert information 500: the IP address classification, the host classification, the detection rule identifier, or the number of alerts belonging to the same classification within a certain time window.
- the machine learning unit 21 may convert the communication information 507 included in the alert information 500 into a vector format and use it as a feature vector. To perform this conversion, the machine learning unit 21 may, for example, count the appearance frequency of N-grams.
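The N-gram conversion mentioned above can be sketched as follows. This is a minimal illustration under stated assumptions: the function names and the fixed-vocabulary projection are choices made here, not details from the patent, which leaves the vectorization method open.

```python
# Sketch: convert communication information 507 into a feature vector by
# counting character N-gram appearance frequencies.
from collections import Counter

def ngram_counts(data: str, n: int = 2) -> Counter:
    """Count the appearance frequency of each character N-gram in the data."""
    return Counter(data[i:i + n] for i in range(len(data) - n + 1))

def to_vector(data: str, vocabulary: list, n: int = 2) -> list:
    """Project the N-gram counts onto a fixed vocabulary, yielding a
    numeric vector comparable across alerts."""
    counts = ngram_counts(data, n)
    return [counts.get(gram, 0) for gram in vocabulary]
```

For instance, applying `to_vector` to an HTTP request string with a vocabulary built from the learning samples yields one fixed-length vector per alert, suitable both for the classifier and for the distance-based dissimilarity.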
- the above feature vector is an example of a feature vector that can be used by the machine learning unit 21.
- the above description does not limit the feature vectors that can be used by the machine learning unit 21 according to the present embodiment to the above feature vectors.
- the machine learning unit 21 can use a value obtained by applying information included in the alert information 500 to a mechanical calculation process as a feature vector. Further, the machine learning unit 21 may combine the above feature vectors.
- the machine learning unit 21 performs machine learning using the learning sample classification result 506 as a teacher signal. Then, the machine learning unit 21 creates or updates a classifier as a result of machine learning (step S103).
- the machine learning unit 21 can use a general machine learning algorithm as the machine learning here. For example, the machine learning unit 21 may use decision tree learning, support vector machine, or ensemble learning.
- the classifier created by the machine learning unit 21 uses the feature vector to determine whether the alert information 500 corresponding to the feature vector is a correct detection or a false detection.
- machine learning unit 21 may use either batch learning or sequential learning if the learning algorithm used allows it.
- when employing batch learning, the machine learning unit 21 creates or updates the classifier by executing the above machine learning process at a predetermined timing or at predetermined time intervals.
- when adopting sequential learning, the machine learning unit 21 creates or updates the classifier each time the classification result 506 of the alert information 500 stored in the alert information storage unit 23 is changed.
- FIG. 5 is a flowchart illustrating an example of the dissimilarity calculation operation in the dissimilarity calculation unit 22 according to the first embodiment.
- the dissimilarity calculation unit 22 starts its operation when the alert information (A) for a new alert is received (step S201). Specifically, the determination unit 26 receives an alert from the monitoring device 50, adds the necessary information to generate the alert information 500 (alert information (A)), and saves it in the alert information storage unit 23. Then, the determination unit 26 notifies the dissimilarity calculation unit 22 that a new alert has been received. Based on this notification, the dissimilarity calculation unit 22 acquires the alert information (A) stored in the alert information storage unit 23.
- the dissimilarity calculation unit 22 acquires alert information 500 other than the alert information (A) from the alert information storage unit 23, that is, the alert information 500 to be compared (step S202).
- the alert information 500 excluding the alert information (A) is past alert information 500 with respect to the alert information (A). That is, the alert information 500 to be compared is the past (stored) alert information 500.
- the dissimilarity calculation unit 22 may acquire all the alert information 500 stored in the alert information storage unit 23. Alternatively, the dissimilarity calculation unit 22 may acquire part of the alert information 500 stored in the alert information storage unit 23. For example, the dissimilarity calculation unit 22 may acquire the alert information 500 detected using the same detection rule as the one used to detect the alert information (A). Specifically, the dissimilarity calculation unit 22 may acquire the alert information 500 including a detection rule identifier 504 having the same value as the detection rule identifier 504 of the alert information (A). Alternatively, the dissimilarity calculation unit 22 may acquire the alert information 500 whose classification result 506 is a predetermined value (for example, “correct detection” or “false detection”).
- the dissimilarity calculation unit 22 calculates dissimilarity between the alert information (A) and each past alert information 500 extracted (step S203).
- the “dissimilarity” is a distance between the alert information (A) and each extracted alert information 500. More specifically, the “dissimilarity” is a value corresponding to the distance between the alert information (A) and each extracted alert information 500. When the distance is large, the dissimilarity increases.
- the dissimilarity calculation unit 22 can use various distances as the dissimilarity.
- the dissimilarity calculation unit 22 may use a distance between feature vectors calculated based on the alert information 500 used by the machine learning unit 21 described above.
- the dissimilarity calculation unit 22 may use an edit distance between the communication information 507 included in each alert information 500 as the dissimilarity.
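- A common edit distance is the Levenshtein distance; a dependency-free sketch of computing it between two communication-information strings follows (the function name is an assumption for illustration):

```python
def edit_distance(a, b):
    # Levenshtein distance: minimum number of single-character
    # insertions, deletions, and substitutions turning a into b,
    # computed row by row with O(len(b)) memory.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,               # deletion
                           cur[j - 1] + 1,            # insertion
                           prev[j - 1] + (ca != cb))) # substitution
        prev = cur
    return prev[-1]
```

The returned integer can serve directly as the dissimilarity between two pieces of alert information.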
- the dissimilarity is an example of the dissimilarity according to the present embodiment.
- the dissimilarity according to the present embodiment is not limited to the above.
- after calculating the dissimilarity with each piece of alert information 500, the dissimilarity calculation unit 22 selects, from among the calculated dissimilarities, the one in a predetermined order (K) when they are arranged in ascending order. Then, the dissimilarity calculation unit 22 sets the selected value as the dissimilarity between the alert information (A) and the past alert information 500 (step S204). In other words, the dissimilarity calculation unit 22 adopts, as the dissimilarity of the alert information (A) corresponding to the newly received alert, the K-th smallest of the calculated dissimilarities. That is, the dissimilarity of the alert information (A) is the dissimilarity with the past alert information 500 that is the K-th nearest neighbor of the alert information (A).
- the predetermined order (K) is a value set in advance in the dissimilarity calculation unit 22.
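- Step S204 above reduces to selecting the K-th smallest calculated value; a minimal sketch (the function name is an assumption for illustration):

```python
def kth_nearest_dissimilarity(dissimilarities, k):
    # Arrange the calculated dissimilarities in ascending order and
    # pick the one in the predetermined order (K): the distance to
    # the K-th nearest past alert.
    return sorted(dissimilarities)[k - 1]
```

With K = 1 this is simply the distance to the single nearest past alert; larger K makes the measure robust to a few coincidentally close past alerts.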
- FIG. 6 is a flowchart illustrating an example of an operation for updating the alert information 500 in the determination unit 26 according to the present embodiment.
- the determination unit 26 starts the following operation when receiving a new alert (step S301).
- the determination unit 26 creates alert information 500 (in this case, alert information (A)) corresponding to the alert, and stores the created alert information 500 in the alert information storage unit 23.
- then, the determination unit 26 performs the following operation.
- the determination unit 26 determines alert information (A) using the classifier created by the machine learning unit 21 described with reference to FIG. 4 (step S302). More specifically, the determination unit 26 determines using the classifier as follows.
- first, the determination unit 26 calculates a feature vector based on the alert information (A). This operation is the same as step S102 shown in FIG. 4. Therefore, the determination unit 26 may request the machine learning unit 21 to calculate the feature vector. Next, the determination unit 26 inputs the calculated feature vector to the classifier. Then, the determination unit 26 acquires the determination result (correct detection or false detection) for the alert information (A) as the output of the classifier. Note that the determination unit 26 may request the machine learning unit 21 not only to calculate the feature vector but also to perform the determination processing of the feature vector using the classifier. That is, in the information processing apparatus 20, the machine learning unit 21 may execute step S302.
- the determination unit 26 determines whether the determination result of the classifier is a correct detection (step S303).
- if the determination result is a correct detection (YES in step S303), the determination unit 26 sets “1 (required)” in the presentation flag 505 of the alert information (A) (step S306).
- if the determination result is a false detection (NO in step S303), the determination unit 26 acquires the dissimilarity for the alert information (A) calculated by the dissimilarity calculation unit 22 described with reference to FIG. 5 (step S304).
- the determination unit 26 determines whether the dissimilarity is equal to or greater than a predetermined threshold (step S305).
- the threshold used here is a value set in the determination unit 26 in advance.
- if the dissimilarity is equal to or greater than the threshold (YES in step S305), the determination unit 26 sets “1 (necessary)” in the presentation flag 505 (step S306).
- if the dissimilarity is less than the threshold (NO in step S305), the determination unit 26 sets “0 (unnecessary)” in the presentation flag 505 (step S307). Further, the determination unit 26 sets the classification result 506 to “false detection / automatic” to indicate that the device itself determined the false detection mechanically (step S308).
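- The decision logic of steps S303 to S308 can be sketched as follows (the dictionary keys and string values are illustrative stand-ins for the fields of the alert information 500, not the embodiment's actual data format):

```python
def set_presentation_flag(alert, determination, dissimilarity, threshold):
    # Decide whether the alert must be presented to the operator.
    if determination == "correct detection":            # S303: yes
        alert["presentation_flag"] = 1                  # S306
    elif dissimilarity >= threshold:                    # S305: yes
        alert["presentation_flag"] = 1                  # S306
    else:                                               # S305: no
        alert["presentation_flag"] = 0                  # S307
        # Record that the device itself classified it mechanically.
        alert["classification_result"] = "false detection / automatic"  # S308
    return alert
```

Only alerts that are both classified as false detections and similar to past alerts are suppressed; everything else is surfaced.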
- FIG. 7 is a sequence diagram illustrating an example of an operation of storing the classification result 506 in the alert information 500 in the information processing system 10 according to the present embodiment.
- the alert display unit 31 of the presentation device 30 requests the information processing device 20 to transmit the alert information 500 (S401).
- the determination unit 26 of the information processing device 20 transmits the alert information 500 stored in the alert information storage unit 23 to the presentation device 30 based on the request from the alert display unit 31 (S402).
- the alert display unit 31 of the presentation device 30 presents the received alert information 500 (S403).
- the alert display unit 31 may display all received alert information 500. However, in order to clarify the display, it is desirable that the alert display unit 31 selects and displays the alert information 500 whose presentation flag 505 is “1 (necessary)” as the alert information 500 to be displayed. Furthermore, the alert display unit 31 preferably selects and displays the alert information 500 for which the operator's judgment is not set, specifically, the alert information 500 whose classification result 506 is “blank”. For example, in the case of the alert information 500 shown in FIG. 8, the alert display unit 31 desirably displays the alert information 500 on the sixth line.
- the presentation device 30 can display appropriate alert information 500 for the operator based on the alert information 500 created by the information processing device 20.
- the machine learning unit 21 of the information processing apparatus 20 may transmit alert information 500 that satisfies the above conditions.
- the input unit 32 of the presentation device 30 receives an instruction (correct detection or false detection) for the displayed alert information 500 from the operator (S404).
- the input unit 32 sends the received instruction (correct detection or false detection) to the information processing apparatus 20 (S405).
- the determination unit 26 of the information processing apparatus 20 sets (stores) the received instruction in the alert information 500 stored in the alert information storage unit 23 (S406).
- the information processing apparatus 20 can produce an effect of creating alert information 500 that can be appropriately presented to an operator.
- the machine learning unit 21 of the information processing apparatus 20 generates a classifier used for classifying the alert information 500 based on machine learning. Further, the dissimilarity calculation unit 22 calculates the dissimilarity between the received alert information (A) and the past alert information 500. Then, the determination unit 26 determines whether it is necessary to present the alert information 500 to the operator, based on the classification result 506 from the classifier and on the dissimilarity, and sets the determination result in the presentation flag 505 of the alert information 500.
- as a result, the presentation device 30 that presents the alert information 500 to the operator can use the presentation flag 505 in addition to the classification result 506 based on the classifier generated by the machine learning unit 21. That is, the presentation device 30 can select the alert information 500 to be presented based on the presentation flag 505 in addition to the classification result 506. More specifically, the presentation device 30 can select, based on the presentation flag 505, the alert information 500 having low similarity to the past alert information 500 and present it to the operator.
- the information processing apparatus 20 facilitates selection of the alert information 500 having characteristics different from the past alert information 500.
- the information processing system 10 including the information processing apparatus 20 can obtain the classification result 506 for the alert information 500 having characteristics different from the past alert information 500. As a result, the information processing system 10 can reduce the occurrence of false negatives in the alert information 500.
- FIG. 9 is a block diagram showing an example of the configuration of the information processing system 11 including the information processing apparatus 60 according to the second embodiment.
- the information processing system 11 is different from the information processing system 10 according to the first embodiment in that an information processing device 60 is included instead of the information processing device 20. Therefore, the description of the same configuration and operation as in the first embodiment will be omitted, and the configuration and operation related to the present embodiment will be described.
- FIG. 10 is a block diagram illustrating an example of the configuration of the information processing apparatus 60 according to the second embodiment.
- the information processing device 60 includes a redetermination unit 24 in addition to the configuration of the information processing device 20 of the first embodiment. Therefore, a detailed description of the configuration and operation similar to those of the first embodiment will be omitted, and the configuration and operation unique to the present embodiment will be mainly described.
- the re-determination unit 24 re-determines the alert information 500 stored in the alert information storage unit 23 using the classifier created (updated) by the machine learning unit 21.
- FIG. 11 is a flowchart showing an example of the operation of the re-determination unit 24 according to the second embodiment.
- the re-determination unit 24 first extracts the alert information 500 whose classification result 506 is “false detection / automatic” from the alert information 500 stored in the alert information storage unit 23 (step S601). That is, the re-determination unit 24 sets the alert information 500 classified as erroneous detection by its own device as a processing target. In other words, the re-determination unit 24 extracts the alert information 500 that re-determines the classification result 506.
- the alert information 500 for re-determining the classification result 506 may be referred to as third alert information.
- the re-determination unit 24 calculates a feature vector for the extracted alert information 500 (step S602).
- the re-determination unit 24 applies the calculated feature vector to the updated classifier, and acquires the determination result. That is, the re-determination unit 24 re-determines the feature vector using the updated classifier (step S603).
- if the re-determination result is a correct detection, the re-determination unit 24 updates the presentation flag 505 of the alert information 500 to “1 (required)” (step S604).
- the determination unit 26 may be configured to include the re-determination unit 24 in order to unify the functions for the alert information 500.
- if the re-determination result remains a false detection, the re-determination unit 24 does not update the presentation flag 505 of the alert information 500.
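- The re-determination flow of steps S601 to S604 can be sketched as follows (the dictionary fields are illustrative, and `classifier` stands in for the updated classifier created by the machine learning unit 21):

```python
def redetermine(stored_alerts, classifier):
    # Re-apply the updated classifier to alerts the device itself
    # classified as false detection, and flag for presentation any
    # that now come out as correct detection.
    for alert in stored_alerts:
        if alert.get("classification_result") != "false detection / automatic":
            continue                                   # S601: target selection
        verdict = classifier(alert["feature_vector"])  # S602-S603
        if verdict == "correct detection":
            alert["presentation_flag"] = 1             # S604
    return stored_alerts
```

Alerts classified by the operator are deliberately left untouched; only machine-made classifications are revisited.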
- the information processing apparatus 60 updates the presentation flag 505 based on the operation of the re-determination unit 24.
- the information processing apparatus 60 can transmit the alert information 500 in which the presentation flag 505 is updated to the presentation apparatus 30.
- the alert display unit 31 of the presentation device 30 displays alert information 500 determined as “correct detection” after the update to the operator.
- further, the information processing apparatus 60 can receive an operator's instruction for the updated alert information 500 from the presentation apparatus 30.
- the machine learning unit 21 updates the classifier accordingly. The information processing apparatus 60 can then update the classification result 506 using the updated classifier, and can therefore set the classification result 506 more accurately than the information processing apparatus 20.
- the re-determination unit 24 uses the alert information 500 with the classification result 506 of “false detection / automatic” as a target of re-determination.
- the redetermination unit 24 may further limit the target alert information 500.
- the re-determination unit 24 may use, for example, the alert information 500 detected after a predetermined time as a re-determination target using the detection time 501 included in the alert information 500.
- note that the machine learning unit 21 may exclude from the learning samples the alert information 500 whose classification result 506 was set by the own device. This is because it may be desirable for the machine learning unit 21 to avoid referring again, in the re-determination, to determination results (classification results) produced by the machine learning of the own device.
- the information processing apparatus 60 can achieve the effect of achieving a more accurate determination.
- the re-determination unit 24 included in the information processing apparatus 60 re-determines the alert information 500 classified as erroneous detection by the own apparatus by using the updated classifier.
- the updated classifier is created in a state where the learning samples in the machine learning unit 21 have been expanded. Therefore, the updated classifier can be expected to have improved classification accuracy compared with the previous classifier. Consequently, the classification result 506 re-determined by the re-determination unit 24 using the updated classifier is more accurate than the classification result 506 obtained using the previous classifier.
- the method for calculating the dissimilarity in the information processing apparatus 20 need not be limited to the method already described.
- for example, the information processing apparatus 20 may calculate the dissimilarity using information that represents the characteristics of the alert information 500, treating such information as a feature of the alert information 500.
- FIG. 12 is a block diagram illustrating an example of the configuration of the information processing system 12 including the information processing apparatus 70 according to the third embodiment.
- the information processing system 12 is different from the information processing system 10 according to the first embodiment in that an information processing device 70 is included instead of the information processing device 20. Therefore, the description of the same configuration and operation as in the first embodiment will be omitted, and the configuration and operation related to the present embodiment will be described.
- FIG. 13 is a block diagram showing an example of the configuration of the information processing apparatus 70 according to the third embodiment.
- the information processing device 70 is different from the information processing device 20 in the first embodiment in that a dissimilarity calculation unit 25 is included instead of the dissimilarity calculation unit 22. Therefore, detailed description of the configuration and operation similar to those of the first embodiment will be omitted, and the configuration and operation unique to the present embodiment will be described. That is, the dissimilarity calculation unit 25 will be described.
- the dissimilarity calculation unit 25 calculates the dissimilarity between the alert information (A) corresponding to the newly received alert and the past alert information 500, similarly to the dissimilarity calculation unit 22. However, the dissimilarity calculation unit 25 uses a method different from that of the dissimilarity calculation unit 22 to calculate the dissimilarity. That is, the dissimilarity calculation unit 25 calculates distribution information for the feature vectors calculated based on the past alert information 500, and uses the calculated distribution information to calculate the dissimilarity for the feature vector calculated based on the alert information (A).
- the distribution information is information that summarizes the distribution of the alert information 500 on the feature vector space.
- for example, when the distribution of the alert information 500 is assumed to be a normal distribution, the dissimilarity calculation unit 25 can use the average (μ) and the variance (or the variance-covariance matrix (Σ)) of the distribution to calculate the relationship of the information to be evaluated with respect to that distribution.
- the relationship calculated by the dissimilarity calculation unit 25 is, for example, the Mahalanobis distance, which indicates the degree of deviation (dissimilarity).
- the dissimilarity calculation unit 25 obtains the average (μ) and the covariance matrix (Σ) of the alert information 500 as the distribution information.
- the feature vector in this description may be, for example, the feature vector in the machine learning of the first embodiment already described, but is not limited thereto.
- the feature vector in this description may be a histogram (H).
- the dissimilarity calculation unit 25 may store the calculated distribution information in a storage unit (not shown) when the calculated distribution information is used for an operation for calculating the next dissimilarity.
- FIG. 14 is a flowchart showing an example of the operation of the dissimilarity calculation unit 25 according to the third embodiment.
- the distribution information used by the dissimilarity calculation unit 25 according to the present embodiment is not particularly limited.
- the distribution information is a parameter that determines a distribution such as an average or a variance, for example.
- the dissimilarity calculation unit 25 uses, as the distribution information, the average value (μ) and the variance-covariance matrix (Σ) of the histogram (H) representing the number of appearances of character strings included in the communication information 507 of the alert information (A).
- the dissimilarity calculation unit 25 has already calculated and stored distribution information (the average value (μ) and the variance-covariance matrix (Σ)) for the past alert information 500.
- the dissimilarity calculation unit 25 uses the Mahalanobis distance (D) as an example of the distance.
- the dissimilarity calculation unit 25 starts the operation at the timing when the alert information (A) is received, similarly to the dissimilarity calculation unit 22 (step S201).
- the dissimilarity calculation unit 25 creates a histogram (H) for the character string included in the communication information 507 in the alert information (A) (step S205).
- next, the dissimilarity calculation unit 25 calculates the Mahalanobis distance (D) using the created histogram (H) and the stored distribution information (the average value (μ) and the variance-covariance matrix (Σ)) for the past alert information 500 (step S206).
- the Mahalanobis distance (D) calculated here is the dissimilarity in the present embodiment.
- then, the dissimilarity calculation unit 25 updates the distribution information (the average value (μ) and the variance-covariance matrix (Σ)) for the past alert information 500 using the created histogram (H) (step S207).
- in this way, each time the alert information (A) is received, the dissimilarity calculation unit 25 of the present embodiment updates the distribution information used for calculating the dissimilarity. Based on this update, the dissimilarity calculation unit 25 can approximately calculate the variance-covariance matrix (Σ) for the past alert information 500.
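- Under a simplifying assumption of a diagonal variance-covariance matrix (per-bin variances instead of the full Σ), the distance calculation of step S206 and the incremental update of step S207 can be sketched as follows; function names and the running-update form are illustrative assumptions:

```python
def mahalanobis_diag(h, mean, var):
    # Mahalanobis distance (D) between a new histogram (H) and the
    # stored distribution information, with a diagonal covariance:
    # sqrt(sum_i (h_i - mu_i)^2 / var_i).
    return sum((x - m) ** 2 / v for x, m, v in zip(h, mean, var)) ** 0.5

def update_distribution(mean, var, n, h):
    # Fold the new histogram into the stored per-bin mean and variance
    # (Welford-style running update), approximating step S207.
    n += 1
    new_mean, new_var = [], []
    for m, v, x in zip(mean, var, h):
        d = x - m
        m2 = m + d / n
        new_mean.append(m2)
        new_var.append(v + (d * (x - m2) - v) / n)
    return new_mean, new_var, n
```

Because only μ, the per-bin variances, and a count are stored, the past histograms themselves never need to be kept.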
- the information processing apparatus 70 may include a function corresponding to the redetermination unit 24 according to the second embodiment.
- in the above description, the dissimilarity calculation unit 25 uses the histogram (H) of the communication information 507 included in the alert information 500 to calculate the dissimilarity.
- the dissimilarity calculation unit 25 may use other information as information representing the characteristics of the alert information 500. For example, the dissimilarity calculation unit 25 may estimate a probability distribution for the communication information 507 based on the communication information 507 of the past alert information 500. Then, the dissimilarity calculation unit 25 may use, as the dissimilarity, how low the appearance probability of the alert information (A) is in the estimated probability distribution.
- information representing the characteristics of the alert information 500 is a summary of the alert information 500. Therefore, in general, the number of pieces of information representing the characteristics of the alert information 500 is smaller than the number of pieces of alert information 500. Consequently, the dissimilarity calculation unit 25 processes a smaller amount of information than the dissimilarity calculation unit 22, and can reduce the processing cost compared with the dissimilarity calculation unit 22.
- the information processing apparatus 70 according to the third embodiment can achieve the effect of efficiently calculating the dissimilarity in addition to the effect of the first embodiment.
- the dissimilarity calculation unit 22 of the information processing apparatus 20 needs to calculate the dissimilarity between the alert information (A) and the plurality of alert information 500.
- the dissimilarity calculation unit 25 calculates the dissimilarity of the alert information (A) using the distribution information of the past alert information 500.
- the distribution information of the past alert information 500 is a small amount of information summarizing a plurality of pieces of alert information 500. That is, the dissimilarity calculation unit 25 calculates the dissimilarity of the alert information (A) using less information than the alert information 500 itself.
- the information processing apparatus 20, the information processing apparatus 60, and the information processing apparatus 70 described above are configured as follows.
- each component of the information processing apparatus 20 may be configured with a hardware circuit.
- the information processing apparatus 20 may be configured using a plurality of apparatuses in which each component is connected via a network (not shown).
- the information processing apparatus 20 may be configured with the alert information storage unit 23 as an external storage device (not shown).
- FIG. 16 is a block diagram illustrating an example of the configuration of the information processing apparatus 90 according to the first modification of the information processing apparatus 20.
- the information processing apparatus 90 is configured with the alert information storage unit 23 as an external storage device connected via a network (not shown).
- the information processing apparatus 90 includes a machine learning unit 21, a dissimilarity calculation unit 22, and a determination unit 26.
- the information processing apparatus 90 configured in this way can obtain the same effects as the information processing apparatus 20.
- each configuration included in the information processing device 90 can operate in the same manner as the configuration of the information processing device 20 using the alert information storage unit 23 provided in the external storage device.
- the information processing apparatus 90 is the minimum configuration in the embodiment of the present invention.
- alternatively, in the information processing apparatus 20, a plurality of components may be implemented by a single piece of hardware.
- the information processing apparatus 20 may be realized as a computer apparatus including a CPU (Central Processing Unit), a ROM (Read Only Memory), and a RAM (Random Access Memory).
- the information processing device 20 may be realized as a computer device that further includes an input / output connection circuit (IOC: Input / Output Circuit) and a network interface circuit (NIC: Network Interface Circuit).
- FIG. 15 is a block diagram illustrating an example of the configuration of the information processing apparatus 700 according to the second modification of the information processing apparatus 20. That is, FIG. 15 is a block diagram illustrating an example of the configuration of the information processing apparatus 700, which is an example when the information processing apparatus 20 is realized as the above-described computer apparatus.
- the information processing apparatus 700 includes a CPU 710, a ROM 720, a RAM 730, an internal storage device 740, an IOC 750, and a NIC 780, and constitutes a computer.
- CPU 710 reads a program from ROM 720.
- the CPU 710 controls the RAM 730, the internal storage device 740, the IOC 750, and the NIC 780 based on the read program.
- the computer including the CPU 710 controls these configurations and implements the functions as the machine learning unit 21, the dissimilarity calculation unit 22, and the determination unit 26 illustrated in FIG.
- the CPU 710 may use the RAM 730 or the internal storage device 740 as a temporary program storage when realizing each function.
- the CPU 710 may use a storage medium reading device (not shown) to read a program from the storage medium 790, which stores the program in a computer-readable manner.
- the CPU 710 may receive a program from an external device (not shown) via the NIC 780, store the program in the RAM 730, and operate based on the stored program.
- ROM 720 stores a program executed by CPU 710 and fixed data.
- the ROM 720 is, for example, a P-ROM (Programmable-ROM) or a flash ROM.
- the RAM 730 temporarily stores programs executed by the CPU 710 and data.
- the RAM 730 is, for example, a D-RAM (Dynamic-RAM).
- the internal storage device 740 stores data and programs that the information processing device 700 stores for a long time. Further, the internal storage device 740 may operate as a temporary storage device for the CPU 710. The internal storage device 740 operates as the alert information storage unit 23.
- the internal storage device 740 is, for example, a hard disk device, a magneto-optical disk device, an SSD (Solid State Drive), or a disk array device.
- the ROM 720 and the internal storage device 740 are nonvolatile storage media.
- the RAM 730 is a volatile storage medium.
- the CPU 710 can operate based on a program stored in the ROM 720, the internal storage device 740, or the RAM 730. That is, the CPU 710 can operate using a nonvolatile storage medium or a volatile storage medium.
- the IOC 750 mediates data between the CPU 710, the input device 760, and the display device 770.
- the IOC 750 is, for example, an IO interface card or a USB (Universal Serial Bus) card.
- the input device 760 is a device that receives an input instruction from an operator of the information processing apparatus 700.
- the input device 760 is, for example, a keyboard, a mouse, or a touch panel.
- the display device 770 is a device that displays information to the operator of the information processing apparatus 700.
- the display device 770 is a liquid crystal display, for example.
- the NIC 780 relays data exchange with an external device (not shown) via the network.
- the NIC 780 is, for example, a LAN card.
- the information processing apparatus 700 configured in this way can obtain the same effects as the information processing apparatus 20.
- dissimilarity calculation means for calculating a dissimilarity, which is a distance between first alert information, including a first alert that has already been received and information related to the first alert, and second alert information, including a newly received second alert and information related to the second alert; machine learning means for applying machine learning to the first alert information to generate a classifier that determines a classification result, which is a determination result of classification related to detection of the alert information, and for applying the classifier to the second alert information to determine its classification result;
- the determination result is false detection indicating erroneous detection and the dissimilarity is less than a predetermined threshold
- the determination result is set in the classification result of the second alert information, and the second alert information is presented Set information indicating that presentation is not necessary in the information indicating
- the determination result is a positive detection indicating correct detection, or when the determination result is a false detection indicating erroneous detection and the dissimilarity is equal to or higher than a predetermined threshold, the second alert information is presented.
- An information processing apparatus comprising: determination means for setting information
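The presentation decision described above can be sketched in a few lines. This is an illustrative sketch only, not the claimed implementation; the function name, the boolean encoding of the classification result, and the returned field names are assumptions for illustration.

```python
# Sketch of the presentation decision: suppress a new alert only when the
# classifier judges it a false detection AND it closely resembles past alerts.
def decide_presentation(is_false_detection: bool, dissimilarity: float,
                        threshold: float) -> dict:
    """Return the fields to set on the second alert information."""
    if is_false_detection and dissimilarity < threshold:
        # A familiar false detection: record the result and skip presentation.
        return {"classification": "false_detection", "present": False}
    # Correct detection, or a false detection unlike anything seen before:
    # the operator should still see it.
    return {"present": True}

print(decide_presentation(True, 0.2, 0.5))   # familiar false detection, suppressed
print(decide_presentation(True, 0.9, 0.5))   # novel false detection, presented
print(decide_presentation(False, 0.2, 0.5))  # correct detection, presented
```

Note that only the first branch writes a classification result; alerts that remain presented are left for the analyst to classify, which matches the feedback loop described in supplementary note 10.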
- (Supplementary note 2) The information processing apparatus according to supplementary note 1, wherein the machine learning means, as the machine learning, extracts, as the first alert information to be applied to the machine learning, the first alert information in which a classification result determined by other than the own apparatus is set, and creates the classifier based on the classification result set in the extracted first alert information.
- (Supplementary note 3) The information processing apparatus according to supplementary note 2, wherein the machine learning means calculates a first feature vector based on the extracted first alert information, generates the classifier by applying the calculated first feature vector to the machine learning, calculates a second feature vector based on the second alert information, and applies the second feature vector to the classifier to determine the classification result.
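The feature-vector-plus-classifier flow can be sketched with a minimal nearest-centroid classifier. The feature extraction (two numeric fields of a toy alert record) and the classifier choice are assumptions for illustration; the document does not fix a particular learning algorithm.

```python
import math

def feature_vector(alert: dict) -> list:
    # Assumed toy features: destination port and payload size.
    return [alert["dst_port"], alert["payload_len"]]

def train_centroids(alerts: list) -> dict:
    """Build one centroid per classification result set by the analyst."""
    sums, counts = {}, {}
    for a in alerts:
        v = feature_vector(a)
        label = a["classification"]
        s = sums.setdefault(label, [0.0] * len(v))
        for i, x in enumerate(v):
            s[i] += x
        counts[label] = counts.get(label, 0) + 1
    return {label: [x / counts[label] for x in s] for label, s in sums.items()}

def classify(centroids: dict, alert: dict) -> str:
    """Assign the label of the nearest centroid to a new alert."""
    v = feature_vector(alert)
    return min(centroids, key=lambda lab: math.dist(v, centroids[lab]))

# First alert information, already labeled by analysts (not by this apparatus).
past = [
    {"dst_port": 80, "payload_len": 100, "classification": "false_detection"},
    {"dst_port": 80, "payload_len": 120, "classification": "false_detection"},
    {"dst_port": 4444, "payload_len": 9000, "classification": "correct_detection"},
]
centroids = train_centroids(past)
print(classify(centroids, {"dst_port": 81, "payload_len": 110}))
```

A production system would use a stronger learner, but the interface is the same: train on labeled first alert information, then map the second alert's feature vector to a classification result.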
- (Supplementary note 4) The information processing apparatus according to supplementary note 2 or 3, wherein, when the second alert information is received, the determination means sets, in the second alert information, the determination result that is the result of determining the classification result of the second alert information using the classifier.
- (Supplementary note 5) The information processing apparatus according to any one of supplementary notes 1 to 4, wherein the dissimilarity calculation means sets, as the dissimilarity, the distance at a predetermined rank when the distances between the second alert information and some or all of the first alert information are arranged in ascending order.
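The rank-based dissimilarity described here (take the distance at a predetermined rank among the sorted distances to past alerts) might be sketched as follows; the Euclidean metric, the feature encoding, and the default rank are assumptions for illustration.

```python
import math

def dissimilarity_kth(new_vec, past_vecs, k=3):
    """Distance to the k-th nearest past alert (1-indexed rank).

    A small value means at least k past alerts lie close by, so the new
    alert resembles what has already been seen.
    """
    distances = sorted(math.dist(new_vec, p) for p in past_vecs)
    k = min(k, len(distances))  # guard when few past alerts exist
    return distances[k - 1]

past = [[0, 0], [1, 0], [0, 1], [10, 10]]
print(dissimilarity_kth([0.5, 0.5], past, k=3))
```

Using the k-th (rather than the first) nearest distance makes a single coincidental near-duplicate insufficient to declare the new alert familiar.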
- (Supplementary note 6) The information processing apparatus according to any one of supplementary notes 1 to 4, wherein the dissimilarity calculation means calculates, as the dissimilarity, the degree to which the appearance probability of communication information included in the second alert information is low with respect to the distribution of information included in the communication information included in the first alert information.
- (Supplementary note 7) The information processing apparatus according to any one of supplementary notes 1 to 4, wherein, as the dissimilarity, the dissimilarity calculation means calculates distribution information about the first alert information and calculates the relationship of the second alert information to the calculated distribution information.
- (Supplementary note 8) The information processing apparatus according to any one of supplementary notes 1 to 4, wherein the dissimilarity calculation means calculates, as the dissimilarity, the degree to which the appearance probability of communication information included in the second alert information is low with respect to the distribution of information included in the communication information included in the first alert information.
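The distribution-based dissimilarity described in these supplementary notes estimates how improbable the new alert's communication information is under the distribution of past alerts. A minimal sketch using an empirical categorical distribution with negative log probability (surprisal) as the score; the add-one smoothing and the choice of surprisal are assumptions, not the patented method:

```python
import math
from collections import Counter

def dissimilarity_surprisal(new_value, past_values):
    """Negative log probability of the new value under the empirical
    distribution of past values, with add-one smoothing so that unseen
    values still get a finite (high) dissimilarity."""
    counts = Counter(past_values)
    total = sum(counts.values()) + len(counts) + 1
    p = (counts.get(new_value, 0) + 1) / total
    return -math.log(p)

# Destination ports observed in past alerts' communication information.
past_ports = [80, 80, 80, 443, 443, 80]
common = dissimilarity_surprisal(80, past_ports)
rare = dissimilarity_surprisal(31337, past_ports)
print(common < rare)  # a frequent port is less dissimilar than an unseen one
```

The same idea extends to joint distributions over several communication-information fields; the rarer the observed combination, the higher the dissimilarity and the more likely the alert is presented even if classified as a false detection.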
- (Supplementary note 9) The information processing apparatus according to any one of supplementary notes 1 to 8, further comprising re-determination means for extracting, from the first alert information, third alert information for which the classification result is to be reset, calculating a feature vector of the third alert information, and applying the feature vector of the third alert information to the classifier to re-determine the classification result of the third alert information.
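The re-determination step (re-apply the current classifier to selected past alerts whose classification result should be reset) could look like the following sketch; the selection predicate and the classifier interface are illustrative assumptions.

```python
def redetermine(past_alerts, classifier, needs_reset):
    """Re-run the classifier on past alerts selected by needs_reset and
    overwrite their stored classification result."""
    for alert in past_alerts:
        if needs_reset(alert):                        # extract "third alert information"
            alert["classification"] = classifier(alert)  # re-determine its result
    return past_alerts

alerts = [
    {"id": 1, "classification": "unset"},
    {"id": 2, "classification": "correct_detection"},
]
updated = redetermine(alerts,
                      classifier=lambda a: "false_detection",
                      needs_reset=lambda a: a["classification"] == "unset")
print(updated[0]["classification"])  # re-determined by the classifier
print(updated[1]["classification"])  # left unchanged
```

Keeping the predicate separate from the classifier lets the system re-determine only stale or unset results without touching alerts an analyst has already confirmed.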
- (Supplementary note 10) An information processing system comprising: the information processing apparatus according to any one of supplementary notes 1 to 9; and a presentation apparatus including alert display means for receiving the first alert information from the information processing apparatus and displaying the classification result, and input means for receiving an input for the classification result in the displayed alert information and transmitting the input to the information processing apparatus.
- (Supplementary note 11) An information processing method comprising: calculating a dissimilarity that is a distance between first alert information, which includes a first alert that has already been received and information related to the first alert, and second alert information, which includes a newly received second alert and information related to the second alert; applying machine learning to the first alert information to generate a classifier that determines a classification result, which is a determination result of classification related to detection of the first alert information; applying the classifier to the second alert information to determine a classification result; when the determination result is a false detection indicating erroneous detection and the dissimilarity is less than a predetermined threshold, setting the result of the determination in the classification result of the second alert information and setting, in information indicating presentation of the second alert information, information indicating that presentation is unnecessary; and when the determination result is a correct detection indicating correct detection, or when the determination result is a false detection indicating erroneous detection and the dissimilarity is equal to or greater than the predetermined threshold, setting, in the information indicating presentation of the second alert information, information indicating that presentation is necessary.
- (Supplementary note 12) A computer-readable storage medium storing a program that causes a computer to execute: a process of calculating a dissimilarity that is a distance between first alert information, which includes a first alert that has already been received and information related to the first alert, and second alert information, which includes a newly received second alert and information related to the second alert; a process of applying machine learning to the first alert information to generate a classifier that determines a classification result, which is a determination result of classification related to detection of the first alert information; a process of applying the classifier to the second alert information to determine a classification result; a process of, when the determination result is a false detection indicating erroneous detection and the dissimilarity is less than a predetermined threshold, setting the result of the determination in the classification result of the second alert information and setting, in information indicating presentation of the second alert information, information indicating that presentation is unnecessary; and a process of, when the determination result is a correct detection indicating correct detection, or when the determination result is a false detection indicating erroneous detection and the dissimilarity is equal to or greater than the predetermined threshold, setting, in the information indicating presentation of the second alert information, information indicating that presentation is necessary.
- 11 information processing system
- 12 information processing system
- 20 information processing apparatus
- 21 machine learning unit
- 22 dissimilarity calculation unit
- 23 alert information storage unit
- 24 re-determination unit
- 25 dissimilarity calculation unit
- 26 determination unit
- 30 presentation apparatus
- 40 network
- 50 monitoring apparatus
- 60 information processing apparatus
- 70 information processing apparatus
- 90 information processing apparatus
- 500 alert information
- 501 detection time
- 502 source IP address
- 503 destination IP address
- 504 detection rule identifier
- 505 presentation flag
- 506 classification result
- 507 communication information
- 700 information processing apparatus
- 710 CPU
- 720 ROM
- 730 RAM
- 740 internal storage device
- 750 IOC
- 760 input device
- 770 display device
- 780 NIC
- 790 storage medium
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Computer Security & Cryptography (AREA)
- Software Systems (AREA)
- Computing Systems (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Medical Informatics (AREA)
- Mathematical Physics (AREA)
- Evolutionary Computation (AREA)
- Data Mining & Analysis (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Artificial Intelligence (AREA)
- Quality & Reliability (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
- Debugging And Monitoring (AREA)
Abstract
Description
A data processing method according to one aspect of the present invention calculates a dissimilarity that is a distance between first alert information, which includes a first alert that has already been received and information related to the first alert, and second alert information, which includes a newly received second alert and information related to the second alert; applies machine learning to the first alert information to generate a classifier that determines a classification result, which is a determination result of classification related to detection of the first alert information; applies the classifier to the second alert information to determine a classification result; when the determination result is a false detection indicating erroneous detection and the dissimilarity is less than a predetermined threshold, sets the result of the determination in the classification result of the second alert information and sets, in information indicating presentation of the second alert information, information indicating that presentation is unnecessary; and when the determination result is a correct detection indicating correct detection, or when the determination result is a false detection indicating erroneous detection and the dissimilarity is equal to or greater than the predetermined threshold, sets, in the information indicating presentation of the second alert information, information indicating that presentation is necessary.
First, the configuration of the first example embodiment of the present invention will be described.
FIG. 1 is a block diagram showing an example of the configuration of an information processing system 10 that includes an information processing apparatus 20 according to the first example embodiment of the present invention.
Next, the operation of the information processing apparatus 20 according to the present example embodiment will be described in detail with reference to the drawings.
Next, the effects of the present example embodiment will be described.
The second example embodiment of the present invention executes re-determination in order to improve accuracy.
First, the configuration of the information processing apparatus 60 will be described with reference to the drawings.
Next, the operation of the re-determination unit 24 will be described with reference to the drawings.
In addition to the effects of the first example embodiment, the information processing apparatus 60 according to the present example embodiment can achieve the effect of realizing more accurate determination.
The method by which the information processing apparatus 20 calculates the dissimilarity need not be limited to the methods already described.
First, the configuration of the present example embodiment will be described with reference to the drawings.
Next, the operation of the dissimilarity calculation unit 25 will be described in detail with reference to the drawings.
Next, the effects of the information processing apparatus 70 according to the present example embodiment will be described.
The information processing apparatus 20, the information processing apparatus 60, and the information processing apparatus 70 described above (hereinafter collectively referred to as the information processing apparatus 20) are configured as follows.
(Supplementary note 1) An information processing apparatus comprising:
dissimilarity calculation means for calculating a dissimilarity that is a distance between first alert information, which includes a first alert that has already been received and information related to the first alert, and second alert information, which includes a newly received second alert and information related to the second alert;
machine learning means for applying machine learning to the first alert information to generate a classifier that determines a classification result, which is a determination result of classification related to detection of the first alert information, and applying the classifier to the second alert information to determine a classification result; and
determination means for, when the determination result is a false detection indicating erroneous detection and the dissimilarity is less than a predetermined threshold, setting the result of the determination in the classification result of the second alert information and setting, in information indicating presentation of the second alert information, information indicating that presentation is unnecessary, and, when the determination result is a correct detection indicating correct detection, or when the determination result is a false detection indicating erroneous detection and the dissimilarity is equal to or greater than the predetermined threshold, setting, in the information indicating presentation of the second alert information, information indicating that presentation is necessary.
(Supplementary note 2) The information processing apparatus according to supplementary note 1, wherein the machine learning means, as the machine learning, extracts, as the first alert information to be applied to the machine learning, the first alert information in which a classification result determined by other than the own apparatus is set, and creates the classifier based on the classification result set in the extracted first alert information.
(Supplementary note 3) The information processing apparatus according to supplementary note 2, wherein the machine learning means calculates a first feature vector based on the extracted first alert information, generates the classifier by applying the calculated first feature vector to the machine learning, calculates a second feature vector based on the second alert information, and applies the second feature vector to the classifier to determine the classification result.
(Supplementary note 4) The information processing apparatus according to supplementary note 2 or 3, wherein, when the second alert information is received, the determination means sets, in the second alert information, the determination result that is the result of determining the classification result of the second alert information using the classifier.
(Supplementary note 5) The information processing apparatus according to any one of supplementary notes 1 to 4, wherein the dissimilarity calculation means sets, as the dissimilarity, the distance at a predetermined rank when the distances between the second alert information and some or all of the first alert information are arranged in ascending order.
(Supplementary note 6) The information processing apparatus according to any one of supplementary notes 1 to 4, wherein the dissimilarity calculation means calculates, as the dissimilarity, the degree to which the appearance probability of communication information included in the second alert information is low with respect to the distribution of information included in the communication information included in the first alert information.
(Supplementary note 7) The information processing apparatus according to any one of supplementary notes 1 to 4, wherein, as the dissimilarity, the dissimilarity calculation means calculates distribution information about the first alert information and calculates the relationship of the second alert information to the calculated distribution information.
(Supplementary note 8) The information processing apparatus according to any one of supplementary notes 1 to 4, wherein the dissimilarity calculation means calculates, as the dissimilarity, the degree to which the appearance probability of communication information included in the second alert information is low with respect to the distribution of information included in the communication information included in the first alert information.
(Supplementary note 9) The information processing apparatus according to any one of supplementary notes 1 to 8, further comprising re-determination means for extracting, from the first alert information, third alert information for which the classification result is to be reset, calculating a feature vector of the third alert information, and applying the feature vector of the third alert information to the classifier to re-determine the classification result of the third alert information.
(Supplementary note 10) An information processing system comprising: the information processing apparatus according to any one of supplementary notes 1 to 9; and a presentation apparatus including alert display means for receiving the first alert information from the information processing apparatus and displaying the classification result, and input means for receiving an input for the classification result in the displayed alert information and transmitting the input to the information processing apparatus.
(Supplementary note 11) An information processing method comprising: calculating a dissimilarity that is a distance between first alert information, which includes a first alert that has already been received and information related to the first alert, and second alert information, which includes a newly received second alert and information related to the second alert; applying machine learning to the first alert information to generate a classifier that determines a classification result, which is a determination result of classification related to detection of the first alert information; applying the classifier to the second alert information to determine a classification result; when the determination result is a false detection indicating erroneous detection and the dissimilarity is less than a predetermined threshold, setting the result of the determination in the classification result of the second alert information and setting, in information indicating presentation of the second alert information, information indicating that presentation is unnecessary; and when the determination result is a correct detection indicating correct detection, or when the determination result is a false detection indicating erroneous detection and the dissimilarity is equal to or greater than the predetermined threshold, setting, in the information indicating presentation of the second alert information, information indicating that presentation is necessary.
(Supplementary note 12) A computer-readable storage medium storing a program that causes a computer to execute: a process of calculating a dissimilarity that is a distance between first alert information, which includes a first alert that has already been received and information related to the first alert, and second alert information, which includes a newly received second alert and information related to the second alert; a process of applying machine learning to the first alert information to generate a classifier that determines a classification result, which is a determination result of classification related to detection of the first alert information; a process of applying the classifier to the second alert information to determine a classification result; a process of, when the determination result is a false detection indicating erroneous detection and the dissimilarity is less than a predetermined threshold, setting the result of the determination in the classification result of the second alert information and setting, in information indicating presentation of the second alert information, information indicating that presentation is unnecessary; and a process of, when the determination result is a correct detection indicating correct detection, or when the determination result is a false detection indicating erroneous detection and the dissimilarity is equal to or greater than the predetermined threshold, setting, in the information indicating presentation of the second alert information, information indicating that presentation is necessary.
11 information processing system
12 information processing system
20 information processing apparatus
21 machine learning unit
22 dissimilarity calculation unit
23 alert information storage unit
24 re-determination unit
25 dissimilarity calculation unit
26 determination unit
30 presentation apparatus
40 network
50 monitoring apparatus
60 information processing apparatus
70 information processing apparatus
90 information processing apparatus
500 alert information
501 detection time
502 source IP address
503 destination IP address
504 detection rule identifier
505 presentation flag
506 classification result
507 communication information
700 information processing apparatus
710 CPU
720 ROM
730 RAM
740 internal storage device
750 IOC
760 input device
770 display device
780 NIC
790 storage medium
Claims (10)
- An information processing apparatus comprising: dissimilarity calculation means for calculating a dissimilarity that is a distance between first alert information, which includes a first alert that has already been received and information related to the first alert, and second alert information, which includes a newly received second alert and information related to the second alert; machine learning means for applying machine learning to the first alert information to generate a classifier that determines a classification result, which is a determination result of classification related to detection of the first alert information, and applying the classifier to the second alert information to determine a classification result; and determination means for, when the determination result is a false detection indicating erroneous detection and the dissimilarity is less than a predetermined threshold, setting the result of the determination in the classification result of the second alert information and setting, in information indicating presentation of the second alert information, information indicating that presentation is unnecessary, and, when the determination result is a correct detection indicating correct detection, or when the determination result is a false detection indicating erroneous detection and the dissimilarity is equal to or greater than the predetermined threshold, setting, in the information indicating presentation of the second alert information, information indicating that presentation is necessary.
- The information processing apparatus according to claim 1, wherein the machine learning means, as the machine learning, extracts, as the first alert information to be applied to the machine learning, the first alert information in which the classification result determined by other than the own apparatus is set, and creates the classifier based on the classification result set in the extracted first alert information.
- The information processing apparatus according to claim 2, wherein the machine learning means calculates a first feature vector based on the extracted first alert information, generates the classifier by applying the calculated first feature vector to the machine learning, calculates a second feature vector based on the second alert information, and applies the second feature vector to the classifier to determine the classification result.
- The information processing apparatus according to claim 2 or 3, wherein, when the second alert information is received, the determination means sets, in the second alert information, the determination result that is the result of determining the classification result of the second alert information using the classifier.
- The information processing apparatus according to any one of claims 1 to 4, wherein the dissimilarity calculation means sets, as the dissimilarity, the distance at a predetermined rank when the distances between the second alert information and some or all of the first alert information are arranged in ascending order.
- The information processing apparatus according to any one of claims 1 to 4, wherein the dissimilarity calculation means calculates, as the dissimilarity, the degree to which the appearance probability of communication information included in the second alert information is low with respect to the distribution of information included in the communication information included in the first alert information.
- The information processing apparatus according to any one of claims 1 to 6, further comprising re-determination means for extracting, from the first alert information, third alert information for which the classification result is to be reset, calculating a feature vector of the third alert information, and applying the feature vector of the third alert information to the classifier to re-determine the classification result of the third alert information.
- An information processing system comprising: the information processing apparatus according to any one of claims 1 to 7; and a presentation apparatus including alert display means for receiving the first alert information from the information processing apparatus and displaying the classification result and the first alert information, and input means for receiving an input for the classification result in the displayed alert information and transmitting the input to the information processing apparatus.
- An information processing method comprising: calculating a dissimilarity that is a distance between first alert information, which includes a first alert that has already been received and information related to the first alert, and second alert information, which includes a newly received second alert and information related to the second alert; applying machine learning to the first alert information to generate a classifier that determines a classification result, which is a determination result of classification related to detection of the first alert information; applying the classifier to the second alert information to determine a classification result; when the determination result is a false detection indicating erroneous detection and the dissimilarity is less than a predetermined threshold, setting the result of the determination in the classification result of the second alert information and setting, in information indicating presentation of the second alert information, information indicating that presentation is unnecessary; and when the determination result is a correct detection indicating correct detection, or when the determination result is a false detection indicating erroneous detection and the dissimilarity is equal to or greater than the predetermined threshold, setting, in the information indicating presentation of the second alert information, information indicating that presentation is necessary.
- A computer-readable storage medium storing a program that causes a computer to execute: a process of calculating a dissimilarity that is a distance between first alert information, which includes a first alert that has already been received and information related to the first alert, and second alert information, which includes a newly received second alert and information related to the second alert; a process of applying machine learning to the first alert information to generate a classifier that determines a classification result, which is a determination result of classification related to detection of the first alert information; a process of applying the classifier to the second alert information to determine a classification result; a process of, when the determination result is a false detection indicating erroneous detection and the dissimilarity is less than a predetermined threshold, setting the result of the determination in the classification result of the second alert information and setting, in information indicating presentation of the second alert information, information indicating that presentation is unnecessary; and a process of, when the determination result is a correct detection indicating correct detection, or when the determination result is a false detection indicating erroneous detection and the dissimilarity is equal to or greater than the predetermined threshold, setting, in the information indicating presentation of the second alert information, information indicating that presentation is necessary.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/735,244 US11057399B2 (en) | 2015-06-26 | 2016-06-15 | Information processing device, information processing system, information processing method, and storage medium for intrusion detection by applying machine learning to dissimilarity calculations for intrusion alerts |
JP2017524622A JP6753398B2 (ja) | 2015-06-26 | 2016-06-15 | 情報処理装置、情報処理システム、情報処理方法、及び、プログラム |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015-128768 | 2015-06-26 | ||
JP2015128768 | 2015-06-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016208159A1 true WO2016208159A1 (ja) | 2016-12-29 |
Family
ID=57585401
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/002898 WO2016208159A1 (ja) | 2015-06-26 | 2016-06-15 | 情報処理装置、情報処理システム、情報処理方法、及び、記憶媒体 |
Country Status (3)
Country | Link |
---|---|
US (1) | US11057399B2 (ja) |
JP (1) | JP6753398B2 (ja) |
WO (1) | WO2016208159A1 (ja) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018109883A (ja) * | 2017-01-05 | 2018-07-12 | 株式会社東芝 | ジョブ実行制御装置、ジョブ実行制御方法およびプログラム |
WO2020148934A1 (ja) * | 2019-01-16 | 2020-07-23 | 株式会社日立製作所 | 分析装置および分析方法 |
WO2020255512A1 (ja) * | 2019-06-21 | 2020-12-24 | 株式会社日立製作所 | 監視システム、および、監視方法 |
JP2021018630A (ja) * | 2019-07-22 | 2021-02-15 | ソフトバンク株式会社 | 警報集約選別装置及び警報集約選別方法 |
US20220060487A1 (en) * | 2019-02-05 | 2022-02-24 | Nec Corporation | Priority determination apparatus, priority determination method, and computer readable medium |
US11507881B2 (en) | 2018-02-22 | 2022-11-22 | Hitachi, Ltd. | Analysis apparatus, analysis method, and analysis program for calculating prediction error and extracting error factor |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10481966B2 (en) * | 2017-05-24 | 2019-11-19 | Vmware, Inc. | Methods and systems to prioritize alerts with quantification of alert impacts |
US11537938B2 (en) * | 2019-02-15 | 2022-12-27 | Wipro Limited | Method and a system for context based clustering of object |
US11756404B2 (en) | 2019-04-08 | 2023-09-12 | Microsoft Technology Licensing, Llc | Adaptive severity functions for alerts |
US11956253B1 (en) * | 2020-06-15 | 2024-04-09 | Exabeam, Inc. | Ranking cybersecurity alerts from multiple sources using machine learning |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005085157A (ja) * | 2003-09-10 | 2005-03-31 | Toshiba Corp | 不正アクセス検出装置、不正アクセス検出方法、および管理端末 |
WO2009110326A1 (ja) * | 2008-03-07 | 2009-09-11 | 日本電気株式会社 | 障害分析装置、障害分析方法および記録媒体 |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9525696B2 (en) * | 2000-09-25 | 2016-12-20 | Blue Coat Systems, Inc. | Systems and methods for processing data flows |
US7941855B2 (en) * | 2003-04-14 | 2011-05-10 | New Mexico Technical Research Foundation | Computationally intelligent agents for distributed intrusion detection system and method of practicing same |
US8312023B2 (en) * | 2007-12-21 | 2012-11-13 | Georgetown University | Automated forensic document signatures |
US9258217B2 (en) * | 2008-12-16 | 2016-02-09 | At&T Intellectual Property I, L.P. | Systems and methods for rule-based anomaly detection on IP network flow |
US20120254333A1 (en) * | 2010-01-07 | 2012-10-04 | Rajarathnam Chandramouli | Automated detection of deception in short and multilingual electronic messages |
US8756693B2 (en) * | 2011-04-05 | 2014-06-17 | The United States Of America As Represented By The Secretary Of The Air Force | Malware target recognition |
US9021589B2 (en) * | 2012-06-05 | 2015-04-28 | Los Alamos National Security, Llc | Integrating multiple data sources for malware classification |
US11126720B2 (en) * | 2012-09-26 | 2021-09-21 | Bluvector, Inc. | System and method for automated machine-learning, zero-day malware detection |
US9497204B2 (en) * | 2013-08-30 | 2016-11-15 | Ut-Battelle, Llc | In-situ trainable intrusion detection system |
US9288220B2 (en) * | 2013-11-07 | 2016-03-15 | Cyberpoint International Llc | Methods and systems for malware detection |
US9641542B2 (en) * | 2014-07-21 | 2017-05-02 | Cisco Technology, Inc. | Dynamic tuning of attack detector performance |
EP3292471B1 (en) * | 2015-05-04 | 2021-11-17 | Hasan, Syed Kamran | Method and device for managing security in a computer network |
-
2016
- 2016-06-15 US US15/735,244 patent/US11057399B2/en active Active
- 2016-06-15 JP JP2017524622A patent/JP6753398B2/ja active Active
- 2016-06-15 WO PCT/JP2016/002898 patent/WO2016208159A1/ja active Application Filing
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005085157A (ja) * | 2003-09-10 | 2005-03-31 | Toshiba Corp | 不正アクセス検出装置、不正アクセス検出方法、および管理端末 |
WO2009110326A1 (ja) * | 2008-03-07 | 2009-09-11 | 日本電気株式会社 | 障害分析装置、障害分析方法および記録媒体 |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018109883A (ja) * | 2017-01-05 | 2018-07-12 | 株式会社東芝 | ジョブ実行制御装置、ジョブ実行制御方法およびプログラム |
US11507881B2 (en) | 2018-02-22 | 2022-11-22 | Hitachi, Ltd. | Analysis apparatus, analysis method, and analysis program for calculating prediction error and extracting error factor |
WO2020148934A1 (ja) * | 2019-01-16 | 2020-07-23 | 株式会社日立製作所 | 分析装置および分析方法 |
JP2020113216A (ja) * | 2019-01-16 | 2020-07-27 | 株式会社日立製作所 | 分析装置および分析方法 |
JP7033560B2 (ja) | 2019-01-16 | 2022-03-10 | 株式会社日立製作所 | 分析装置および分析方法 |
US20220060487A1 (en) * | 2019-02-05 | 2022-02-24 | Nec Corporation | Priority determination apparatus, priority determination method, and computer readable medium |
US11956256B2 (en) * | 2019-02-05 | 2024-04-09 | Nec Corporation | Priority determination apparatus, priority determination method, and computer readable medium |
WO2020255512A1 (ja) * | 2019-06-21 | 2020-12-24 | 株式会社日立製作所 | 監視システム、および、監視方法 |
JP2021018630A (ja) * | 2019-07-22 | 2021-02-15 | ソフトバンク株式会社 | 警報集約選別装置及び警報集約選別方法 |
JP7034989B2 (ja) | 2019-07-22 | 2022-03-14 | ソフトバンク株式会社 | 警報集約選別装置及び警報集約選別方法 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2016208159A1 (ja) | 2018-04-19 |
JP6753398B2 (ja) | 2020-09-09 |
US11057399B2 (en) | 2021-07-06 |
US20180181883A1 (en) | 2018-06-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2016208159A1 (ja) | 情報処理装置、情報処理システム、情報処理方法、及び、記憶媒体 | |
US9282112B2 (en) | System and method for determining category of trust of applications performing interface overlay | |
JP6690646B2 (ja) | 情報処理装置、情報処理システム、情報処理方法、及び、プログラム | |
US20210034759A1 (en) | Systems and methods for attributing security vulnerabilities to a configuration of a client device | |
US10789118B2 (en) | Information processing device and error detection method | |
CN112567367A (zh) | 用于聚类和加速多个事故调查的基于相似性的方法 | |
US20170185785A1 (en) | System, method and apparatus for detecting vulnerabilities in electronic devices | |
US10489720B2 (en) | System and method for vendor agnostic automatic supplementary intelligence propagation | |
US20200327237A1 (en) | Systems and methods for aggregating, ranking, and minimizing threats to computer systems based on external vulnerability intelligence | |
JP6523582B2 (ja) | 情報処理装置、情報処理方法及び情報処理プログラム | |
US20140195793A1 (en) | Remotely Establishing Device Platform Integrity | |
US10262122B2 (en) | Analysis apparatus, analysis system, analysis method, and analysis program | |
US10931706B2 (en) | System and method for detecting and identifying a cyber-attack on a network | |
WO2018088383A1 (ja) | セキュリティルール評価装置およびセキュリティルール評価システム | |
JP2020004009A (ja) | 異常検知装置、および、異常検知方法 | |
US20200065482A1 (en) | Evaluation method, information processing apparatus, and storage medium | |
US10129277B1 (en) | Methods for detecting malicious network traffic and devices thereof | |
US10812496B2 (en) | Automatic generation of cluster descriptions | |
US20220201016A1 (en) | Detecting malicious threats via autostart execution point analysis | |
WO2020246011A1 (ja) | ルール生成装置、ルール生成方法、及びコンピュータ読み取り可能な記録媒体 | |
JP2017211806A (ja) | 通信の監視方法、セキュリティ管理システム及びプログラム | |
US20170185772A1 (en) | Information processing system, information processing method, and program | |
US11899793B2 (en) | Information processing apparatus, control method, and program | |
CN111258845A (zh) | 事件风暴的检测 | |
US10063348B2 (en) | Retransmission data processing device, retransmission data communication device, retransmission data communication system, retransmission data processing method, retransmission data communication method, and non-transitory computer readable medium for detecting abnormality by comparing retransmission data to transmission data |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16813933 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2017524622 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15735244 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16813933 Country of ref document: EP Kind code of ref document: A1 |