US11893125B2 - Providing a graphical representation of anomalous events - Google Patents
- Publication number
- US11893125B2 (application US17/501,617; publication US202117501617A)
- Authority
- US
- United States
- Prior art keywords
- event
- user interface
- graphical user
- events
- anomalous
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04842—Selection of displayed objects or displayed text elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/62—Protecting access to data via a platform, e.g. using keys or access control rules
- G06F21/6218—Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/50—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
- G06F21/55—Detecting local intrusion or implementing counter-measures
- G06F21/552—Detecting local intrusion or implementing counter-measures involving long-term monitoring or reporting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/50—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
- G06F21/57—Certifying or maintaining trusted computer platforms, e.g. secure boots or power-downs, version controls, system software checks, secure updates or assessing vulnerabilities
- G06F21/577—Assessing vulnerabilities and evaluating computer system security
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2221/00—Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/21—Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/2101—Auditing as a secondary aspect
Definitions
- a primary system maintains an event log that stores a plurality of entries for a plurality of events.
- the event log may be updated when an object (e.g., file or directory) is accessed, modified, deleted, or created.
- the event log may also be updated for other events associated with the primary system, such as when a user logs in, when failed login attempts are associated with a client device, each time a software update is performed, each time a password is changed, etc.
- a user may desire to determine whether there has been any anomalous activity at the primary system. However, the number of events stored in the event log may be too voluminous to determine whether there has been any anomalous activity in a timely manner.
- FIG. 1 is a block diagram illustrating a system for providing a graphical representation of anomalous events in accordance with some embodiments.
- FIG. 2 is a flow diagram illustrating a process for providing evidence of anomalous behavior in accordance with some embodiments.
- FIG. 3 is a flow diagram illustrating a process for analyzing an event log in accordance with some embodiments.
- FIG. 4 is a flow diagram illustrating a process for analyzing an event log in accordance with some embodiments.
- FIG. 5 is a flow diagram illustrating a process of training a model in accordance with some embodiments.
- FIGS. 6A-6N are examples of a graphical user interface in accordance with some embodiments.
- FIG. 7A is an example of a graphical user interface in accordance with some embodiments.
- FIG. 7B is an example of a graphical user interface in accordance with some embodiments.
- One or more event logs are received and stored at an event analysis system.
- the one or more event logs include a plurality of entries. Each entry corresponds to an event.
- An entry may identify an event type and one or more attributes associated with the event. Examples of event type include a file deletion, a file access, a file creation, a file move, a directory deletion, a directory creation, a directory move, a system login grant, a system login denial, a user being added, a user being deleted, a file being downloaded, a user password change, change of state, change of status, etc.
- event attributes include a timestamp, a sequence number, a user (actor) to which the event is associated, an object with which the event is associated, an internet protocol address, a location from which the event occurred, etc.
- objects include files, databases, virtual machines, applications, containers, volumes, etc.
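The entry structure described above (an event type plus attributes such as a timestamp, sequence number, actor, object, IP address, and location) can be sketched as a small record type. This is a minimal illustration; the field names and sample values are assumptions, not taken from the patent.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class EventLogEntry:
    """One entry in an event log: an event type plus its attributes."""
    event_type: str        # e.g., "file_deletion", "system_login_denial"
    timestamp: datetime
    sequence_number: int
    actor: str             # user (actor) to which the event is associated
    obj: str               # object (file, database, VM, ...) the event concerns
    ip_address: str
    location: str          # location from which the event occurred

# Hypothetical entry for a file-deletion event.
entry = EventLogEntry(
    event_type="file_deletion",
    timestamp=datetime(2021, 10, 14, 9, 30, tzinfo=timezone.utc),
    sequence_number=1042,
    actor="alice",
    obj="/finance/q3-report.xlsx",
    ip_address="203.0.113.7",
    location="Berlin",
)
```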
- the one or more event logs are analyzed by providing each entry as input to a plurality of models that are configured to detect different types of anomalous behavior.
- a model may be configured to determine whether an event or a group of events are indicative of an insider attack, a ransomware attack, a brute force attack, wide access (e.g., log in attempts from different locations), a sensitive data leak, a geo fencing breach, or a combination thereof.
- the output of one or more models is input to a model (e.g., a layered model).
- each of the models is configured to output a corresponding confidence level that indicates whether the one or more events corresponding to the one or more event log entries are anomalous.
- An event by itself or a combination of events may be indicative of anomalous behavior.
- a model may determine that an event is anomalous in the event a confidence level associated with the event is greater than a confidence level threshold.
- the confidence level is based on historical events associated with a particular user.
- the confidence level is based on historical events associated with a system that provided the event log.
- the confidence level is based on historical programmatically generated events associated with an application.
- the confidence level is based on a combination of events (e.g., the confidence level(s) associated with one or more other events may influence the confidence level of an event; a normal event may be determined to be an anomalous event if the event is a threshold event within a time frame).
- the threshold event is determined to be an anomalous event; the other events may or may not be determined to be anomalous events.
- Each event is associated with at least one risk entity, such as location, actor, model that determined the event to be anomalous, an object that the actor accessed or attempted to access, etc.
- the relationships between the risk entities associated with an event are determined.
- the relationships between different risk entities associated with different events are determined. For example, the event analysis system determines whether any of the events share risk entities.
- the one or more event logs are provided to a third party system that is configured to analyze the one or more event logs for one or more anomalous events.
- the third party system may provide the analysis to the event analysis system.
- a graphical representation of risk entities associated with one or more detected anomalous events may be generated based on the received analysis.
- a graphical representation of risk entities associated with one or more detected anomalous events is provided to a client device. This enables a user associated with the client device to determine potential risk to a system without having to analyze each specific event log.
- the graphical representation is a visual representation of automatically detected relationships between the risk entities.
- the graphical representation is comprised of a first plurality of graphical user interface items corresponding to the risk entities associated with at least one of the one or more detected anomalous events.
- a first graphical user interface item corresponding to a location associated with a first event may be linked to a second graphical user interface item corresponding to an actor associated with the first event, which is linked to a third graphical user interface item corresponding to a model that determined the first event to be anomalous, which is linked to a fourth graphical user interface item corresponding to an object that the actor associated with the first event accessed or attempted to access.
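The chain of linked graphical user interface items described above can be modeled as an ordered list of risk entities, with an edge between each consecutive pair giving the links drawn in the interface. The entity values below are hypothetical placeholders.

```python
# Risk-entity chain for one detected anomalous event:
# location -> actor -> detecting model -> accessed object.
event_chain = [
    ("location", "Berlin"),
    ("actor", "alice"),
    ("model", "ransomware_detector"),
    ("object", "/finance/q3-report.xlsx"),
]

# Edges between consecutive entities correspond to the links in the GUI.
links = list(zip(event_chain, event_chain[1:]))
for (kind_a, a), (kind_b, b) in links:
    print(f"{kind_a}:{a} -> {kind_b}:{b}")
```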
- the relationship is depicted for anomalous events that occurred during a particular time frame.
- the relationship is depicted for all events that occurred during the particular time frame.
- the particular time frame may be specified via the client device.
- the event analysis system receives a selection of a graphical user interface item corresponding to one of the risk entities.
- the one or more event logs are re-analyzed based on the selected graphical user interface item to identify, if any, one or more other events related to the selected graphical user interface item. This analysis may enable a user associated with a client system to determine whether an event associated with the selected graphical user interface item is an isolated event or part of a group of events that are indicative of anomalous behavior.
- the one or more event logs may include additional event log(s) that were received since the previous analysis.
- the event analysis system re-analyzes the one or more event logs for one or more other events that occurred in the same time frame as the first plurality of graphical user interface items.
- the event analysis system analyzes the one or more event logs for one or more other events that occurred outside the time frame associated with the first plurality of graphical user interface items. In some embodiments, the event analysis system re-analyzes the one or more event logs for one or more other events that occurred in the same time frame as the first plurality of graphical user interface items and analyzes the one or more event logs for one or more other events that occurred outside the time frame associated with the first plurality of graphical user interface items.
- the event log entries associated with the one or more identified events are provided as input to each of the plurality of models. Based on the one or more event log entries, each of the models is configured to output a confidence level that indicates whether an identified event is anomalous. An identified event is determined to be anomalous in the event its corresponding confidence level is greater than a confidence level threshold. In some embodiments, a group of events is determined to be anomalous (e.g., an anomalous incident) in the event a model outputs a confidence level that is greater than a confidence level threshold for the group of events. For example, the model may count the number of times that a particular type of event occurs within a particular period of time. The model may determine that a group of events of a particular type is anomalous after the number of occurrences of the particular type of event is greater than an event threshold. The events included in the group may be from the same event log and/or one or more other event logs.
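The group-of-events determination described above, counting how many times a particular type of event occurs within a particular period of time, can be sketched with a sliding window. The window size and event threshold here are illustrative assumptions.

```python
from collections import deque

def count_based_anomaly(timestamps, window_seconds, event_threshold):
    """Flag a group of events of one type as anomalous when more than
    `event_threshold` of them fall inside any `window_seconds` window.
    `timestamps` are epoch seconds for events of a single type."""
    window = deque()
    for t in sorted(timestamps):
        window.append(t)
        # Drop events that fell out of the current time window.
        while window and t - window[0] > window_seconds:
            window.popleft()
        if len(window) > event_threshold:
            return True
    return False

# Six deletions within 10 seconds against a threshold of 5: anomalous group.
assert count_based_anomaly([0, 2, 3, 5, 7, 9], 10, 5)
# Three deletions spread over an hour: not anomalous.
assert not count_based_anomaly([0, 1200, 3600], 10, 5)
```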
- Each anomalous event is associated with at least one risk entity, such as a location, an actor, a model that determined the event to be anomalous, an object that the actor accessed or attempted to access, etc.
- the relationships between the risk entities associated with the anomalous event are determined.
- the relationships between different entities associated with different anomalous events are determined. For example, the event analysis system determines whether any of the anomalous events share risk entities.
- the graphical representation of risk entities associated with one or more detected anomalous events is updated.
- the updated graphical representation is comprised of a second plurality of graphical user interface items. Similar to the first plurality of graphical user items, a graphical user interface item may correspond to a risk entity, such as a location, an actor, a model that determined an event to be anomalous, or an object that the actor accessed or attempted to access.
- the graphical representation may include indications of a measure of anomaly for an event that are based on a confidence level outputted by one of the models. The measure of anomaly is for the particular period of time associated with the second plurality of graphical user interface items.
- the measure of anomaly for an event is reflected in risk entities associated with the event.
- the reflected measure of anomaly may be color-coded. This indicates the seriousness of a detected event.
- the graphical representation depicts the relationship between the risk entity corresponding to the selected graphical user interface item and the plurality of risk entities associated with one or more anomalous events.
- FIG. 1 is a block diagram illustrating a system for providing a graphical representation of anomalous events in accordance with some embodiments.
- system 100 includes primary systems 102a . . . 102n, event analysis system 112, and one or more client devices 122.
- Primary systems 102a . . . 102n may be a server, a virtual machine running on a computing device, a database running on a computing device, or any computing device that is capable of generating an event log.
- Although FIG. 1 depicts two primary systems, system 100 may include any number (1 to n) of primary systems.
- primary systems 102a . . . 102n are associated with a single tenant.
- a tenant may correspond to a user, an enterprise, a government, a company, an organization, etc.
- primary systems 102a . . . 102n are associated with a plurality of different tenants.
- event analysis system 112 is coupled to a plurality of different tenants, each tenant being associated with one or more corresponding primary systems.
- Primary systems 102a . . . 102n include one or more corresponding event logs 104a . . . 104n that are each comprised of a plurality of entries.
- An event log may be generated by an application (e.g., collaboration application, productivity application, database application, etc.) hosted by the primary system.
- an event log is generated by an operating system, firmware, a firewall, etc.
- Each entry of an event log identifies an event type and one or more attributes associated with the event.
- Examples of event type include a file deletion, a file access, a file creation, a file move, a directory deletion, a directory creation, a directory move, a system login grant, a system login denial, a user being added, a user being deleted, a file being downloaded, a user password change, etc.
- Examples of event attributes include a timestamp, a sequence number, a user to which the event is associated, an object with which the event is associated, an internet protocol address, a location from which the event occurred, etc.
- An object may be included in a data pool.
- a data pool is a description of one or more objects that are to be included when one or more data management services (e.g., backup, restore, migration, replication, tiering, disaster recovery, etc.) are performed.
- Event analysis system 112 is coupled to primary systems 102a . . . 102n.
- Event analysis system 112 may be a server, a computing cluster comprised of a plurality of computing nodes, a virtual machine running on a computing device (e.g., a computer), a containerized application running on one or more computing devices, a cloud computing device, etc.
- Event analysis system 112 includes event log reader 111 .
- event log reader 111 sends a corresponding request to primary systems 102a . . . 102n for an event log.
- the primary systems 102a . . . 102n provide the requested event log.
- primary systems 102a . . . 102n send (continually or periodically) a corresponding set of one or more events included in a corresponding event log 104a . . . 104n to event log reader 111.
- event log reader 111 remotely accesses an event log stored on primary systems 102a . . . 102n.
- Event log reader 111 stores the obtained event logs in event log store 113 .
- Event log store 113 may be stored in a memory or a storage device associated with event analysis system 112 .
- Event log store 113 is coupled to anomalous event detector 115 .
- Anomalous event detector 115 is comprised of a plurality of models. Anomalous event detector 115 obtains the one or more entries corresponding to one or more events included in the event logs as input to the plurality of models.
- the plurality of models may be configured to perform analysis according to a schedule (e.g., daily).
- a first sub-set of the models is configured to perform analysis according to a first schedule (e.g., daily) and a second sub-set of the models is configured to perform analysis according to a second schedule (e.g., weekly).
- Each of the models is configured to determine whether an event is anomalous. The anomalous event by itself or a group of events may be indicative of anomalous behavior.
- a model may be configured to analyze event log entries that correspond to events that occurred since a last time the model analyzed event log entries.
- a model is streamed log entries and analyzes the log entries as they are received.
- event analysis system 112 provides to client device 122 a notification of an anomalous event being detected.
- event analysis system 112 may receive from client device 122 an indication of a time frame in which one or more models analyze event log entries for anomalous events.
- the plurality of models are configured to detect different types of anomalous behavior.
- a model may be configured to determine whether an event or a group of events are indicative of an insider attack, a ransomware attack, a brute force attack, wide access (e.g., log in attempts from different locations), a sensitive data leak, a geo fencing breach, or a combination thereof.
- a model is configured to detect specific types of anomalous events. For example, a first model configured to detect insider attack events may determine that a first event is indicative of an insider attack and a second model configured to detect ransomware events may determine that a second event is indicative of a ransomware attack. The second model may determine that the first event is not indicative of a ransomware attack.
- a model may be a rules-based model, a machine learning model, a deterministic-based model, a heuristic-based model, etc. Based on the one or more event log entries obtained from event log store 113 , each of the models is configured to output a corresponding confidence level that indicates whether one or more events corresponding to the one or more event log entries are anomalous.
- a model may determine that an event is an anomalous event in the event a confidence level outputted by the model is greater than a confidence level threshold.
- the confidence level is based on historical events associated with a particular user.
- the confidence level is based on historical events associated with a system that provided the event log.
- the confidence level is based on historical programmatic events associated with an application.
- the confidence level is based on a combination of events.
- none of the plurality of models outputs a confidence level indicating that any of the events are anomalous. In some embodiments, one of the plurality of models outputs a corresponding confidence level that indicates one or more events are anomalous. In some embodiments, at least two of the plurality of models output a corresponding confidence level that indicates one or more events are anomalous.
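The per-model confidence outputs described above can be sketched as a loop over a set of detectors, collecting every model whose confidence exceeds the threshold. The model names, stand-in lambda detectors, and the 0.8 threshold are assumptions for illustration; real models might be rules-based, machine learning, deterministic, or heuristic.

```python
def detect_anomalies(entry, models, confidence_threshold=0.8):
    """Run one event log entry through several detectors, each returning a
    confidence level in [0, 1]; return the detectors that flag it."""
    flagged = []
    for name, model in models.items():
        confidence = model(entry)
        if confidence > confidence_threshold:
            flagged.append((name, confidence))
    return flagged

# Stand-in detectors for different types of anomalous behavior.
models = {
    "insider_attack": lambda e: 0.2,
    "ransomware": lambda e: 0.95 if e["event_type"] == "file_deletion" else 0.1,
    "brute_force": lambda e: 0.3,
}

entry = {"event_type": "file_deletion", "actor": "alice"}
flagged = detect_anomalies(entry, models)  # only the ransomware model fires
```

Zero, one, or several models may flag the same entry, matching the three embodiments above.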
- event analysis system 112 is configured to generate a graphical representation of risk entities associated with at least one of the one or more anomalous events.
- risk entities include a location, an actor, a model associated with the event, or an object that the actor accessed or attempted to access.
- the graphical representation is comprised of a first plurality of graphical user interface items corresponding to the risk entities associated with at least one of the one or more anomalous events.
- the graphical representation is a visual representation of automatically detected relationships between the risk entities.
- a first graphical user interface item corresponding to a location may be linked to a second graphical user interface item corresponding to an actor, which is linked to a third graphical user interface item corresponding to a model associated with the anomalous event, which is linked to an object that the actor accessed or attempted to access.
- the relationship is depicted for an event or a group of events that occurred during the particular time frame and having a confidence level greater than a confidence level threshold. In some embodiments, the relationship is depicted for all events or a group of events that occurred during the particular time frame.
- the graphical representation may include indications of a measure of anomaly for an event, which may be based on a confidence level outputted by one of the models.
- the measure of anomaly for an event is reflected in risk entities associated with the event.
- the reflected measure of anomaly may be color-coded. For example, a graphical user interface item corresponding to an entity associated with a high-risk event may be in red, a graphical user interface item corresponding to an entity associated with a medium-risk event may be in orange, and a graphical user interface item corresponding to an entity associated with a low-risk event may be in green.
- the measure of anomaly for an event may be based on a confidence level outputted by one of the models.
- the measure of anomaly for an event is reflected in risk entities associated with the event.
- a risk entity may be associated with a high risk event in the event a confidence level of an event associated with the risk entity is greater than a first risk threshold.
- a risk entity may be associated with a medium risk event in the event a confidence level of an event associated with the risk entity is less than or equal to the first risk threshold and greater than a second risk threshold.
- a risk entity may be associated with a low risk event in the event a confidence level of an event associated with the risk entity is less than or equal to the second risk threshold and greater than a risk lower limit.
- An event is determined to be anomalous in the event the confidence level is greater than the risk lower limit.
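The high/medium/low mapping described above can be sketched as a threshold cascade that also carries the red/orange/green color coding. The structure follows the description, but the numeric threshold values are illustrative assumptions.

```python
def risk_level(confidence, first_threshold=0.9, second_threshold=0.7,
               lower_limit=0.5):
    """Map a model's confidence level to a color-coded risk level:
    high   if confidence > first_threshold,
    medium if second_threshold < confidence <= first_threshold,
    low    if lower_limit < confidence <= second_threshold,
    None   (not anomalous) otherwise."""
    if confidence <= lower_limit:
        return None
    if confidence > first_threshold:
        return ("high", "red")
    if confidence > second_threshold:
        return ("medium", "orange")
    return ("low", "green")

assert risk_level(0.95) == ("high", "red")
assert risk_level(0.80) == ("medium", "orange")
assert risk_level(0.60) == ("low", "green")
assert risk_level(0.40) is None
```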
- Event analysis system 112 is configured to provide the graphical representation to one or more client devices 122 via graphical user interface 117 .
- a client device may be a computer, a desktop, a laptop, a tablet, a server, a smart device, etc.
- Event analysis system 112 is configured to receive one or more inputs from the one or more client devices 122 via graphical user interface 117 .
- an input causes the graphical representation to be shared with another user. Once shared, the other user may collaborate on the graphical representation via graphical user interface 117 .
- an input enables a chat window to be associated with graphical user interface 117 .
- an input enables a comment to be associated with a graphical user interface item.
- an input enables a graphical user interface item to be ignored.
- an input causes a report of the detected anomalous behavior to be generated.
- an input selects one or more of the graphical user interface items.
- anomalous event detector 115 is configured to determine one or more events associated with the selected graphical user interface item(s) that may be indicative of anomalous behavior.
- Anomalous event detector 115 is configured to re-analyze the one or more event logs stored in event log store 113 to identify one or more events associated with the selected graphical user interface item(s).
- in the event a graphical user interface item corresponding to a location is selected, anomalous event detector 115 re-analyzes one or more event logs stored in event log store 113 to identify, if any, one or more other events associated with the location.
- a graphical user interface item corresponding to an actor may be selected.
- anomalous event detector 115 re-analyzes one or more event logs stored in event log store 113 to identify, if any, one or more other events associated with the actor.
- a graphical user interface item corresponding to a model may be selected.
- anomalous event detector 115 re-analyzes one or more event logs stored in event log store 113 to identify, if any, one or more other events associated with the model.
- a graphical user interface item corresponding to an object that the actor accessed or attempted to access may be selected.
- anomalous event detector 115 re-analyzes one or more event logs stored in event log store 113 to identify, if any, one or more other events associated with the object corresponding to the selected graphical user interface item.
- anomalous event detector 115 re-analyzes one or more event logs stored in event log store 113 for one or more events that occurred in the same time frame as the first plurality of graphical user interface items. In some embodiments, anomalous event detector 115 analyzes event log store 113 for one or more events that occurred outside the time frame associated with the first plurality of graphical user interface items. In some embodiments, anomalous event detector 115 re-analyzes event log store 113 for one or more events that occurred in the same time frame as the first plurality of graphical user interface items and for one or more events that occurred outside the time frame associated with the first plurality of graphical user interface items.
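The re-analysis described above, selecting events tied to the chosen risk entity either within the displayed time frame or also outside it, can be sketched as a simple filter. The event fields and the entity-matching rule are assumptions for illustration.

```python
def events_for_reanalysis(events, selected_entity, frame_start, frame_end,
                          include_outside=False):
    """Select events tied to the selected risk entity, either within the
    displayed time frame or (optionally) outside it as well."""
    def matches(e):
        return selected_entity in (e["actor"], e["location"], e["object"])
    def in_frame(e):
        return frame_start <= e["timestamp"] <= frame_end
    return [e for e in events if matches(e) and (include_outside or in_frame(e))]

# Hypothetical events (timestamps in epoch seconds).
events = [
    {"actor": "alice", "location": "Berlin", "object": "f1", "timestamp": 100},
    {"actor": "alice", "location": "Oslo",   "object": "f2", "timestamp": 900},
    {"actor": "bob",   "location": "Berlin", "object": "f3", "timestamp": 150},
]

in_frame_only = events_for_reanalysis(events, "alice", 0, 500)
with_outside = events_for_reanalysis(events, "alice", 0, 500,
                                     include_outside=True)
```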
- An individual identified event or a combination of identified events are provided as input to the plurality of models associated with anomalous event detector 115 .
- Each of the models is configured to output a confidence level that indicates whether an identified event or a combination of identified events are indicative of anomalous behavior.
- event analysis system 112 is configured to update the graphical user interface 117 .
- graphical user interface 117 is updated to provide a visual representation of a relationship between the selected graphical user interface item and risk entities associated with all of the identified events.
- graphical user interface 117 is updated to provide a visual representation of a relationship between the selected graphical user interface item and risk entities associated with identified events having a confidence level that is greater than the confidence threshold.
- the graphical representation provides a visual representation of detected relationships of risk entities that were not discovered in the initial analysis.
- the graphical representation is comprised of a second plurality of graphical user interface items.
- a graphical user interface item may correspond to a risk entity, such as a location, an actor, a model that determined the event to be anomalous, or an object that the actor accessed or attempted to access.
- the graphical representation may include indications of a measure of anomaly associated with a plurality of events. The measures of anomaly associated with the events are reflected in the risk entities associated with the events. The reflected measure of anomaly may be color-coded.
- the graphical representation depicts the relationship between the entity corresponding to the selected graphical user interface item and the plurality of entities associated with one or more identified events or an identified group of events that occurred during a specified time frame.
- the graphical representation may be updated to depict the relationship between the location associated with the selected graphical user interface item, an actor associated with an identified event, a model associated with the identified event, and an object that the actor associated with the identified event accessed or attempted to access.
- the graphical representation may be updated to depict the relationship between the actor associated with the selected graphical user interface item, a location associated with an identified event, a model associated with the identified event, and an object that the actor associated with the identified event accessed or attempted to access.
- the graphical representation may be updated to depict the relationship between the model associated with the selected graphical user interface item, a location associated with an identified event, an actor associated with the identified event, and an object that the actor associated with the identified event accessed or attempted to access.
- the graphical representation may be updated to depict the relationship between the object associated with the selected graphical user interface item, a location associated with an identified event, an actor associated with the identified event, and a model associated with the identified event.
- FIG. 2 is a flow diagram illustrating a process for providing evidence of anomalous behavior in accordance with some embodiments.
- process 200 may be implemented by an event analysis system, such as event analysis system 112 .
- the one or more event logs include a plurality of entries. Each entry corresponds to an event.
- An entry may identify an event type and one or more attributes associated with the event. Examples of event type include a file deletion, a file access, a file creation, a file move, a directory deletion, a directory creation, a directory move, a system login grant, a system login denial, a user being added, a user being deleted, a file being downloaded, a user password change, etc.
- event attributes include a timestamp, a sequence number, a user to which the event is associated, an object with which the event is associated, an internet protocol address, a location from which the event occurred, etc.
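The entry structure described above can be sketched as a simple data class. This is a hypothetical illustration; the field and type names are assumptions, not a schema taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class EventLogEntry:
    """One event log entry: an event type plus its attributes."""
    event_type: str       # e.g., "file_access", "login_denial"
    timestamp: float      # when the event occurred (epoch seconds)
    sequence_number: int
    user: str             # user to which the event is associated
    obj: str              # object with which the event is associated
    ip_address: str
    location: str         # location from which the event occurred

entry = EventLogEntry("file_access", 1_700_000_000.0, 42,
                      "sen-yun", "payroll.xlsx",
                      "203.0.113.7", "Pyongyang")
```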
- an event analysis system remotely accesses an event log stored on a source system.
- the one or more event logs are stored.
- An event analysis system includes a plurality of models that are configured to detect different types of anomalous behavior.
- a model may be configured to determine whether an event or a group of events are indicative of an insider attack, a ransomware attack, a brute force attack, wide access (e.g., log in attempts from different locations), a sensitive data leak, a geo fencing breach, or a combination thereof.
- the entries of the one or more event logs are provided as input to each of the plurality of models.
- a model may be a rules-based model, a machine learning model, a deterministic-based model, a heuristic-based model, etc.
- each of the models is configured to output a corresponding confidence level that indicates whether the one or more events corresponding to the one or more log entries are anomalous.
- a model may determine that an event is anomalous in the event a confidence level associated with the event is greater than a confidence level threshold.
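A minimal sketch of the confidence-threshold test described above. The rule weights and the 0.8 threshold are illustrative assumptions, not values from the patent.

```python
CONFIDENCE_THRESHOLD = 0.8  # assumed value; in practice this is configurable

def toy_rules_based_model(event: dict) -> float:
    """Toy rules-based model: outputs a confidence level in [0, 1]
    indicating whether the event is indicative of anomalous behavior."""
    score = 0.0
    if event.get("event_type") == "login_denial":
        score += 0.5  # repeated login denials often precede brute force
    if event.get("location") not in {"San Francisco", "New York"}:
        score += 0.4  # event originated from an unexpected location
    return min(score, 1.0)

def is_anomalous(event: dict, threshold: float = CONFIDENCE_THRESHOLD) -> bool:
    # The model determines an event is anomalous in the event its
    # confidence level is greater than the confidence level threshold.
    return toy_rules_based_model(event) > threshold
```

For an event `{"event_type": "login_denial", "location": "Pyongyang"}` the toy model outputs 0.9, which exceeds the assumed threshold.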
- the confidence level is based on historical events associated with a particular user.
- the confidence level is based on historical events associated with a system that provided the event log.
- Each event is associated with at least one risk entity, such as location, actor, model that determined the event to be indicative of anomalous behavior, an object that the actor accessed or attempted to access, etc.
- the relationship between the risk entities associated with an event may be determined in part from an event log entry associated with the event.
- the relationship between different risk entities associated with different events are determined. For example, the event analysis system determines whether any of the events share risk entities.
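The shared-risk-entity check can be sketched as follows. The four field names mirror the risk-entity types named above; everything else is illustrative.

```python
RISK_ENTITY_FIELDS = ("location", "actor", "model", "object")

def shared_risk_entities(event_a: dict, event_b: dict) -> dict:
    """Return the risk entities that two events have in common."""
    return {field: event_a[field]
            for field in RISK_ENTITY_FIELDS
            if field in event_a and event_a.get(field) == event_b.get(field)}

second = {"location": "Pyongyang", "actor": "Sen-yun",
          "model": "brute-force", "object": "object-616"}
third = {"location": "Pyongyang", "actor": "Sen-yun",
         "model": "brute-force", "object": "object-622"}
# These two events share a location, actor, and model, but not
# the object that was accessed or attempted to be accessed.
shared = shared_risk_entities(second, third)
```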
- a graphical representation of risk entities associated with one or more detected anomalous events is provided.
- the graphical representation may be provided for one or more events having a corresponding confidence level greater than a confidence threshold.
- the graphical representation is comprised of a first plurality of graphical user interface items corresponding to the risk entities associated with at least one of the one or more detected anomalous events.
- the graphical representation is a visual representation of automatically detected relationships between the risk entities. For example, a first graphical user interface item corresponding to a location may be linked to a second graphical user interface item corresponding to an actor, which is linked to a third graphical user interface item corresponding to a model that determined the event to be anomalous, which is linked to an object that the actor accessed or attempted to access.
- the relationship is depicted for anomalous events that occurred during a particular time frame. In some embodiments, the relationship is depicted for all events that occurred during the particular time frame.
- the particular time frame may be specified by the client device.
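The linked chain of graphical user interface items described above (location to actor to model to object) can be sketched as a list of edges. This is a hypothetical helper, not the patent's implementation.

```python
def risk_entity_chain(event: dict) -> list:
    """Link an anomalous event's risk entities in display order:
    location -> actor -> model -> object."""
    nodes = [event["location"], event["actor"],
             event["model"], event["object"]]
    return list(zip(nodes, nodes[1:]))

first_event = {"location": "San Francisco, CA", "actor": "unknown",
               "model": "insider-attack", "object": "object-618"}
# Three edges: location-actor, actor-model, model-object.
edges = risk_entity_chain(first_event)
```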
- a selection of a graphical user interface item corresponding to one of the risk entities is received.
- the one or more event logs are re-analyzed based on the selected graphical user interface item.
- the one or more event logs may include one or more other events associated with the selected user interface item.
- the one or more event logs may include additional event log(s) that were received since the previous analysis.
- the event analysis system re-analyzes the one or more event logs to identify the one or more other events, if any, associated with the selected user interface item.
- the event analysis system re-analyzes the one or more event logs for one or more events occurring in the same time frame as the first plurality of graphical user interface items. In some embodiments, the event analysis system re-analyzes the one or more event logs for one or more events that occurred outside the time frame associated with the first plurality of graphical user interface items. In some embodiments, the event analysis system re-analyzes the one or more event logs for one or more events that occurred in the same time frame as the first plurality of graphical user interface items and analyzes the one or more event logs for one or more events that occurred outside the time frame associated with the first plurality of graphical user interface items.
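The three re-analysis scopes described above (inside the time frame, outside it, or both) can be sketched as a single filter. The field names and scope flags are illustrative assumptions.

```python
def reanalyze(events, selected_field, selected_value,
              time_frame, inside=True, outside=False):
    """Identify events associated with the selected risk entity,
    restricted to inside and/or outside the (start, end) time frame."""
    start, end = time_frame
    matches = []
    for event in events:
        if event.get(selected_field) != selected_value:
            continue
        in_frame = start <= event["timestamp"] <= end
        if (inside and in_frame) or (outside and not in_frame):
            matches.append(event)
    return matches

events = [
    {"actor": "Sen-yun", "timestamp": 100},
    {"actor": "Sen-yun", "timestamp": 900},
    {"actor": "Ji-yoo", "timestamp": 150},
]
# Re-analyze for the selected actor inside the frame only.
inside_only = reanalyze(events, "actor", "Sen-yun", (0, 500))
# Re-analyze both inside and outside the frame.
both = reanalyze(events, "actor", "Sen-yun", (0, 500),
                 inside=True, outside=True)
```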
- a graphical user interface item corresponding to a location may be selected.
- the event analysis system re-analyzes one or more event logs to identify, if any, one or more other events associated with the location. At least one of the one or more identified events associated with the location may have a corresponding confidence level outputted by one or more of the models that is greater than the confidence level threshold.
- a graphical user interface item corresponding to an actor may be selected.
- the event analysis system re-analyzes one or more event logs to identify, if any, one or more other events associated with the actor. At least one of the one or more identified events associated with the actor may have a corresponding confidence level outputted by one or more of the models that is greater than the confidence level threshold.
- a graphical user interface item corresponding to a model may be selected.
- the event analysis system re-analyzes one or more event logs stored to identify, if any, one or more other events associated with the model. At least one of the one or more identified events associated with the model may have a corresponding confidence level outputted by the model corresponding to the selected graphical user interface item that is greater than the confidence level threshold.
- a graphical user interface item corresponding to an object that the actor accessed or attempted to access may be selected.
- the event analysis system re-analyzes one or more event logs to identify, if any, one or more other events associated with the object corresponding to the selected graphical user interface item.
- At least one of the one or more identified events associated with the object corresponding to the selected graphical user interface item may have a corresponding confidence level outputted by one or more of the models that is greater than the confidence level threshold.
- the graphical representation of risk entities associated with one or more detected anomalous events is updated.
- the graphical representation is comprised of a second plurality of graphical user interface items.
- a graphical user interface item may correspond to a risk entity, such as a location, an actor, a model that detected an anomalous event associated with the event or group of events, or an object that the actor accessed or attempted to access.
- the graphical representation may include indications of a measure of anomaly associated with the events. The measure of anomaly is reflected in risk entities associated with the event, which may be color-coded.
- the graphical representation depicts the relationship between the entity corresponding to the selected graphical user interface item and the plurality of entities associated with one or more identified events. The relationship may be depicted for all of the one or more identified events. In some embodiments, the relationship is depicted for one or more identified events having a corresponding confidence level greater than a confidence threshold level.
- the graphical representation may be updated to depict the relationship between the location associated with the selected graphical user interface item, an actor associated with an identified event, a model associated with the identified event, and an object that the actor associated with the identified event accessed or attempted to access.
- the graphical representation may be updated to depict the relationship between the actor associated with the selected graphical user interface item, a location associated with an identified event, a model associated with the identified event, and an object that the actor associated with the identified event accessed or attempted to access.
- the graphical representation may be updated to depict the relationship between the model associated with the selected graphical user interface item, a location associated with an identified event, an actor associated with the identified event, and an object that the actor associated with the identified event accessed or attempted to access.
- the graphical representation may be updated to depict the relationship between the object associated with the selected graphical user interface item, a location associated with an identified event, the actor associated with the identified event, and a model associated with the identified event.
- FIG. 3 is a flow diagram illustrating a process for analyzing an event log in accordance with some embodiments.
- process 300 may be implemented by an event analysis system, such as event analysis system 112 .
- process 300 is implemented to perform some or all of step 206 of process 200 .
- events included in an event log for a particular time frame are provided as input to a plurality of models.
- a first event may be provided as input to a first model, provided as input to a second model, . . . , and provided as input to an nth model.
- a second event may be provided as input to the first model, provided as input to the second model, . . . , and provided as input to an nth model.
- An nth event may be provided as input to the first model, provided as input to the second model, . . . , and provided as input to the nth model.
- the events are provided as input to a subset of the plurality of models.
- At 304 it is determined whether any of the events are indicative of anomalous behavior based on a corresponding confidence level outputted by each of the plurality of models.
- An event may be determined to be anomalous in the event a confidence level outputted by one of the models is greater than a confidence level threshold.
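Steps 302-304 can be sketched as a fan-out of every event to every model. The toy models and the 0.8 threshold are assumptions for illustration.

```python
def detect_anomalies(events, models, threshold=0.8):
    """Provide each event as input to each model; an event is
    anomalous when any model's confidence level exceeds the threshold."""
    anomalous = []
    for event in events:
        confidences = [model(event) for model in models]
        top = max(confidences)
        if top > threshold:
            anomalous.append((event, top))
    return anomalous

# Two toy models standing in for, e.g., a brute-force detector
# and a sensitive-data-leak detector.
models = [
    lambda e: 0.9 if e.get("type") == "login_denial" else 0.1,
    lambda e: 0.85 if e.get("type") == "file_download" else 0.2,
]
events = [{"type": "login_denial"}, {"type": "file_access"}]
flagged = detect_anomalies(events, models)
```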
- relationships between risk entities are determined for each of the one or more events determined to have a confidence level greater than a confidence threshold.
- risk entities include a location, an actor, a model that detected an anomalous event associated with the event or group of events, or an object that the actor accessed or attempted to access.
- the relationship may link a location associated with an event to an actor associated with the event, a model that determined the event to be anomalous, and an object associated with the event that was accessed or attempted to be accessed.
- relationships between different risk entities associated with different determined events are determined.
- the determined relationships may indicate how each of the determined events relate to each other.
- the event analysis system may determine that a first event and a second event share the same location and actor, but the first event was detected by a first model and a first object was accessed or attempted to be accessed, and the second event was detected by a second model and a second object was accessed or attempted to be accessed.
- the graphical representation of the anomalous events may be depicted as a web of interconnected risk entities.
- In the event a plurality of risk entities of a first type are associated with the same risk entity of a different type, the measure of anomaly for the risk entity of the different type corresponds to the highest confidence level among the plurality of risk entities of the first type.
- a plurality of actors may be associated with events provided as input to a model. Each of the events is associated with a corresponding confidence level. The measure of anomaly for the model corresponds to the highest confidence level of the plurality of confidence levels.
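The roll-up described above, where a risk entity shared by several events takes the highest confidence level among them, can be sketched as:

```python
def measure_of_anomaly(connected_confidences):
    """A risk entity connected to a plurality of events takes, as its
    measure of anomaly, the highest of their confidence levels."""
    return max(connected_confidences)

# Three actors' events all fed the same model; the model's measure
# of anomaly is the highest confidence level among them.
confidences_for_model = {"Sen-yun": 0.91, "Ji-yoo": 0.42, "Sen-yoo": 0.87}
model_measure = measure_of_anomaly(confidences_for_model.values())
```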
- FIG. 4 is a flow diagram illustrating a process for analyzing an event log in accordance with some embodiments.
- process 400 may be implemented by an event analysis system, such as event analysis system 112 .
- process 400 is implemented to perform some or all of step 212 of process 200 .
- one or more event logs are re-analyzed to identify one or more events associated with a selected graphical user interface item.
- the selected graphical user interface item may correspond to a location.
- the one or more event logs may be re-analyzed to identify one or more other events that are associated with the location.
- the selected graphical user interface item may correspond to an actor.
- the one or more event logs may be re-analyzed to identify one or more other events that are associated with the actor.
- the selected graphical user interface item may correspond to a model that detected the anomalous behavior.
- the one or more logs may be re-analyzed to identify one or more other events determined by the model to have a confidence level greater than a confidence level threshold.
- the selected graphical user interface item may correspond to an object that an actor accessed or attempted to access.
- the one or more logs may be re-analyzed to identify one or more other events associated with the object.
- the one or more identified events are provided as input to each of a plurality of models.
- the one or more identified events having a confidence level greater than a confidence threshold are determined.
- Each of the plurality of models is configured to output a corresponding confidence level for each of the one or more identified events.
- relationships between risk entities are determined for each of the one or more determined events.
- risk entities include a location, an actor, a model that determined the event to be anomalous, or an object that the actor accessed or attempted to access.
- In the event the selected graphical user interface item corresponds to a location, a relationship is determined between the location and an actor associated with a first determined event (who performed the event), a model associated with the first determined event (which model detected the anomalous behavior), and an object associated with the first determined event (what object the actor accessed or attempted to access).
- In the event the selected graphical user interface item corresponds to an actor, a relationship is determined between the actor and a location associated with a second determined event (where the actor performed the event), a model associated with the second determined event (which model detected the anomalous behavior), and an object associated with the second determined event.
- In the event the selected graphical user interface item corresponds to a model that determined an event to be anomalous, a relationship is determined between the model and a location associated with a third determined event, an actor associated with the third determined event, and an object associated with the third determined event.
- In the event the selected graphical user interface item corresponds to an object that was accessed or attempted to be accessed, a relationship is determined between the object and a location associated with a fourth determined event, an actor associated with the fourth determined event, and a model associated with the fourth determined event.
- relationships between different risk entities associated with different determined events are determined.
- the determined relationships may indicate how each of the determined events relate to the selected graphical user interface item.
- the selected graphical user interface item may correspond to a location.
- the event analysis system may determine whether any of the determined events share an actor, a model, or an object that was accessed or attempted to be accessed.
- the event analysis system determines that a plurality of actors associated with the same location accessed or attempted to access the same object.
- the selected graphical user interface item may correspond to an actor.
- the event analysis system may determine whether any of the determined events share a location, a model, or an object that was accessed or attempted to be accessed.
- the selected graphical user interface item may correspond to a model.
- the event analysis system may determine whether any of the determined events share a location, an actor, or an object that was accessed or attempted to be accessed.
- the selected graphical user interface item may correspond to an object that was accessed or attempted to be accessed.
- the event analysis system may determine whether any of the determined events share a location, an actor, or a model.
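The four cases above follow one pattern: whichever risk-entity type the selected item corresponds to, the determined events are compared on the remaining types. A sketch of that pattern (the field names are illustrative):

```python
RISK_ENTITY_FIELDS = ("location", "actor", "model", "object")

def comparison_fields(selected_field):
    """Return the risk-entity types checked for sharing, i.e., every
    type other than that of the selected graphical user interface item."""
    return tuple(f for f in RISK_ENTITY_FIELDS if f != selected_field)
```

For example, when the selected item corresponds to a model, the events are compared on location, actor, and object.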
- the graphical representation of the anomalous events may be depicted as a web of interconnected risk entities.
- In the event a plurality of risk entities of a first type are associated with the same risk entity of a different type, the measure of anomaly for the risk entity of the different type corresponds to the highest confidence level among the plurality of risk entities of the first type.
- FIG. 5 is a flow diagram illustrating a process of training a model in accordance with some embodiments.
- process 500 may be implemented by an event analysis system, such as event analysis system 112 .
- Each event log is comprised of a plurality of entries. Each entry corresponds to an event.
- Each entry of an event log identifies an event type and one or more attributes associated with the event. Examples of event type include a file deletion, a file access, a file creation, a file move, a directory deletion, a directory creation, a directory move, a system login grant, a system login denial, a user being added, a user being deleted, a file being downloaded, a user password change, change of state, change of status etc.
- event attributes include a timestamp, a sequence number, a user to which the event is associated, an object with which the event is associated, an internet protocol address, a location from which the event occurred, etc.
- a machine learning model is trained based on the one or more event logs.
- the machine learning model is trained to identify a particular type of anomalous behavior. For example, a machine learning model may be trained to determine whether an event or a group of events are indicative of an insider attack, a ransomware attack, a brute force attack, wide access (e.g., log in attempts from different locations), a sensitive data leak, a geo fencing breach, or a combination thereof.
- the machine learning model may be a supervised learning machine learning model, a semi-supervised machine learning model, an unsupervised machine learning model, or a reinforcement machine learning model.
- machine learning model algorithms include a Naïve Bayes classifier algorithm, a K-means clustering algorithm, a support vector machine algorithm, linear regression, logistic regression, artificial neural networks, decision trees, random forests, nearest neighbors, etc.
- the entries included in the one or more event logs are sorted into training data and validation data.
- the entries included in the training data are applied to a machine learning model.
- the model is trained using the training data until it has a prediction accuracy greater than a threshold accuracy.
- the entries included in the validation data are applied to a trained machine learning model.
- In the event the trained machine learning model has a prediction accuracy above the threshold, the trained machine learning model is validated and ready for use with production data (e.g., event logs from primary systems 102 a , 102 n ). Otherwise, the machine learning model is retrained and revalidated to produce a more accurate machine learning model.
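The train/validate/retrain loop described above can be sketched as follows. The toy accuracy sequence and the 0.9 threshold are assumptions for illustration.

```python
def train_until_accurate(train_fn, evaluate_fn,
                         threshold=0.9, max_rounds=10):
    """Train a model, validate it, and retrain until its prediction
    accuracy on the validation data exceeds the threshold."""
    for round_number in range(1, max_rounds + 1):
        model = train_fn(round_number)
        accuracy = evaluate_fn(model)
        if accuracy > threshold:
            return model, accuracy, round_number
    raise RuntimeError("model never reached the threshold accuracy")

# Toy stand-ins: each retraining round yields a higher validation
# accuracy, so the fourth round clears the 0.9 threshold.
accuracies = {1: 0.6, 2: 0.7, 3: 0.85, 4: 0.95}
model, accuracy, rounds = train_until_accurate(
    train_fn=lambda r: r,            # the "model" is just its round number
    evaluate_fn=lambda m: accuracies[m])
```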
- a plurality of events included in one or more event logs are applied to the machine learning model.
- the machine learning model outputs a corresponding confidence level that indicates whether one or more events corresponding to the one or more event log entries are anomalous.
- a model may determine that an event is anomalous in the event a confidence level associated with the event is greater than a confidence level threshold.
- a graphical user interface that provides a graphical representation of an event determined to be an anomalous event is provided to a client device.
- a user associated with the client device may provide, via the graphical user interface, feedback regarding a risk entity associated with an anomalous event.
- the graphical representation may indicate a risk entity is a medium-risk entity.
- The machine learning model is retrained based on the received feedback (e.g., clicking ignore on the graphical user interface item corresponding to the risk entity).
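One way the ignore feedback could feed retraining is as relabeled training examples. This is a sketch under assumed conventions; the patent does not fix a label encoding.

```python
def relabel_with_feedback(flagged_event_ids, ignored_ids):
    """Turn user feedback into training labels: an event the user
    ignored is relabeled not-anomalous (0); the rest keep their
    anomalous label (1)."""
    return [(event_id, 0 if event_id in ignored_ids else 1)
            for event_id in flagged_event_ids]

labels = relabel_with_feedback(["ev-1", "ev-2", "ev-3"], {"ev-2"})
```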
- FIGS. 6 A- 6 N are examples of a graphical user interface in accordance with some embodiments.
- Graphical user interface 600 provides a graphical representation of risk entities associated with four different events occurring within a time frame 624 .
- a visual representation of a first event depicts the relationship between the risk entities associated with the first event.
- Graphical user interface item 602 corresponding to a location of “San Francisco, CA” is linked to graphical user interface item 608 corresponding to an “unknown” actor, which is linked to a graphical user interface item 612 corresponding to a model that detected an event to be anomalous, which is linked to a graphical user interface item 618 corresponding to an object accessed or attempted to be accessed by the “unknown” actor.
- a visual representation of a second event includes graphical user interface item 604 corresponding to a location of “Pyongyang, North Korea” linked to graphical user interface item 606 corresponding to an actor named “Sen-yun,” which is linked to a graphical user interface item 614 corresponding to a model that detected an event to be anomalous, which is linked to graphical user interface item 616 corresponding to an object accessed or attempted to be accessed by the actor named “Sen-yun.”
- a visual representation of a third event includes graphical user interface item 604 corresponding to a location of “Pyongyang, North Korea” linked to graphical user interface item 606 corresponding to an actor named “Sen-yun,” which is linked to graphical user interface item 614 corresponding to a model that detected an event to be anomalous, which is linked to graphical user interface item 622 corresponding to an object accessed or attempted to be accessed by the actor named “Sen-yun.”
- a visual representation of a fourth event includes graphical user interface item 607 corresponding to an actor named “Ji-yoo” linked to graphical user interface item 612 corresponding to a model that detected an event to be anomalous, which is linked to graphical user interface item 618 corresponding to an object accessed or attempted to be accessed by the actor named “Ji-yoo.”
- Each of the graphical user interface items has a corresponding measure of anomaly.
- Graphical user interface items 604 , 606 , 614 , 622 have a red header. This indicates that the risk entities are high-risk entities.
- Graphical user interface item 616 has an orange header. This indicates that the risk entity is a medium-risk entity.
- Graphical user interface items 607 , 612 , 618 have a green header. This indicates that the risk entities are low-risk entities.
- Graphical user interface item 602 has a gray header.
- Graphical user interface item 608 has a blue header. This indicates that the risk entity is neutral.
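The header colors above encode each risk entity's measure of anomaly. A sketch of that mapping (the gray header's meaning is not stated in the text, so it is left as a fallback here; this is an illustration, not the patent's palette logic):

```python
HEADER_COLORS = {
    "high": "red",       # high-risk entity
    "medium": "orange",  # medium-risk entity
    "low": "green",      # low-risk entity
    "neutral": "blue",   # neutral entity
}

def header_color(risk_level):
    """Map a risk level to the color-coded header of its
    graphical user interface item."""
    return HEADER_COLORS.get(risk_level, "gray")
```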
- FIG. 6 B illustrates a selection of graphical user interface item 604 .
- a user may provide an input to graphical user interface 605 , such as right-clicking on graphical user interface item 604 .
- graphical user interface 605 provides a menu 626 having an option to “show related evidence,” an option to “add to incident report,” and an option to “remove card.”
- FIG. 6 C illustrates a selection of “show related evidence.”
- graphical user interface 605 is updated to be graphical user interface 610 .
- the event analysis system analyzes one or more event logs to identify, if any, one or more events related to the location of “Pyongyang.”
- Graphical user interface 610 depicts the relationship between a location of “Pyongyang” and the identified events.
- a visual representation of a first related event depicts the relationship between the risk entities associated with the first related event.
- Graphical user interface item 604 corresponding to a location of “Pyongyang” is linked to graphical user interface item 628 corresponding to an actor named “Sen-yoo,” which is linked to a graphical user interface item 632 corresponding to a model that detected an event to be anomalous, which is linked to a graphical user interface item 634 corresponding to an object accessed or attempted to be accessed by the actor named “Sen-yoo.”
- a visual representation of a second related event depicts the relationship between the risk entities associated with the second related event.
- Graphical user interface item 604 corresponding to a location of “Pyongyang” is linked to graphical user interface item 628 corresponding to an actor named “Sen-yoo,” which is linked to a graphical user interface item 632 corresponding to a model that detected an event to be anomalous, which is linked to a graphical user interface item 636 corresponding to an object accessed or attempted to be accessed by the actor named “Sen-yoo.”
- a visual representation of a third related event depicts the relationship between the risk entities associated with the third related event.
- Graphical user interface item 604 corresponding to a location of “Pyongyang” is linked to graphical user interface item 628 corresponding to an actor named “Sen-yoo,” which is linked to a graphical user interface item 632 corresponding to a model that detected an event to be anomalous, which is linked to a graphical user interface item 638 corresponding to an object accessed or attempted to be accessed by the actor named “Sen-yoo.”
- Graphical user interface 610 also depicts the relationships between different risk entities associated with different determined events.
- the first, second, and third related events share the same location, actor, and model that determined the events associated with the actor to be anomalous.
- FIG. 6 D illustrates a graphical user interface 615 where the risk entities associated with the relevant event are being added to a database of high-risk entities. Risk entities are being marked as evidence of anomalous events. This is illustrated by fingerprint icons 642 , 644 , 646 , 648 . These risk entities will remain in the graphical user interface when parameters, such as time frame 624 , are changed.
- FIG. 6 E illustrates that user 654 may share graphical user interface 620 via share button 652 .
- an overlay page 656 is depicted in graphical user interface 625 .
- a user may specify contact information (e.g., email address, phone number, etc.) of another user and a description of the evidence board.
- Overlay page 656 includes a create button 658 .
- a link associated with the evidence board is provided to the contact information provided in overlay page 656 .
- the evidence board has been shared with user 662 , who may collaborate with user 654 .
- FIG. 6 H depicts graphical user interface 635 having a chat button 664 .
- a user may select chat button 664 to communicate via chat window 666 .
- FIG. 6 J depicts graphical user interface 645 having a comment button 668 .
- a user may leave a comment for any of the graphical user interface items.
- a user has made comments 672 , 674 , 676 .
- the graphical representation of risk entities associated with a detected anomalous event is reviewed by a user associated with a client device.
- the user may review the risk entities and determine whether to ignore an anomalous event.
- Each of the graphical user interface items 602 , 604 , 606 , 607 , 608 , 612 , 614 , 616 , 618 , 622 may be ignored by the user.
- the user is selecting graphical user interface item 678 to ignore graphical user interface item 607 .
- graphical user interface item 607 has been ignored 682 by the user.
- Graphical user interface 665 illustrates these graphical user interface items being ignored by changing “Ignore” on the graphical user interface item to be “Ignored.”
- the “Ignored” text 682 , 684 , 686 , 688 is highlighted in white to indicate that a graphical user interface item has been ignored.
- This feedback may be used by the event analysis system to re-train one or more of its models.
- FIG. 7 A illustrates an example of a graphical user interface in accordance with some embodiments.
- graphical user interface 700 illustrates a report that summarizes and describes one of the events detected by the event analysis system.
- the report may include a summary section that includes a description 704 of the event.
- the report may be automatically generated based on risk entities that are marked as evidence, for example, risk entities 642 , 644 , 646 , 648 as seen in FIG. 6 D .
- Information associated with the risk entities is included in the report. Notes may be added to risk entities 642 , 644 , 646 , 648 and be included in the report.
- the report may include sections corresponding to the different risk entities.
- a section may include a description associated with a risk entity 706, 712, access logs 708, 714 that indicate the event is anomalous, and notes 710 associated with the risk entity.
- Graphical user interface 700 includes a schedule button 702 .
- in response to a selection of schedule button 702, an overlay page 752 is provided to the user via graphical user interface 750.
- the overlay page may enable the user to name the report, schedule the report, add contact information for report recipients, and select an export format.
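A report assembled from evidence-marked risk entities, with a summary section followed by per-entity description, access logs, and notes, might be generated as in the sketch below. The dictionary keys and the plain-text layout are assumptions for illustration only; the patent does not specify a report format.

```python
def generate_report(title, summary, evidence):
    """Assemble a plain-text report from risk entities marked as evidence.

    Each entry in `evidence` is a dict with hypothetical keys:
    'description', 'access_logs' (log lines indicating the event is
    anomalous), and 'notes' (reviewer annotations).
    """
    lines = [f"Report: {title}", "", "Summary", summary, ""]
    for entity in evidence:
        lines.append(entity["description"])
        for entry in entity.get("access_logs", []):
            lines.append(f"  log: {entry}")
        for note in entity.get("notes", []):
            lines.append(f"  note: {note}")
        lines.append("")  # blank line between per-entity sections
    return "\n".join(lines)


report = generate_report(
    "Anomalous mass deletion",
    "An unusually large number of files was deleted by one user.",
    [{"description": "User account: jdoe",
      "access_logs": ["2021-10-14 02:13 deleted 10,000 files"],
      "notes": ["Volume far exceeds this user's baseline behavior."]}],
)
```

Scheduling and export (e.g., to PDF or email recipients) would wrap a generator like this; those details are outside the sketch.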
- the invention can be implemented in numerous ways, including as a process; an apparatus; a system; a composition of matter; a computer program product embodied on a computer readable storage medium; and/or a processor, such as a processor configured to execute instructions stored on and/or provided by a memory coupled to the processor.
- these implementations, or any other form that the invention may take, may be referred to as techniques.
- the order of the steps of disclosed processes may be altered within the scope of the invention.
- a component such as a processor or a memory described as being configured to perform a task may be implemented as a general component that is temporarily configured to perform the task at a given time or a specific component that is manufactured to perform the task.
- the term ‘processor’ refers to one or more devices, circuits, and/or processing cores configured to process data, such as computer program instructions.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Computer Security & Cryptography (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- General Health & Medical Sciences (AREA)
- Bioethics (AREA)
- Health & Medical Sciences (AREA)
- Databases & Information Systems (AREA)
- Computing Systems (AREA)
- Debugging And Monitoring (AREA)
Abstract
Description
Claims (20)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/501,617 US11893125B2 (en) | 2021-10-14 | 2021-10-14 | Providing a graphical representation of anomalous events |
US18/537,095 US20240126910A1 (en) | 2021-10-14 | 2023-12-12 | Providing a graphical representation of anomalous events |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/501,617 US11893125B2 (en) | 2021-10-14 | 2021-10-14 | Providing a graphical representation of anomalous events |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/537,095 Continuation US20240126910A1 (en) | 2021-10-14 | 2023-12-12 | Providing a graphical representation of anomalous events |
Publications (2)
Publication Number | Publication Date |
---|---|
US20230117120A1 US20230117120A1 (en) | 2023-04-20 |
US11893125B2 true US11893125B2 (en) | 2024-02-06 |
Family
ID=85982499
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/501,617 Active 2042-01-02 US11893125B2 (en) | 2021-10-14 | 2021-10-14 | Providing a graphical representation of anomalous events |
US18/537,095 Pending US20240126910A1 (en) | 2021-10-14 | 2023-12-12 | Providing a graphical representation of anomalous events |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/537,095 Pending US20240126910A1 (en) | 2021-10-14 | 2023-12-12 | Providing a graphical representation of anomalous events |
Country Status (1)
Country | Link |
---|---|
US (2) | US11893125B2 (en) |
Citations (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030220940A1 (en) * | 2002-04-15 | 2003-11-27 | Core Sdi, Incorporated | Secure auditing of information systems |
US9009825B1 (en) * | 2013-06-21 | 2015-04-14 | Trend Micro Incorporated | Anomaly detector for computer networks |
US9519698B1 (en) * | 2016-01-20 | 2016-12-13 | International Business Machines Corporation | Visualization of graphical representations of log files |
US9537880B1 (en) * | 2015-08-19 | 2017-01-03 | Palantir Technologies Inc. | Anomalous network monitoring, user behavior detection and database system |
US20170060656A1 (en) * | 2015-08-31 | 2017-03-02 | Microsoft Technology Licensing, Llc | Predicting service issues by detecting anomalies in event signal |
US9697100B2 (en) * | 2014-03-10 | 2017-07-04 | Accenture Global Services Limited | Event correlation |
US20170193239A1 (en) * | 2015-12-30 | 2017-07-06 | International Business Machines Corporation | Data-centric monitoring of compliance of distributed applications |
US20170192872A1 (en) * | 2014-12-11 | 2017-07-06 | Hewlett Packard Enterprise Development Lp | Interactive detection of system anomalies |
US20170230229A1 (en) * | 2013-07-28 | 2017-08-10 | Opsclarity, Inc. | Ranking network anomalies in an anomaly cluster |
US20180167402A1 (en) * | 2015-05-05 | 2018-06-14 | Balabit S.A. | Computer-implemented method for determining computer system security threats, security operations center system and computer program product |
US20180270265A1 (en) * | 2016-05-13 | 2018-09-20 | Ola Sage | System and Method For Assessing Network Security Risks |
US20190052660A1 (en) * | 2016-02-05 | 2019-02-14 | Defensestorm, Inc. | Enterprise policy tracking with security incident integration |
US20190228296A1 (en) * | 2018-01-19 | 2019-07-25 | EMC IP Holding Company LLC | Significant events identifier for outlier root cause investigation |
US20190342307A1 (en) * | 2018-05-01 | 2019-11-07 | Royal Bank Of Canada | System and method for monitoring security attack chains |
US10841338B1 (en) * | 2017-04-05 | 2020-11-17 | Exabeam, Inc. | Dynamic rule risk score determination in a cybersecurity monitoring system |
US20210004704A1 (en) * | 2019-07-02 | 2021-01-07 | Servicenow, Inc. | Dynamic anomaly reporting |
US20210021592A1 (en) * | 2019-07-17 | 2021-01-21 | Infiltron Holdings, Llc | Systems and methods for securing devices in a computing environment |
US20210103488A1 (en) * | 2019-10-03 | 2021-04-08 | Oracle International Corporation | Enhanced anomaly detection in computing environments |
US20210117232A1 (en) * | 2019-10-18 | 2021-04-22 | Splunk Inc. | Data ingestion pipeline anomaly detection |
US20210211443A1 (en) * | 2020-01-02 | 2021-07-08 | Code 42 Software, Inc. | Process for automated investigation of flagged users based upon previously collected data and automated observation on a go-forward basis |
US20210273957A1 (en) * | 2020-02-28 | 2021-09-02 | Darktrace Limited | Cyber security for software-as-a-service factoring risk |
US20210389997A1 (en) * | 2020-06-16 | 2021-12-16 | Microsoft Technology Licensing, Llc | Techniques for detecting atypical events in event logs |
US20210406106A1 (en) * | 2020-06-29 | 2021-12-30 | International Business Machines Corporation | Anomaly recognition in information technology environments |
US20210406112A1 (en) * | 2020-06-29 | 2021-12-30 | International Business Machines Corporation | Anomaly classification in information technology environments |
US20220107859A1 (en) * | 2020-10-02 | 2022-04-07 | Divya Choudhary | Method and system for determining root cause of anomalous events |
US20220141188A1 (en) * | 2020-10-30 | 2022-05-05 | Splunk Inc. | Network Security Selective Anomaly Alerting |
US11422882B1 (en) * | 2020-11-27 | 2022-08-23 | Amazon Technologies, Inc. | Systems, methods, and apparatuses for determining causality for anomalies and/or events |
US20220269577A1 (en) * | 2021-02-23 | 2022-08-25 | Mellanox Technologies Tlv Ltd. | Data-Center Management using Machine Learning |
US20220334904A1 (en) * | 2021-04-15 | 2022-10-20 | Viavi Solutions, Inc. | Automated Incident Detection and Root Cause Analysis |
US20220374809A1 (en) * | 2020-12-16 | 2022-11-24 | Wells Fargo Bank, N.A. | Computer-based tracking and determining impact of events on contact center operations |
US11526425B1 (en) * | 2020-03-30 | 2022-12-13 | Splunk Inc. | Generating metric data streams from spans ingested by a cloud deployment of an instrumentation analytics engine |
US20230007023A1 (en) * | 2021-06-30 | 2023-01-05 | Dropbox, Inc. | Detecting anomalous digital actions utilizing an anomalous-detection model |
US11593639B1 (en) * | 2019-09-03 | 2023-02-28 | Amazon Technologies, Inc. | Scoring events using noise-contrastive estimation for anomaly detection |
US11593669B1 (en) * | 2020-11-27 | 2023-02-28 | Amazon Technologies, Inc. | Systems, methods, and apparatuses for detecting and creating operation incidents |
- 2021-10-14: US US17/501,617 patent/US11893125B2/en, status Active
- 2023-12-12: US US18/537,095 patent/US20240126910A1/en, status Pending
Patent Citations (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030220940A1 (en) * | 2002-04-15 | 2003-11-27 | Core Sdi, Incorporated | Secure auditing of information systems |
US9009825B1 (en) * | 2013-06-21 | 2015-04-14 | Trend Micro Incorporated | Anomaly detector for computer networks |
US20170230229A1 (en) * | 2013-07-28 | 2017-08-10 | Opsclarity, Inc. | Ranking network anomalies in an anomaly cluster |
US9697100B2 (en) * | 2014-03-10 | 2017-07-04 | Accenture Global Services Limited | Event correlation |
US20170192872A1 (en) * | 2014-12-11 | 2017-07-06 | Hewlett Packard Enterprise Development Lp | Interactive detection of system anomalies |
US20180167402A1 (en) * | 2015-05-05 | 2018-06-14 | Balabit S.A. | Computer-implemented method for determining computer system security threats, security operations center system and computer program product |
US9537880B1 (en) * | 2015-08-19 | 2017-01-03 | Palantir Technologies Inc. | Anomalous network monitoring, user behavior detection and database system |
US20170111381A1 (en) * | 2015-08-19 | 2017-04-20 | Palantir Technologies Inc. | Anomalous network monitoring, user behavior detection and database system |
US20170060656A1 (en) * | 2015-08-31 | 2017-03-02 | Microsoft Technology Licensing, Llc | Predicting service issues by detecting anomalies in event signal |
US20170193239A1 (en) * | 2015-12-30 | 2017-07-06 | International Business Machines Corporation | Data-centric monitoring of compliance of distributed applications |
US9519698B1 (en) * | 2016-01-20 | 2016-12-13 | International Business Machines Corporation | Visualization of graphical representations of log files |
US20190052660A1 (en) * | 2016-02-05 | 2019-02-14 | Defensestorm, Inc. | Enterprise policy tracking with security incident integration |
US20180270265A1 (en) * | 2016-05-13 | 2018-09-20 | Ola Sage | System and Method For Assessing Network Security Risks |
US10841338B1 (en) * | 2017-04-05 | 2020-11-17 | Exabeam, Inc. | Dynamic rule risk score determination in a cybersecurity monitoring system |
US20190228296A1 (en) * | 2018-01-19 | 2019-07-25 | EMC IP Holding Company LLC | Significant events identifier for outlier root cause investigation |
US20190342307A1 (en) * | 2018-05-01 | 2019-11-07 | Royal Bank Of Canada | System and method for monitoring security attack chains |
US20210004704A1 (en) * | 2019-07-02 | 2021-01-07 | Servicenow, Inc. | Dynamic anomaly reporting |
US20210021592A1 (en) * | 2019-07-17 | 2021-01-21 | Infiltron Holdings, Llc | Systems and methods for securing devices in a computing environment |
US11593639B1 (en) * | 2019-09-03 | 2023-02-28 | Amazon Technologies, Inc. | Scoring events using noise-contrastive estimation for anomaly detection |
US20210103488A1 (en) * | 2019-10-03 | 2021-04-08 | Oracle International Corporation | Enhanced anomaly detection in computing environments |
US20210117232A1 (en) * | 2019-10-18 | 2021-04-22 | Splunk Inc. | Data ingestion pipeline anomaly detection |
US20210211443A1 (en) * | 2020-01-02 | 2021-07-08 | Code 42 Software, Inc. | Process for automated investigation of flagged users based upon previously collected data and automated observation on a go-forward basis |
US20210273957A1 (en) * | 2020-02-28 | 2021-09-02 | Darktrace Limited | Cyber security for software-as-a-service factoring risk |
US11526425B1 (en) * | 2020-03-30 | 2022-12-13 | Splunk Inc. | Generating metric data streams from spans ingested by a cloud deployment of an instrumentation analytics engine |
US20210389997A1 (en) * | 2020-06-16 | 2021-12-16 | Microsoft Technology Licensing, Llc | Techniques for detecting atypical events in event logs |
US20210406106A1 (en) * | 2020-06-29 | 2021-12-30 | International Business Machines Corporation | Anomaly recognition in information technology environments |
US20210406112A1 (en) * | 2020-06-29 | 2021-12-30 | International Business Machines Corporation | Anomaly classification in information technology environments |
US20220107859A1 (en) * | 2020-10-02 | 2022-04-07 | Divya Choudhary | Method and system for determining root cause of anomalous events |
US20220141188A1 (en) * | 2020-10-30 | 2022-05-05 | Splunk Inc. | Network Security Selective Anomaly Alerting |
US11422882B1 (en) * | 2020-11-27 | 2022-08-23 | Amazon Technologies, Inc. | Systems, methods, and apparatuses for determining causality for anomalies and/or events |
US11593669B1 (en) * | 2020-11-27 | 2023-02-28 | Amazon Technologies, Inc. | Systems, methods, and apparatuses for detecting and creating operation incidents |
US20220374809A1 (en) * | 2020-12-16 | 2022-11-24 | Wells Fargo Bank, N.A. | Computer-based tracking and determining impact of events on contact center operations |
US20220269577A1 (en) * | 2021-02-23 | 2022-08-25 | Mellanox Technologies Tlv Ltd. | Data-Center Management using Machine Learning |
US20220334904A1 (en) * | 2021-04-15 | 2022-10-20 | Viavi Solutions, Inc. | Automated Incident Detection and Root Cause Analysis |
US20230007023A1 (en) * | 2021-06-30 | 2023-01-05 | Dropbox, Inc. | Detecting anomalous digital actions utilizing an anomalous-detection model |
Also Published As
Publication number | Publication date |
---|---|
US20230117120A1 (en) | 2023-04-20 |
US20240126910A1 (en) | 2024-04-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11102221B2 (en) | Intelligent security management | |
US11182366B2 (en) | Comparing data stores using hash sums on disparate parallel systems | |
US10033754B2 (en) | Cyber threat monitor and control apparatuses, methods and systems | |
US11012289B2 (en) | Reinforced machine learning tool for anomaly detection | |
US20160210631A1 (en) | Systems and methods for flagging potential fraudulent activities in an organization | |
US11003563B2 (en) | Compliance testing through sandbox environments | |
US10366129B2 (en) | Data security threat control monitoring system | |
US10726123B1 (en) | Real-time detection and prevention of malicious activity | |
US10795738B1 (en) | Cloud security using security alert feedback | |
US11968213B2 (en) | Collaborative communications environment and privacy setting associated therewith | |
US20220405535A1 (en) | Data log content assessment using machine learning | |
US20240054013A1 (en) | Systems and methods for maintaining data objects to manage asynchronous workflows | |
US20220222266A1 (en) | Monitoring and alerting platform for extract, transform, and load jobs | |
EP4131008A1 (en) | Hyper-parameter space optimization for machine learning data processing pipeline using root cause analysis and corretive actions | |
US20240338390A1 (en) | Method and system for interpreting inputted information | |
US11893125B2 (en) | Providing a graphical representation of anomalous events | |
US20190097888A1 (en) | State-based entity behavior analysis | |
Leal et al. | Towards adaptive and transparent tourism recommendations: A survey | |
US11954216B1 (en) | Detecting durability issues with anomaly detection | |
US20240054115A1 (en) | Decision implementation with integrated data quality monitoring | |
US20240064166A1 (en) | Anomaly detection in computing system events | |
US20220027831A1 (en) | System and method for security analyst modeling and management | |
Frank et al. | Challenges and limitations of fraud detection in NoSQL database systems | |
Jannat et al. | Mutual Information on Low-rank Matrix for Effective Intrusion Detection |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
AS | Assignment |
Owner name: COHESITY, INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:JOHNSON, COLIN SCOTT;LI, MINGRAN;REEL/FRAME:059678/0265 Effective date: 20220330 |
|
AS | Assignment |
Owner name: SILICON VALLEY BANK, AS ADMINISTRATIVE AGENT, CALIFORNIA Free format text: SECURITY INTEREST;ASSIGNOR:COHESITY, INC.;REEL/FRAME:061509/0818 Effective date: 20220922 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |