GB2586072A - Network vulnerability analysis - Google Patents

Network vulnerability analysis

Info

Publication number
GB2586072A
Authority
GB
United Kingdom
Prior art keywords
data
vulnerability
present
nested relational
network
Prior art date
Legal status
Pending
Application number
GB1911051.9A
Other versions
GB201911051D0 (en)
Inventor
Pimpalnerkar Prasanna
Current Assignee
Individual
Original Assignee
Individual
Priority date
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to GB1911051.9A priority Critical patent/GB2586072A/en
Publication of GB201911051D0 publication Critical patent/GB201911051D0/en
Publication of GB2586072A publication Critical patent/GB2586072A/en
Pending legal-status Critical Current


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00 Network architectures or network communication protocols for network security
    • H04L63/14 Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic
    • H04L63/1433 Vulnerability analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/50 Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F21/55 Detecting local intrusion or implementing counter-measures
    • G06F21/552 Detecting local intrusion or implementing counter-measures involving long-term monitoring or reporting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/50 Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F21/57 Certifying or maintaining trusted computer platforms, e.g. secure boots or power-downs, version controls, system software checks, secure updates or assessing vulnerabilities
    • G06F21/577 Assessing vulnerabilities and evaluating computer system security
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00 Network architectures or network communication protocols for network security
    • H04L63/14 Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic
    • H04L63/1408 Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic by monitoring network traffic
    • H04L63/1425 Traffic logging, e.g. anomaly detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Computing Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

A system for analysing distributed network vulnerabilities imports raw data from a current vulnerability scan associated with a network asset and determines a relationship between aspects of that data. It then generates first and second nested relational data structures for the current scan and a previous scan, respectively, and merges these nested relational structures to provide contextual vulnerability data, which can be categorised as ‘unchanged’ or ‘added’ and reported. Remediation activity with an expected fix date may then be carried out.

Description

NETWORK VULNERABILITY ANALYSIS
The present invention relates to a system and method for correlating vulnerability data of a network over time and for correlating any data that is repeated.
In many network environments, unauthorised users may exploit vulnerabilities in the network to gain or deny access, or otherwise attack systems in the network. In order to detect and remediate such network vulnerabilities, existing network security systems typically conduct vulnerability analysis in the network through manual inspection or a combination of active and passive network scans. For example, conventional network scanners typically send packets or other messages to various devices in the network and then audit the network with information contained in any response packets or messages received from the devices in the network.
The packets received by the vulnerability scanner are used to create audit results for the network. A typical vulnerability scan may reveal thousands of vulnerabilities, depending on the size and composition of the network, comprising large volumes of raw data. The vulnerabilities are then required to be reviewed and marked for remediation.
Conventionally, the success of a remediation attempt is measured by performing a repeat vulnerability scan of the network and cross-referencing the number of newly identified vulnerabilities. However, the audit results become stale over time because the packets describe a static state for the network at that particular point in time.
If an initial scan reveals a number of vulnerabilities within the network and a remediation attempt is made on each, a subsequent scan that reveals the same number of vulnerabilities in the network does not necessarily mean that the remediation attempts were unsuccessful. All remediation attempts may, for example, have been successful, with the subsequent scan revealing all new vulnerabilities. The static state of the scan results prevents both vulnerabilities and remediation attempts from being easily tracked over time.
It is an object of the present invention to provide a vendor agnostic platform to identify and track vulnerabilities within chronologically sequential scan results of a network to identify which vulnerabilities have been remediated, those that are new and those that are unchanged.
According to a first aspect of the present invention there is provided a system for analysing the computer security of a network, the system comprising: a computer security platform implemented in a distributed computing system, the distributed computing system comprising a non-transitory computer-readable medium for storing computer instructions thereon that when executed by one or more computer processors causes the computer security platform to: import a corpus of raw data from a current vulnerability scan of the network, the raw data comprising vulnerability data associated with at least one network asset; analyse the corpus of raw data to determine a data relationship between different aspects of the raw data; generate a first nested relational data structure based on the data relationship from the current vulnerability scan of the network; import a second nested relational data structure based on the data relationship from a previous vulnerability scan of the network; merge the first and second nested relational data structures to provide contextual vulnerability data for the network asset; categorise the contextual vulnerability data; and generate a report of the contextual vulnerability data.
According to a second aspect of the present invention there is provided a method for analysing the computer security of a network, the method comprising: importing a corpus of raw data from a current vulnerability scan of the network, the raw data comprising vulnerability data associated with at least one network asset; analysing the corpus of raw data to determine a data relationship between different aspects of the raw data; generating a first nested relational data structure based on the data relationship from the current vulnerability scan of the network; importing a second nested relational data structure based on the data relationship from a previous vulnerability scan of the network; merging the first and second nested relational data structures to provide contextual vulnerability data for the network asset; categorising the contextual vulnerability data; and generating a report of the contextual vulnerability data.
Advantageously, the object-relational data model used to sort the raw data allows for comparison of the same vulnerabilities in two chronologically sequential vulnerability scans. This, in turn, provides for a direct comparison between the two scans, adding a layer of temporal intelligence to the results. The contextual vulnerability data tracks the drift (changes) in identified vulnerabilities over time.
Preferably, categorising the contextual vulnerability data includes at least one of: determining whether a vulnerability present in the first nested relational data structure is present in the second nested relational data structure, and if present, categorising the vulnerability as unchanged; determining whether the vulnerability present in the first nested relational data structure is not present in the second nested relational data structure, and if not present, categorising the vulnerability as added; and determining whether the vulnerability present in the second nested relational data structure is not present in the first nested relational data structure, and if not present, categorising the vulnerability as removed.
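By way of illustration, the categorisation above reduces to simple set comparisons. The following is a minimal sketch, assuming each nested relational data structure can expose its vulnerabilities as a set of (host, vulnerability name) pairs; the function and data shapes are illustrative rather than part of the invention.

def categorise(current, previous):
    """Compare the vulnerability sets of the current and previous structures.

    unchanged : present in both structures
    added     : present only in the current structure
    removed   : present only in the previous structure
    """
    return {
        "unchanged": current & previous,
        "added": current - previous,
        "removed": previous - current,
    }

# Illustrative call with invented host and vulnerability names:
# categorise({("host1", "vuln_a"), ("host2", "vuln_b")}, {("host1", "vuln_a")})
# -> unchanged = {("host1", "vuln_a")}, added = {("host2", "vuln_b")}, removed = set()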
It is also preferred that the different aspects of the raw data include a scan data source identification "ID", an asset value, a vulnerability name, a host name and a scan date, for example.
In addition, it is preferred that the corpus of raw data comprises vulnerability data about a plurality of network assets.
Preferably, the corpus of raw data is imported at predetermined intervals of time.
In addition, it is preferred that the corpus of raw data comprises a plurality of data relationships.
It is also preferred that each nested relational data structure is assigned a date identifier corresponding to the date of the vulnerability scan on which it is based.
Preferably, the first and second nested relational data structures are based on chronologically sequential vulnerability scans of the network, and/or the comparison is performed between two random dates to identify the drift between those dates.
Preferably, a vulnerability present in the first or second nested relational data structure is assigned a remediation activity and further still the remediation activity preferably includes an expected fix date.
In addition, it is preferred that the first nested relational data structure is generated after the expected fix date and preferably further comprises determining whether the vulnerability present in the second nested relational data structure is present in the first nested relational data structure, and if present, categorising the remediation activity as failed, otherwise categorising the remediation activity as successful (also known as a fix basket).
The present invention also includes a system for analysing data, the system comprising: a computing system comprising a non-transitory computer-readable medium for storing computer instructions thereon that when executed by one or more computer processors causes the system to: import a corpus of raw data from a source of raw data; analyse the corpus of raw data to determine a data relationship between different aspects of the raw data; generate a first nested relational data structure based on the data relationship from the corpus of raw data; import a second nested relational data structure based on the data relationship from the importation of a previous corpus of raw data from the same source of data; merge the first and second nested relational data structures to determine which data has been added, removed or remained unchanged; categorise the added, removed or unchanged data; and generate a report of the added, removed or unchanged data.
A specific embodiment of the present invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 schematically illustrates a system for providing contextual vulnerability data of a network in accordance with the present invention;
Figure 2 schematically illustrates a data analyser and components thereof which forms part of the system of Figure 1;
Figure 3 schematically illustrates a database system and components thereof which forms part of the data analyser of Figure 2;
Figure 4 schematically illustrates a master logic and components thereof which forms part of the data analyser of Figure 2;
Figure 5 is a schematic overview of a two-step process for producing a running delta in accordance with the present invention;
Figure 6 is a process flow diagram illustrating steps carried out by a data presenter, which forms part of the master logic of Figure 4, in order to normalise raw vulnerability data for analysis;
Figure 7 is a process flow diagram illustrating steps carried out by the data analyser which forms part of the master logic of Figure 4 in order to provide contextualised vulnerability data;
Figure 8 shows a screenshot of a data output which forms part of the master logic of Figure 4;
Figure 9 shows a screenshot of a fix basket, which displays a report in accordance with the present invention; and
Figure 10 shows a screenshot of the running delta of Figure 5 in graphical form.
Figure 1 schematically illustrates components and communication links of a system 100 used to correlate log data in order to discover network vulnerabilities and assets. The system 100 comprises a vulnerability scanner 102 communicatively coupled to a number of network assets 104, a corporate service desk 106 and an input data analyser 108. The vulnerability scanner 102 is arranged to receive and correlate data logs that include events from the various network assets 104 distributed across a network to detect abuses, statistical anomalies, compromises, compliance violations, and other information that may have relevance to a security posture or state associated with the network.
The vulnerability scanner 102 is configured to communicate packets or other messages within the network to detect new or changed information describing various network assets including routers, internal firewalls, external firewalls, or other suitable hosts in the network, wherein the hosts may generally include servers, desktop computers, mobile devices, or any other suitable device in the network. The vulnerability scanner 102 is configured to perform both credentialed audits and uncredentialed scans to scan certain routers, internal firewalls, external firewalls, or other hosts in the network and obtain information that may then be analysed to discover network assets and identify potential vulnerabilities in the network.
Specifically, the credentialed audits may involve the vulnerability scanner 102 using suitable authentication technologies to log into and obtain local access to the routers, internal firewalls, external firewalls, or other hosts in the network and perform any suitable operation that a local user could perform thereon without necessarily requiring a local agent. As such, the credentialed audits performed with the vulnerability scanner 102 may be used to obtain highly accurate host-based data that includes various client-side issues (e.g., missing patches, operating system settings, locally running services, etc.). The un-credentialed audits may generally include network-based scans that involve communicating packets or messages to the network assets including routers, internal firewalls, external firewalls, or other hosts in the network and observing responses thereto in order to identify certain vulnerabilities (e.g., if a particular host accepts spoofed packets, the vulnerability scanner 102 may determine that the host exposes a vulnerability that can be exploited to close established connections).
The audits are received in the form of log data by the vulnerability scanner 102 from the various network assets 104 and includes events that describe network activities, operating system activities, file modifications, USB device insertions, intrusion detection attempts, application executions, authentication attempts, and various other activities that may occur in the network.
The vulnerability scanner 102 aggregates, normalises, and correlates the events in the data logs received from the various network assets 104 distributed across the network to automatically detect statistical anomalies, identify intrusion events or other attempts to exploit previously discovered vulnerabilities in the network, or otherwise provide visibility into the network at a particular point in time, and associates each vulnerability with the source IP address. Conventionally, these vulnerabilities are manually reviewed and those that are considered critical for the business are identified for remediation.
Remediation activities are formally tracked for audit purposes using the corporate service desk tool 106, which is known in the art.
Input data analyser 108 leverages existing vulnerability data that the vulnerability scanner 102 utilises to report discovered vulnerabilities to add a layer of temporal intelligence to the results of the vulnerability scanner 102. The input data analyser 108 cross-references the existing vulnerability data for each identified network asset with the vulnerability data for the same asset taken from another point in time. The existing vulnerability data may, for example, be in the form of a structured data set gathered from the vulnerability scanner 102 or another data source (hereinafter collectively referred to as a "data source"). The cross-referenced vulnerability data provides a running delta (net difference) to track drifts (changes) for each asset over a period of time. The input data analyser 108 also leverages audit data from the corporate service desk tool 106 to correlate the observed drift for each asset against a request for remediation. In this way, drifts in network assets can be classified as either authorised or unauthorised. If the input data analyser 108 tracks a drift in a particular asset for which no remediation request has been generated, then this drift is classified as unauthorised. Conversely, if the input data analyser 108 tracks a drift in a particular asset for which a remediation request has been initiated, then this drift is classified as authorised.
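A minimal sketch of this classification step is given below. It assumes the remediation requests raised through the corporate service desk tool 106 can be retrieved as a set of asset identifiers; the function name and inputs are illustrative only.

def classify_drift(asset, drifted, assets_with_remediation_requests):
    """Label an observed drift for an asset as authorised or unauthorised."""
    if not drifted:
        return None                                   # nothing to classify for this asset
    if asset in assets_with_remediation_requests:
        return "authorised"                           # a matching remediation request exists
    return "unauthorised"                             # drift tracked with no corresponding request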
The input data analyser 108 will now be described in more detail with reference to Figure 2. The features illustrated in Figure 2 which correspond to features already described in relation to Figure 1 are denoted by like reference numerals. In the illustrated embodiment, the input data analyser 108 comprises a component 110 for collecting an input of raw data from a data source and storing it in database system 114 in a manner controlled by master logic 112 and which can be analysed, aggregated, normalised and displayed via a user interface 116.
The connection from the component 110 to the data source is initiated over a standard Application Programming Interface (API) call, defined by the data source, using connection details stored within the database system 114. The data is received from the data source via the component 110 and is stored in a record as raw data within the database system 114. The data can be imported incrementally on either a manual or automatic basis. Upon detection of a new data source, a data import is initiated by the component 110 and a new record within the database system 114 is created and populated with raw data from the data source.
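As an illustration only, an import by the component 110 might look like the sketch below; the endpoint, credential fields and payload layout are assumptions, since each data source defines its own API.

import requests
from datetime import date

def import_raw_data(connection, raw_data_db):
    """Pull the latest scan results from the data source and store them as a
    new raw-data record, as component 110 does on a manual or scheduled run."""
    response = requests.get(
        connection["api_url"],                        # endpoint defined by the data source
        headers={"Authorization": "Bearer " + connection["token"]},
        timeout=60,
    )
    response.raise_for_status()
    record = {
        "data_source": connection["name"],
        "import_date": date.today().isoformat(),      # date identifier for this import
        "rows": response.json(),                      # raw vulnerability data, structure per vendor
    }
    raw_data_db.append(record)                        # stands in for the raw data database 126
    return record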
Database system 114 is illustrated in detail in Figure 3 and consists in this particular embodiment of five databases communicatively coupled together (although it is anticipated that fewer than, or more than, five databases could be communicatively coupled together). The database system 114 consists of a master logic database 118, an analytics database 120, an API database 124, a fix basket database 122 and a raw data database 126. Raw data database 126 stores the data imported from the data source via the component 110. The input data analyser 108 is vendor agnostic and can store raw data in multiple structures in the raw data database 126 in accordance with the structure of the data source.
A relational model is used to describe a data relationship between different aspects of the same raw data stored in two different fields in the raw data database 126, taken at a particular point in time. The data relationship is stored in the master logic database 118. The master logic database 118 is a central store for all data relationships used within the input data analyser 108. The relational model comprises a parent-child hierarchy and creates a 2-dimensional view of the raw data at a particular point in time.
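A data relationship of this kind could be represented, for example, by a record as small as the following sketch; the class and field names are illustrative and simply mirror the aspects captured in TABLE 1 below.

from dataclasses import dataclass

@dataclass
class DataRelationship:
    relationship_id: str        # e.g. "XX"
    data_input_identifier: str  # the data source the relationship applies to
    parent: str                 # field treated as the PARENT, e.g. "Data ID"
    child: str                  # field treated as the CHILD, e.g. "Vulnerability Name"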
The essential aspects of the raw data captured in a data relationship are illustrated in TABLE 1 (below). Additional aspects can be used to group multiple data relationships together to form a data dashboard.
TABLE 1
Relationship ID: XX
Data Input Identifier: Data Source
Parent: Data ID
Child: Vulnerability Name

A running delta is generated from a comparison of the data relationship of the raw data taken at two different chronological points in time and stored in the analytics database 120. The most recent 2-dimensional view of the raw data given by the respective data relationship is assigned the identifier "Current Date" (CDt), while the previous 2-dimensional view is assigned the identifier "Previous Date" (PDt). For each subsequent 2-dimensional view of the data relationship, a count is incrementally increased depending on whether the running delta indicates a vulnerability has been "ADDED", "REMOVED", or whether there is "NO CHANGE", and is stored in corresponding fields of the same name in the analytics database 120. This incremental count is normalised, providing a consistent data format suitable for rendering on the user interface 116.
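One way to picture the resulting analytics record is sketched below; the keys are assumptions that follow the "ADDED", "REMOVED" and "NO CHANGE" fields named above.

# Illustrative running-delta row as it might be stored in the analytics database 120
# for one data relationship; values shown are placeholders.
running_delta_row = {
    "relationship_id": "XX",
    "current_date": "Date X",      # CDt - most recent 2-dimensional view
    "previous_date": "Date Y",     # PDt - the view it is compared against
    "ADDED": 1,                    # count of entries only present at CDt
    "REMOVED": 0,                  # count of entries only present at PDt
    "NO CHANGE": 2,                # count of entries present in both views
}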
The vulnerabilities are identified by the input data analyser 108 and, depending on the deemed importance of the vulnerabilities as determined in advance by a user of the system, certain vulnerabilities deemed to be critical are stored as entries in the fix basket database 122 along with additional audit information including "EXPECTED FIX DATE" to enrich the data associated with each critical vulnerability. Remediation activities are formally tracked based on the running delta of the asset generated after the "EXPECTED FIX DATE" to provide a "SUCCESS FACTOR" for each entry in the fix basket database 122. Critical vulnerabilities that no longer appear in subsequent vulnerability scan results performed after the "EXPECTED FIX DATE" are marked as "SUCCESS" indicating that the remediation attempt was successful. Conversely, critical vulnerabilities that remain in subsequent vulnerability scan results are marked as "FAIL". In this way, the user is provided with a unique "Success Factor" for each remediation attempt.
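By way of a sketch, a fix basket entry could carry the following fields; the names are illustrative assumptions, and the stored layout used in practice is shown later in TABLE 5.

from dataclasses import dataclass
from typing import Optional

@dataclass
class FixBasketEntry:
    vulnerability_name: str
    host_name: str
    current_date: str                     # CDt of the scan in which the vulnerability was selected
    expected_fix_date: str                # the "EXPECTED FIX DATE", entered or auto-assigned
    external_reference: str = ""          # e.g. a service desk reference for audit purposes
    success_factor: Optional[str] = None  # "SUCCESS" or "FAIL" once re-checked after the fix date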
The results of the audit are presented to the user via the user interface 116. The content of the analytics database 120 may also be presented to the user via the user interface 116. The user interface 116 supports both modification and dynamic filtering of data stored within the database system 114.
Referring now to Figure 4, the master logic 112 used to generate the running delta is shown in more detail. In the illustrated embodiment, the master logic 112 comprises a data presenter 128 for collecting and preparing the data input used for analysis by data analyser 130 to generate the running delta which is then output by data output 132 and stored in the analytics database 120.
The running delta is generated by a two-step process. Firstly, the master logic 112 reads the data stored in the raw data database 126 imported from the data source and cross-references this with the corresponding data relationship stored in the master logic database 118 to produce a unique structured output referred to as "Output List (OL)". The Output List (OL) represents a snapshot of the vulnerabilities taken at a single point in time. Finally, a comparison of chronological Output Lists taken from two different time frames is performed by the data analyser 130 to produce the running delta. The running delta generated by the data analyser 130 is output in a consistent, fixed format that is then stored in the analytics database 120.
The two-step process will now be described in more detail with reference to Figure 5. As shown, a working data list is created using the data relationship between different aspects of the same raw data stored in two different fields in the raw data database 126. The first field in the data relationship is designated the "PARENT" field and the second field is designated the "CHILD". The data relationship is analysed to generate a list of CHILD entries for each PARENT entry. Multiple data relationships (PARENT-CHILD field groups) are created for each vulnerability scan. Each data relationship provides a 2-dimensional view of the raw data at a particular point in time. The relational model used to describe a data relationship allows the data relationships to be nested in the form of a data hierarchy. An example of a data relationship is given in TABLE 2 with corresponding input and output fields of an Output List (OL) to produce a list of vulnerabilities identified on a specific date.
Parent input: Data ID
Child output: Vulnerability Name
TABLE 2
META DATA
Relationship ID: XXXX-1
Data Input Identifier: Data Source
Parent: Data ID
Child: Vulnerability Name

RESULTS
Parent Value              Child Value
Date X                    Vulnerability...1
Date X                    Vulnerability...2

TABLE 3 shows a second example of a data relationship which has multiple nested PARENT entries. The input and output fields of the Output List (OL) are given below to produce a list of host names identified as suffering from a particular type of vulnerability at a point in time.
Parent input: Vulnerability Name
Child output: Host Name
TABLE 3
META DATA
Relationship ID: XXXX-2
Data Input Identifier: Data Source
Parent: Vulnerability Name
Child: Host Name

RESULTS (Date X)
Parent Value: Vulnerability...1    Child Values: Host 1, Host...2
Parent Value: Vulnerability...2    Child Values: Host...1, Host...2, Host...7

The Output List (OL) is a structured output of the data stored in the raw data database 126 for each data relationship. The data analyser 130 performs a comparison of the Output Lists having the same Relationship ID taken at two different chronological points in time. The most recent Output List (OL) is assigned the identifier Current Date (CDt), while the previous Output List (OL) is assigned the identifier Previous Date (PDt). The result of the comparison identifies the running delta (net difference) between the Output Lists. For each comparison, a count is incrementally increased depending on whether the running delta between the two Output Lists indicates that a particular vulnerability has been "ADDED", "REMOVED", or whether there is "NO CHANGE", and is stored in corresponding fields of the same name in the analytics database 120.
Specifically, a count is incremented in the field "ADDED" if an entry present in the results of the Current Date (CDt) Output List (OL) is not present in the results of the Previous Date (PDt) Output List (OL). Conversely, a count is incremented in the field "REMOVED" if an entry present in the results of the Previous Date (PDt) Output List (OL) is not present in the results of the Current Date (CDt) Output List (OL). Finally, a count is incremented in the field "NO CHANGE" if an entry present in the results of the Current Date (CDt) Output List (OL) is also present in the results of the Previous Date (PDt) Output List (OL). An example of a comparison between two Output Lists having the same Relationship ID is given below in TABLE 4. The results of the comparison performed by the data analyser 130 are then stored in the analytics database 120 as a running delta.
TABLE 4
META DATA
Relationship ID: XXXX-2
Data Input Identifier: Data Source
Current Input Date: Date X (most recent results)
Previous Input Date: Date Y (most recent results - 1)
Parent: Vulnerability Name
Child: Host Name
RESULTS
Parent Identifier: Vulnerability...1
REMOVED: count 0
NO CHANGE: count 2 (Host...1, Host...5)
ADDED: count 1 (Host 7)

Parent Identifier: Vulnerability...2
REMOVED: count 1 (Host...7)
NO CHANGE: count 2 (Host...1, Host...2)
ADDED: count 1 (Host...5)

The data presenter 128 will now be described in detail with reference to Figure 6. The data presenter 128 has two forms of input data. The first input is the Current Date (CDt) record of raw data stored in raw data database 126 from the most recent import of data from the data source. The corresponding data relationship records for the data source, stored in the master logic database 118, form the second input into the data presenter 128. Based on the data relationships of the second input, the data presenter 128 identifies the "PARENT" and "CHILD" fields in the raw data record of the first input and begins a nested "outer loop" join, also referred to as a nested iteration process.
As shown, the first identified "PARENT" field is checked to ensure that it is not the end of the "PARENT" entries before the process initiates a nested "inner loop" and reads all of the "CHILD" entries associated with the respective "PARENT". The first "CHILD" entry is checked to ensure that it is not the end of the "CHILD" entries for that particular "PARENT" entry before moving on to the next "PARENT" entry. The value of each "CHILD" entry is then assigned to the current "PARENT" entry. This nested "inner loop" process is repeated until all "CHILD" entries have been exhausted for the current "PARENT" entry and the nested "inner loop" is exited. This process is continued until all "PARENT" entries and their respective "CHILD" entries have been exhausted for the particular data relationship and the "outer loop" is exited. The output of this "outer loop" is designated the "Current Date -Output List per PARENT" (CDt-OL/P).
This process is then repeated with the first input changed to the Previous Date (PDt) record of raw data stored in raw data database 126 from a previous import of data from the data source. This produces a second output designated the "Previous Date - Output List per PARENT" (PDt-OL/P). The process of generating the sequential PDt-OL/P and CDt-OL/P outputs is repeated for every corresponding data relationship stored in the master logic database 118 and is input to the data analyser 130 for analysis.
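A compact sketch of this nested iteration is given below; it assumes each raw-data record is a list of flat rows keyed by field name, which is an illustrative simplification of the raw data database 126.

def output_list_per_parent(rows, parent_field, child_field):
    """Group CHILD values under each PARENT value found in one raw-data record,
    producing an Output List per PARENT (e.g. the CDt-OL/P or PDt-OL/P)."""
    ol_p = {}
    for row in rows:                               # outer loop over PARENT entries
        parent_value = row[parent_field]
        children = ol_p.setdefault(parent_value, [])
        if row[child_field] not in children:       # inner loop gathers each CHILD once
            children.append(row[child_field])
    return ol_p

# Running it once on the current record and once on the previous record yields the
# two inputs passed to the data analyser 130, for example:
# cdt_ol_p = output_list_per_parent(current_rows, "Vulnerability Name", "Host Name")
# pdt_ol_p = output_list_per_parent(previous_rows, "Vulnerability Name", "Host Name")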
The data analyser 130 will now be described in detail with reference to Figure 7. As shown, the data analyser 130 has two inputs. The first input is the Previous Date - Output List per PARENT (PDt-OL/P) and the second input is the Current Date - Output List per PARENT (CDt-OL/P) for the corresponding data relationship. The data analyser 130 creates an amalgamated Output List based on the two inputs, in which all "CHILD" entries of each corresponding "PARENT" entry found in the PDt-OL/P and the CDt-OL/P are combined with any duplicates removed. This Output List is designated the "UNIQUE Output List per PARENT entry" (U-OL/P). The "CHILD" entries contained in the U-OL/P are designated "U-CHILD" in Figure 7.
The data analyser 130 then identifies the first "PARENT" in the U-OL/P and begins a nested "outer loop" join. The value of the "PARENT" is checked to ensure that it is not the end of the "PARENT" entries in the U-OL/P before the process initiates a nested "inner loop" and reads all of the "U-CHILD" entries associated with the respective "PARENT". The first "U-CHILD" entry is checked to ensure that it is not the end of the "U-CHILD" entries for that particular "PARENT" entry before moving on to the next "PARENT" entry.
The data analyser 130 cross-references the "U-CHILD" against the "CHILD" entries under the same "PARENT" in the corresponding PDt-OL/P and CDt-OL/P input lists. Using a series of logical operators, the data analyser 130 incrementally increases a count by 1 depending on whether the "U-CHILD" has been "ADDED" to the CDt-OL/P, "REMOVED" from the PDt-OL/P, or whether there is "NO CHANGE" in the "CHILD" between the PDt-OL/P and CDt-OL/P, and the result is stored in corresponding fields of the same name. The value of the "U-CHILD" is then assigned to an output category corresponding to either "ADDED", "REMOVED", or "NO CHANGE".
Specifically, a count is incremented in the field "ADDED" if a "CHILD" entry, corresponding to the respective "U-CHILD", not present in the results of the PDt-OL/P is present in the results of the CDt-OL/P. Conversely, a count is incremented in the field "REMOVED" if a "CHILD", corresponding to the respective "U-CHILD", present in the results of the PDt-OL/P is not present in the results of the CDt-OL/P. Finally, a count is incremented in the field "NO CHANGE" if a "CHILD" entry, corresponding to the respective "U-CHILD", present in the results of the CDt-OL/P is also present in the results of the PDt-OL/P. The value of the "U-CHILD" is then assigned to an output category corresponding to either "ADDED", "REMOVED", or "NO CHANGE".
This nested "inner loop" process is repeated until all "U-CHILD" entries have been exhausted for the current "PARENT" entry and the nested "inner loop" is exited. This process is continued until all "PARENT" entries and their respective "U-CHILD" entries have been exhausted for the particular data relationship and the "outer loop" is exited. This is repeated for every corresponding data relationship stored in the master logic database 118.
As shown in Figure 8, the output of this "outer loop" populates each output category with a value and count for every "PARENT" analysed within the specific data relationship and is stored in the form of a running delta in the analytics database 120 and accessible to the user via the user interface 116. The running delta may be presented to the user via the user interface 116 in a tabular or graphical form, as shown in Figure 10, to provide contextualised vulnerability data.
As described, the running delta can be used in conjunction with additional information used to enrich the data associated with each critical vulnerability to formally track remediation activities and provide a "SUCCESS FACTOR" for each entry in the fix basket database 122.
The fix basket database 122 will now be described in more detail with reference to Figure 9. As shown, a data relationship comprising a 2-dimensional view of the different aspects of the raw data at a particular point in time is provided to a user. A selection of vulnerabilities identified by the data relationship is then made by the user for remediation. The user may select the vulnerabilities associated with either a specific host or all hosts to remediate. The "Vulnerability Name" and associated "Host Name" are then added as a new entry to the fix basket database 122. Additional information identifying the data source may also be included in the entry to enrich the data.
The input data analyser 108 may be scheduled to review each entry in the fix basket database 122 after a predetermined amount of time has passed after the assigned "EXPECTED FIX DATE". The "EXPECTED FIX DATE" may be input manually by the user for each entry in the fix basket database 122 or automatically assigned based on a predetermined amount of time having passed after the assigned "Current Date" (CDt) associated with the data relationship.
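A sketch of both options is given below, with placeholder intervals standing in for whatever predetermined periods a deployment configures.

from datetime import date, timedelta

def assign_expected_fix_date(current_date, fix_window_days=30):
    """Default EXPECTED FIX DATE: a predetermined interval after the CDt."""
    return current_date + timedelta(days=fix_window_days)

def due_for_review(expected_fix_date, review_grace_days=7, today=None):
    """True once a predetermined amount of time has passed after the EXPECTED FIX DATE."""
    today = today or date.today()
    return today >= expected_fix_date + timedelta(days=review_grace_days)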
Each vulnerability selected for remediation is stored as an entry in the fix basket database 122 using the Output List per PARENT data structure illustrated in TABLE 5 (below).
TABLE 5
META DATA
Fix Basket ID: 1
Relationship ID: XXXX-2
Data Input Identifier: Data Source
Parent: Vulnerability Name
Child: Host Name
Current Date (CDt): Date X
EXPECTED FIX DATE (FDt): Date Y
External Reference: Service Desk Ref

TO FIX (Date X)
Parent Value: Vulnerability...1    Child Values: Host 1, Host...2
Parent Value: Vulnerability 2      Child Values: Host...1, Host...2, Host...7

Each entry in the fix basket database 122 is designated "Fix Date - Output List per PARENT" (FDt-OL/P). In a similar manner to that described above with reference to Figure 7, the FDt-OL/P entry can be fed into the data analyser 130 as a first input. The corresponding Current Date - Output List per PARENT (CDt-OL/P) for the corresponding data relationship, generated from the most recent import of data after the expected fix date (FDt) value from the data source, can be fed into the data analyser 130 as the second input.
The data analyser 130 creates an amalgamated Output List based on the two inputs, in which all "CHILD" entries of each corresponding "PARENT" entry found in the FDt-OL/P and the CDt-OL/P are combined with any duplicates removed. This Output List is designated the "UNIQUE Output List per PARENT entry" (U-OL/P).
The data analyser 130 then identifies the first "PARENT" in the U-OL/P and begins a nested "outer loop" join. The value of the "PARENT" is checked to ensure that it is not the end of the "PARENT" entries in the U-OL/P before the process initiates a nested "inner loop" and reads all of the "U-CHILD" entries associated with the respective "PARENT". The first "U-CHILD" entry is checked to ensure that it is not the end of the "U-CHILD" entries for that particular "PARENT" entry before moving on to the next "PARENT" entry.
The data analyser 130 cross-references the "U-CHILD" against the "CHILD" entries under the same "PARENT" in the corresponding FDt-OL/P and CDt-OL/P input lists. Using a series of logical operators, the data analyser 130 incrementally increases a count by 1 depending on whether the "U-CHILD" has been "REMOVED" from the CDt-OL/P, or whether there is "NO CHANGE" in the "CHILD" between the FDt-OL/P and CDt-OL/P. The value of the "U-CHILD" is then assigned to an output category corresponding to either "SUCCESS" if the "U-CHILD" has been "REMOVED" from the CDt-OL/P or "FAIL" if there is "NO CHANGE" in the "CHILD" between the FDt-OL/P and CDt-OL/P. A cumulative result may then be presented to the user as a SUCCESS FACTOR for an entry in the fix basket database 122.
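A sketch of this check is given below; the cumulative figure is computed here as a simple success proportion, which is an assumption, since the description does not prescribe a particular formula.

def success_factor(fdt_ol_p, cdt_ol_p):
    """Mark each fix basket CHILD as SUCCESS (removed after the fix date) or FAIL
    (still present), and derive a cumulative figure for the entry."""
    per_entry = {}
    for parent, children in fdt_ol_p.items():
        still_present = set(cdt_ol_p.get(parent, []))
        per_entry[parent] = {
            child: ("FAIL" if child in still_present else "SUCCESS")
            for child in children
        }
    outcomes = [v for results in per_entry.values() for v in results.values()]
    ratio = outcomes.count("SUCCESS") / len(outcomes) if outcomes else 1.0
    return {"per_entry": per_entry, "success_factor": ratio}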
The result may be presented to the user in a simplified tabular and/or graphical form, providing the user with contextualised vulnerability data along with a corresponding remediation success factor.
The invention is not restricted to the details of the foregoing embodiment.
For example, in the embodiment as described, the first and second nested relational data structures are based on chronologically sequential vulnerability scans of the network. Alternatively, however, the comparison can be performed between two random dates to identify drift between those dates, for example over a 12 month period, without going through every intermediate result.
Ideally, the invention will perform a chronological check. However, the user may override this to perform a bespoke check between two random dates if desired.
Insofar as embodiments of the invention described above are implementable, at least in part, using a software-controlled programmable processing device such as a general purpose processor or special-purpose processor, digital signal processor, microprocessor, or other processing device, data processing apparatus or computer system, it will be appreciated that a computer program for configuring a programmable device, apparatus or system to implement the foregoing described methods, apparatus and system is envisaged as an aspect of the present invention. The computer program may be embodied as any suitable type of code, such as source code, object code, compiled code, interpreted code, executable code, static code, dynamic code, and the like. The instructions may be implemented using any suitable high-level, low-level, object-oriented, visual, compiled and/or interpreted programming language, such as C, C++, Java, BASIC, Perl, Matlab, Pascal, Visual BASIC, ActiveX, assembly language, machine code, and so forth. A skilled person would readily understand that the term "computer" in its most general sense encompasses programmable devices such as referred to above, and data processing apparatus and computer systems.
Suitably, the computer program is stored on a carrier medium in machine readable form, for example the carrier medium may comprise memory, removable or non-removable media, erasable or non-erasable media, writeable or re-writeable media, digital or analogue media, hard disk, floppy disk, Compact Disk Read Only Memory (CD-ROM), Compact Disk Recordable (CD-R), Compact Disk Rewriteable (CD-RW), optical disk, magnetic media, magneto-optical media, removable memory cards or disks, various types of Digital Versatile Disk (DVD), subscriber identity module, tape, cassette or solid-state memory. The computer program may be supplied from a remote source embodied in a communications medium such as an electronic signal, radio frequency carrier wave or optical carrier waves. Such carrier media are also envisaged as aspects of the present invention. As used herein, the terms "comprises", "comprising", "includes", "including", "has", "having" or any other variation thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Further, unless expressly stated to the contrary, "or" refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).
In addition, use of the "a" or "an" are employed to describe elements and components of the invention. This is done merely for convenience and to give a general sense of the invention. This description should be read to include one or at least one and the singular also includes the plural unless it is obvious that it is meant otherwise.
The scope of the present disclosure includes any novel feature or combination of features disclosed herein either explicitly or implicitly or any generalisation thereof, irrespective of whether or not it relates to the claimed invention or mitigates any or all of the problems addressed by the present invention. The applicant hereby gives notice that new claims may be formulated to such features during prosecution of this application or of any such further application derived therefrom. In particular, with reference to the appended claims, features from dependent claims may be combined with those of the independent claims and features from respective independent claims may be combined in any appropriate manner and not merely in specific combinations enumerated in the claims.

Claims (25)

  1. A system for analysing the computer security of a network, the system comprising: a computer security platform implemented in a distributed computing system, the distributed computing system comprising a non-transitory computer-readable medium for storing computer instructions thereon that when executed by one or more computer processors causes the computer security platform to: import a corpus of raw data from a current vulnerability scan of the network, the raw data comprising vulnerability data associated with at least one network asset; analyse the corpus of raw data to determine a data relationship between different aspects of the raw data; generate a first nested relational data structure based on the data relationship from the current vulnerability scan of the network; import a second nested relational data structure based on the data relationship from a previous vulnerability scan of the network; merge the first and second nested relational data structures to provide contextual vulnerability data for the network asset; categorise the contextual vulnerability data; and generate a report of the contextual vulnerability data.
  2. The system of claim 1, wherein categorising the contextual vulnerability data includes at least one of: determining whether a vulnerability present in the first nested relational data structure is present in the second nested relational data structure, and if present, categorising the vulnerability as unchanged; determining whether the vulnerability present in the first nested relational data structure is not present in the second nested relational data structure, and if not present, categorising the vulnerability as added; and determining whether the vulnerability present in the second nested relational data structure is not present in the first nested relational data structure, and if not present, categorising the vulnerability as removed.
  3. The system as claimed in any previous claim, wherein the different aspects of the raw data include a scan data source identification "ID", an asset value, a vulnerability name, a host name and a scan date.
  4. The system as claimed in any previous claim, wherein the corpus of raw data comprises vulnerability data about a plurality of network assets.
  5. The system as claimed in any previous claim, wherein the corpus of raw data is imported at predetermined intervals of time.
  6. The system as claimed in any previous claim, wherein the corpus of raw data comprises a plurality of data relationships.
  7. The system as claimed in any previous claim, wherein each nested relational data structure is assigned a date identifier corresponding to the date of the vulnerability scan on which it is based.
  8. The system as claimed in claim 7, wherein the first and second nested relational data structures are based on chronologically sequential vulnerability scans of the network and/or is performed between two random dates to identify a drift between two dates.
  9. The system as claimed in any previous claim, wherein a vulnerability present in the first or second nested relational data structure is assigned a remediation activity.
  10. The system as claimed in claim 9, wherein the remediation activity includes an expected fix date.
  11. The system as claimed in claim 10, wherein the first nested relational data structure is generated after the expected fix date.
  12. The system as claimed in claim 11, further comprising: determining whether the vulnerability present in the second nested relational data structure is present in the first nested relational data structure, and if present, categorising the remediation activity as failed, otherwise categorising the remediation activity as successful.
  13. A method for analysing the computer security of a network, the method comprising: importing a corpus of raw data from a current vulnerability scan of the network, the raw data comprising vulnerability data associated with at least one network asset; analysing the corpus of raw data to determine a data relationship between different aspects of the raw data; generating a first nested relational data structure based on the data relationship from the current vulnerability scan of the network; importing a second nested relational data structure based on the data relationship from a previous vulnerability scan of the network; merging the first and second nested relational data structures to provide contextual vulnerability data for the network asset; categorising the contextual vulnerability data; and generating a report of the contextual vulnerability data.
  14. The method of claim 13, wherein categorising the contextual vulnerability data includes at least one of: determining whether a vulnerability present in the first nested relational data structure is present in the second nested relational data structure, and if present, categorising the vulnerability as unchanged; determining whether the vulnerability present in the first nested relational data structure is not present in the second nested relational data structure, and if not present, categorising the vulnerability as added; and determining whether the vulnerability present in the second nested relational data structure is not present in the first nested relational data structure, and if not present, categorising the vulnerability as removed.
  15. The method as claimed in any of claims 13 or 14, wherein the different aspects of the raw data include a scan data source identification "ID", an asset value, a vulnerability name, a host name and a scan date.
  16. The method as claimed in any of claims 13 to 15, wherein the corpus of raw data comprises vulnerability data about a plurality of network assets.
  17. The method as claimed in any of claims 13 to 16, wherein the corpus of raw data is imported at predetermined intervals of time.
  18. The method as claimed in any of claims 13 to 17, wherein the corpus of raw data comprises a plurality of data relationships.
  19. The method as claimed in any of claims 13 to 18, wherein each nested relational data structure is assigned a date identifier corresponding to the date of the vulnerability scan on which it is based.
  20. The method as claimed in claim 19, wherein the first and second nested relational data structures are based on chronologically sequential vulnerability scans of the network.
  21. The method as claimed in any of claims 13 to 20, wherein a vulnerability present in the second nested relational data structure is assigned a remediation activity.
  22. The method as claimed in claim 21, wherein the remediation activity includes an expected fix date.
  23. The method as claimed in claim 22, wherein the first nested relational data structure is generated after the expected fix date.
  24. The method as claimed in claim 23, further comprising: determining whether the vulnerability present in the second nested relational data structure is present in the first nested relational data structure, and if present, categorising the remediation activity as failed, otherwise categorising the remediation activity as successful.
  25. A system for analysing data, the system comprising: a computing system comprising a non-transitory computer-readable medium for storing computer instructions thereon that when executed by one or more computer processors causes the system to: import a corpus of raw data from a source of raw data; analyse the corpus of raw data to determine a data relationship between different aspects of the raw data; generate a first nested relational data structure based on the data relationship from the corpus of raw data; import a second nested relational data structure based on the data relationship from the importation of a previous corpus of raw data from the same source of data; merge the first and second nested relational data structures to determine which data has been added, removed or remained unchanged; categorise the added, removed or unchanged data; and generate a report of the added, removed or unchanged data.
GB1911051.9A 2019-08-02 2019-08-02 Network vulnerability analysis Pending GB2586072A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1911051.9A GB2586072A (en) 2019-08-02 2019-08-02 Network vulnerability analysis

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1911051.9A GB2586072A (en) 2019-08-02 2019-08-02 Network vulnerability analysis

Publications (2)

Publication Number Publication Date
GB201911051D0 GB201911051D0 (en) 2019-09-18
GB2586072A true GB2586072A (en) 2021-02-03

Family

ID=67990777

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1911051.9A Pending GB2586072A (en) 2019-08-02 2019-08-02 Network vulnerability analysis

Country Status (1)

Country Link
GB (1) GB2586072A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130239177A1 (en) * 2012-03-07 2013-09-12 Derek SIGURDSON Controlling enterprise access by mobile devices
US20170098087A1 (en) * 2015-10-06 2017-04-06 Assured Enterprises, Inc. Method and system for identification of security vulnerabilities
US20190102560A1 (en) * 2017-10-04 2019-04-04 Servicenow, Inc. Automated vulnerability grouping
US10270799B2 (en) * 2016-05-04 2019-04-23 Paladion Networks Private Limited Methods and systems for predicting vulnerability state of computer system
US20190166149A1 (en) * 2017-11-28 2019-05-30 Aetna Inc. Vulnerability contextualization


Also Published As

Publication number Publication date
GB201911051D0 (en) 2019-09-18
