US20140304822A1 - Systems and Methods for Managing Data Incidents - Google Patents

Systems and Methods for Managing Data Incidents

Info

Publication number
US20140304822A1
Authority
US
United States
Prior art keywords
data, risk assessment, incident, privacy, notification
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/311,253
Inventor
Mahmood Sher-Jan
Susan M. Rook
Greg L. Kotka
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Radar LLC
Original Assignee
Identity Theft Guard Solutions LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US14/311,253
Application filed by Identity Theft Guard Solutions LLC
Publication of US20140304822A1
Priority to US14/588,159 (US9483650B2)
Priority to US14/868,311 (US9781147B2)
Assigned to Identity Theft Guard Solutions, Inc. (change of name from Identity Theft Guard Solutions, LLC)
Assigned to Identity Theft Guard Solutions, LLC (assignment of assignors' interest: Kotka, Greg L.; Sher-Jan, Mahmood; Rook, Susan M.)
Assigned to Radar, Inc. (assignment of assignors' interest from Identity Theft Guard Solutions, Inc.)
Assigned to Radar, Inc. (corrective assignment to correct the assignee state address previously recorded at reel 039884, frame 0305)
Priority to US15/339,786 (US10204238B2)
Priority to US15/786,538 (US10331904B2)
Priority to US16/235,872 (US10445508B2)
Assigned to Radar, LLC (assignment of assignors' interest from Radar, Inc.)
Priority to US16/559,513 (US11023592B2)
Priority to US17/221,624 (US20210224402A1)
Status: Abandoned (current)

Classifications

    • G06F21/60: Protecting data
    • G06F21/62: Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218: Protecting access to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245: Protecting personal data, e.g. for financial or medical purposes
    • G06F21/577: Assessing vulnerabilities and evaluating computer system security
    • H04L63/08: Network security for authentication of entities
    • H04L63/1408: Detecting or protecting against malicious traffic by monitoring network traffic
    • H04L63/1416: Event detection, e.g. attack signature detection
    • H04L63/1433: Vulnerability analysis

Definitions

  • Embodiments of the disclosure relate to information privacy. More specifically, but not by way of limitation, the present technology relates to the management of data incidents.
  • the management of a data incident may comprise conducting an analysis of data incident data relative to federal and state privacy rules and generating a risk assessment and incident response plan for the data incident. Additionally, the present technology may generate notification schedules and gather/transmit notification information for data incidents having a risk assessment that is indicative of a high level of risk.
  • Data incidents involve the exposure of sensitive information such as personally identifiable information and protected health information to third parties.
  • Data incidents may comprise data breaches, privacy breaches, privacy or security incidents, and other similar events that result in the exposure of sensitive information to third parties.
  • Some of these exposures may be subject to numerous state and federal statutes that delineate requirements that are to be imposed upon the party that was entrusted to protect the data.
  • Personally identifiable information (hereinafter "PII") and protected health information (PHI), which regards healthcare-related information for individuals that is maintained by a covered entity (e.g., an entity that has been entrusted with the PHI such as a hospital, clinic, or health plan), may include, but is not limited to, healthcare, financial, political, criminal justice, biological, location, and/or ethnicity information. For brevity, all of the aforementioned types of information are referred to herein as PII/PHI.
  • the present technology may be directed to methods for managing a data incident.
  • the methods may comprise: (a) receiving, via a risk assessment server, data incident data that comprises information corresponding to the data incident; (b) automatically generating, via the risk assessment server, a risk assessment from a comparison of data incident data to privacy rules, the privacy rules comprising at least one federal rule and at least one state rule, each of the rules defining requirements associated with data incident notification laws; and (c) providing, via the risk assessment server, the risk assessment to a display device that selectively couples with the risk assessment server.
  • risk assessment server may comprise: (a) a memory for storing executable instructions; (b) a processor for executing the instructions; (c) an input module stored in memory and executable by the processor to receive data incident data, the data incident data comprising information corresponding to the data incident; (d) a risk assessment generator stored in memory and executable by the processor to generate a risk assessment from a comparison of the data incident data to privacy rules, the privacy rules comprising at least one federal rule and at least one state rule, each of the rules defining requirements associated with data incident notification laws; and (e) a notification module stored in memory and executable by the processor to provide the risk assessment to a display device that selectively couples with the risk assessment server.
  • FIG. 1 illustrates an exemplary system for practicing aspects of the present technology
  • FIG. 2 illustrates an exemplary conversion application for managing data incidents
  • FIG. 3 illustrates an exemplary GUI in the form of a data incident details page
  • FIG. 4 illustrates an exemplary GUI in the form of a data incident dashboard
  • FIG. 5 illustrates an exemplary GUI in the form of a state specific risk assessment selection and notification page
  • FIG. 6 illustrates an exemplary GUI in the form of a data sensitivity level evaluation and selected federal and state specific risk assessments page
  • FIG. 7 illustrates an exemplary GUI in the form of a federal risk assessment page
  • FIG. 8 illustrates an exemplary GUI in the form of a state specific risk assessment page
  • FIG. 9 illustrates an exemplary GUI in the form of a statute summary page
  • FIG. 10 illustrates an exemplary GUI in the form of an aggregated notification schedules page
  • FIGS. 11-13 illustrate exemplary GUIS that are utilized to collect, store, and transmit pertinent documents or data
  • FIG. 14 is a flowchart of an exemplary method for managing a data incident.
  • FIG. 15 illustrates an exemplary computing device that may be used to implement embodiments according to the present technology.
  • the present technology may be directed to managing data incidents.
  • data incident may be understood to encompass privacy incidents, security incidents, privacy breaches, data breaches, data leaks, information breaches, data spills, or other similarly related events related to the intentional or unintentional release of protected information to an untrusted environment.
  • This protected information may be referred to as personally identifiable information or protected health information (hereinafter, collectively, "PII/PHI"); protected health information is healthcare-related information maintained by a covered entity (e.g., an entity that has been entrusted with the PHI such as a hospital, clinic, or health plan).
  • PII/PHI may encompass a wide variety of information types, but non-limiting examples of PII comprise an individual's full name, a date of birth, a birthplace, genetic information, biometric information (face, finger, handwriting, etc.), national identification number (e.g., social security), vehicle registration information, driver's license numbers, credit card numbers, digital identities, and Internet Protocol addresses.
  • Other types of information may, in some instances, be categorized as PII/PHI, such as an individual's first or last name (separately), age, residence information (city, state, county, etc.), gender, ethnicity, employment (salary, employer, job description, etc.), and criminal records, just to name a few. It is noteworthy to mention that the types of information that are regarded as PII are subject to change and therefore may include more or fewer types of information than those listed above. Additionally, what constitutes PII/PHI may be specifically defined by local, state, federal, or international data privacy laws.
  • the privacy laws contemplated herein may comprise details regarding not only how an entrusted entity determines if a data incident violates the law, but also when the provision of notification to one or more privacy agencies and/or the customers of the entrusted entity is warranted.
  • the present technology is directed to generating risk assessments for data incidents.
  • These risk assessments provide specific information to the entrusted entity regarding the severity of the data incident relative to a state or federal rule. Additionally, the risk assessment provides information regarding the data sensitivity for the data incident. That is, the risk assessment may determine if the type of data that was exposed is highly sensitive information. As mentioned before, some PII/PHI may be considered more sensitive than others. For example, a social security number may be more sensitive than a gender description, although the relative sensitivity of different categories of PII/PHI is typically delineated in the privacy rules and may require delineation in the context of each data incident.
  • the present technology may determine the severity and/or data sensitivity for a data incident by collecting data incident data from an entrusted entity. This data incident data may be compared against one or more selected privacy rules to determine the severity and/or data sensitivity for the data incident. In some instances, the present technology may model the data incident data to the one or more privacy rules.
  • the privacy rules described herein may comprise the content of a state and/or federal statute.
  • the privacy rules may comprise abstracted or mathematically expressed rules that have been generated from the text of the state and/or federal statute. Applying a privacy rule to the data incident data may yield values for the severity and/or the data sensitivity of the data incident.
  • the risk assessment may provide an indication to the entrusted entity that an obligation has been triggered. More specifically, if the severity of the data incident and/or the data sensitivity of the data incident when compared to the privacy rules indicates that the data incident has violated at least one of the privacy rules, the risk assessment may include an indication that an obligation has been created. An obligation may require the entrusted entity to notify affected individuals that their PII/PHI has been potentially exposed. The obligation may also require that notification be provided to a regulating authority such as the Department of Health and Human Services (HHS), the Office for Civil Rights (OCR), the Federal Trade Commission, a state agency, or any agency that regulates data incident notification.
  • the present technology allows entrusted entities to model data incident data to privacy rules which include at least one state rule and at least one federal rule.
  • entrusted entities may model data incidents to the rules of several states to generate risk assessments of each of the states. This is particularly helpful when entrusted entities service customers in many states.
  • each of these states may have differing notification requirements, along with different metrics for determining when a data incident requires notification.
  • the risk assessment may include a risk level that is associated with a color. More specifically, a hue of the color is associated with the severity of the data incident as determined by the comparison or modeling of the data incident data.
  • the present technology may generate a notification schedule for an entrusted entity along with mechanisms that aid the entrusted entity in gathering pertinent information that is to be provided to the customer and/or one or more regulatory agencies.
  • These and other advantages of the present technology will be described in greater detail with reference to the collective FIGS. 1-15.
  • FIG. 1 illustrates an exemplary system 100 for practicing aspects of the present technology.
  • the system 100 may include a risk assessment system, hereinafter “system 105 ” that may be implemented in a cloud-based computing environment, or as a web server that is particularly purposed to manage data incidents.
  • a cloud-based computing environment is a resource that typically combines the computational power of a large grouping of processors and/or that combines the storage capacity of a large grouping of computer memories or storage devices.
  • systems that provide a cloud resource may be utilized exclusively by their owners; or such systems may be accessible to outside users who deploy applications within the computing infrastructure to obtain the benefit of large computational or storage resources.
  • the cloud may be formed, for example, by a network of web servers, with each web server (or at least a plurality thereof) providing processor and/or storage resources. These servers may manage workloads provided by multiple users (e.g., cloud resource customers or other users). Typically, each user places workload demands upon the cloud that vary in real-time, sometimes dramatically. The nature and extent of these variations typically depend on the type of business associated with the user.
  • system 105 may include a distributed group of computing devices such as web servers that do not share computing resources or workload. Additionally, the system 105 may include a single computing device, such as a web server, that has been provisioned with one or more programs that are utilized to manage data incidents.
  • End users may access and interact with the system 105 via the client device 110 through a web-based interface, as will be discussed in greater detail infra.
  • end users may access and interact with the system 105 via a downloadable program that executes on the client device 110 .
  • the system 105 may selectively and communicatively couple with a client device 110 via a network connection 115 .
  • the network connection 115 may include any one of a number of private and public communications mediums such as the Internet.
  • system 105 may collect and transmit pertinent information to regulatory agencies, such as regulatory agency 120 , as will be discussed in greater detail infra. In some instances, notification may also be provided to affected individuals 125 .
  • the system 105 may be generally described as a mechanism for managing data incidents.
  • the system 105 may manage a data incident by collecting data incident data for the data incident and then modeling the data incident data to privacy rules.
  • the privacy rules may include at least one state rule and at least one federal rule.
  • the modeling of the data incident data may be utilized to generate a risk assessment for the data incident.
  • the risk assessment may be utilized by an entrusted entity to determine how best to respond to the data incident.
  • the system 105 is provided with a risk assessment application 200 that will be described in greater detail with reference to FIG. 2 .
  • FIG. 2 illustrates a risk assessment application, hereinafter referred to as application 200 .
  • the application 200 may generally include a user interface module 205 , an input module 210 , a risk assessment generator 215 , a notification module 220 , and a reporting module 225 . It is noteworthy that the application 200 may include additional modules, engines, or components, and still fall within the scope of the present technology. Moreover, the functionalities of two or more modules, engines, generators, or other components may be combined into a single component.
  • module may also refer to any of an application-specific integrated circuit (“ASIC”), an electronic circuit, a processor (shared, dedicated, or group) that executes one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
  • individual modules of the application 200 may include separately configured web servers.
  • the application 200 may be provisioned with a cloud.
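  • For orientation only, the following is a compact sketch of that modular layout; it is not taken from the patent, and the class names simply mirror the module names above while the method signatures are illustrative assumptions.

```python
# Compact, illustrative sketch of the modular layout described above.
# Class names mirror the named modules; signatures and behavior are assumptions.
class UserInterfaceModule:
    def render(self, page: str, context: dict) -> str:
        return f"GUI page '{page}' with fields {sorted(context)}"


class InputModule:
    def receive(self, form_data: dict) -> dict:
        # Collect and normalize data incident data submitted by the entrusted entity.
        return {k: v for k, v in form_data.items() if v not in (None, "")}


class RiskAssessmentGenerator:
    def generate(self, incident: dict, rules: list) -> list:
        # Placeholder: a real generator models the incident data against each rule.
        return [{"rule": r, "incident": incident.get("id")} for r in rules]


class NotificationModule:
    def provide(self, assessments: list) -> None:
        print(f"providing {len(assessments)} risk assessment(s) to the display device")


class ReportingModule:
    def transmit(self, documents: list) -> None:
        print(f"transmitting {len(documents)} document(s) to reporting authorities")


class RiskAssessmentApplication:
    """Application 200 as a composition of the five modules (sketch only)."""
    def __init__(self):
        self.ui = UserInterfaceModule()
        self.inputs = InputModule()
        self.generator = RiskAssessmentGenerator()
        self.notifier = NotificationModule()
        self.reporter = ReportingModule()
```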
  • the application 200 allows entrusted entities to input data incident data, have one or more risk assessments generated, and receive the one or more risk assessments, along with notification schedules, as required.
  • An entrusted entity may interact with the application 200 via a graphical user interface that is provisioned as a web-based interface.
  • the web-based interface may be generated by the user interface module 205 .
  • the user interface module 205 may generate a plurality of different graphical user interfaces that allow individuals associated with the entrusted entity (e.g., privacy officer, compliance officer, security officer, attorney, employee, agent, etc.) to interact with the application 200. Examples of graphical user interfaces that are generated by the user interface module 205 are provided in FIGS. 3-13, which will be described in greater detail infra.
  • the input module 210 may be executed to receive data incident data from the entrusted entity. It is noteworthy that the user interface module 205 may generate different types of graphical user interfaces that are tailored to obtain specific types of data incident data from the entrusted entity.
  • the entrusted entity may establish a profile that may be utilized to determine if the entity that is using the application 200 is, in fact, an entrusted entity. It is noteworthy to mention that the determination of what entities are entrusted entities depends upon the privacy rule. For example, an entity may be considered to be an entrusted entity under a particular federal statute, but may not be labeled an entrusted entity under one or more state statutes. Likewise, different states may have discrepant methods for determining who constitutes an entrusted entity.
  • the input module 210 may be executed to solicit pertinent information from the entity that may be utilized to determine if the entity is an entrusted entity.
  • the entity may specify a plurality of states in which they conduct business, or the states of residence/domicile for customers with which they conduct business.
  • Pertinent data incident data may include the type of data that was compromised, the date of compromise, the amount of data that was compromised, whether security measures were in place (e.g., encryption, redaction, etc.), whether the incident was intentional or unintentional, whether the incident was malicious or non-malicious, how the data was compromised (e.g., theft of a laptop, database security failure, lost storage media, hacked application, or a hacked computing device such as a web server, email server, or content repository), and other types of information that assist in determining a risk level for the data incident as well as any notification obligations.
  • the input module 210 may select questions that solicit data that is particularly relevant to the privacy rules to which the entrusted entity is subject. For example, if a privacy rule specifies that a threshold amount of records must be exposed in order to create an obligation, the end user may be asked if their amount of exposed records meets or exceeds that threshold amount. This type of tailored questioning narrows the analysis that is performed of the data incident data and improves the efficiency of the risk assessment process.
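  • To make the threshold-driven questioning concrete, the following is a minimal sketch, not drawn from the patent, of how an input module might filter its question bank against the privacy rules in scope; the rule fields, question text, and numbers are illustrative assumptions.

```python
# Illustrative sketch only: selecting intake questions that are relevant to the
# privacy rules in scope. Field names (record_threshold, covers_phi) are assumptions.
privacy_rules = [
    {"name": "Hypothetical state breach statute", "record_threshold": 500, "covers_phi": False},
    {"name": "Hypothetical federal health rule", "record_threshold": None, "covers_phi": True},
]

question_bank = [
    {"id": "records_over_threshold", "needs": "record_threshold",
     "text": "Did the incident expose {record_threshold} or more records?"},
    {"id": "phi_involved", "needs": "covers_phi",
     "text": "Did the exposed data include protected health information?"},
]


def select_questions(rules, bank):
    """Keep only questions whose triggering attribute appears in at least one rule."""
    selected = []
    for question in bank:
        for rule in rules:
            value = rule.get(question["needs"])
            if value:  # the rule actually uses this attribute
                selected.append(question["text"].format(**rule))
                break
    return selected


if __name__ == "__main__":
    for q in select_questions(privacy_rules, question_bank):
        print(q)
```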
  • the input module 210 may generate a summary of the data incident data (or at least a portion of the data) that is provided to the entrusted entity via a graphical user interface generated by the user interface module 205.
  • the input module 210 may be configured to solicit confirmation from the entrusted entity that the data incident data in the summary is correct. If the data is incorrect, the entrusted entity may go back and correct the errant data.
  • the input module 210 may solicit and receive one or more selections of one or more states from the entrusted entity. Using the selections, the input module 210 may select one or more state statutes based upon the one or more selections. Also, the input module 210 may generate at least one state rule for each selected state statute. Additionally, one or more federal rules may be selected and generated as well.
  • the input module 210 may generate a state or federal privacy rule by evaluating the state/federal statute and creating a plurality of qualifications from the statutes.
  • Qualifications for a statute may include, for example, thresholds or formulas that are used to determine if the data incident data of a data incident violates the statute. Stated otherwise, these qualifications may be used as a mathematical model of a statute. Data incident data may be evaluated in light of the model. The resultant modeling may be used to generate a risk assessment for the data incident.
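  • As a rough illustration of the "qualifications as a mathematical model" idea, the sketch below encodes a hypothetical statute as a small set of threshold checks and evaluates incident data against them; the specific fields, limits, and the safe-harbor exception are invented for the example and are not drawn from any actual statute or from the patent itself.

```python
# Hedged sketch: one privacy rule expressed as threshold "qualifications" and
# applied to data incident data. All numbers and field names are hypothetical.
from dataclasses import dataclass


@dataclass
class PrivacyRule:
    name: str
    record_threshold: int          # minimum exposed records before the rule applies
    sensitive_fields: frozenset    # data elements the statute treats as sensitive
    encryption_safe_harbor: bool   # exception: encrypted data may be exempt

    def evaluate(self, incident: dict) -> dict:
        """Model the incident against this rule's qualifications."""
        exposed = set(incident["exposed_fields"])
        severity = incident["record_count"] / max(self.record_threshold, 1)
        sensitivity = len(exposed & self.sensitive_fields) / max(len(self.sensitive_fields), 1)
        exempt = self.encryption_safe_harbor and incident.get("encrypted", False)
        return {
            "rule": self.name,
            "severity": round(severity, 2),           # >= 1.0 means the threshold is met
            "data_sensitivity": round(sensitivity, 2),
            "exception_applies": exempt,
        }


if __name__ == "__main__":
    rule = PrivacyRule("Hypothetical State Rule", 500,
                       frozenset({"ssn", "medical_record_number"}), True)
    incident = {"record_count": 1200,
                "exposed_fields": ["ssn", "date_of_birth"],
                "encrypted": False}
    print(rule.evaluate(incident))
```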
  • the risk assessment generator 215 may be executed to generate one or more risk assessments for the data incident.
  • the risk assessment generator 215 may model the data incident data to the selected or determined privacy rules to determine if an obligation has been triggered under a privacy rule.
  • risk assessments may be generated by modeling the data incident data to at least one state rule and at least one federal rule.
  • the risk assessment may combine risk levels for each rule into a single risk assessment, or individual risk assessments may be generated for each rule.
  • Modeling of the data incident data to a privacy rule (either state or federal) by the risk assessment generator 215 may result in the generation of a severity value and a data sensitivity value for the data incident.
  • the severity value may represent the extent to which PII/PHI has been compromised, while the data sensitivity value may represent the relative sensitivity of the PII/PHI that was compromised. These two factors may independently or dependently serve as the basis for determining if a notification obligation exists. For example, if the severity value meets or exceeds a threshold amount, a notification obligation may exist. If the data sensitivity value meets or exceeds a threshold amount, a notification obligation may exist. In some instances, a notification obligation may only exist if the severity value and the data sensitivity value both exceed threshold amounts. Again, the threshold amounts are specified by the particular privacy rule that is being applied to the data incident data.
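  • The paragraph above describes ways in which the two thresholds can combine; a minimal sketch of that decision logic follows. The mode names and threshold values are assumptions made for illustration, since each statute defines its own.

```python
# Sketch of the notification-obligation decision described above. The mode names
# ("either", "both") and the default threshold values are illustrative, not statutory.
def notification_obligation(severity: float, data_sensitivity: float,
                            severity_threshold: float = 1.0,
                            sensitivity_threshold: float = 0.5,
                            mode: str = "either") -> bool:
    severe = severity >= severity_threshold
    sensitive = data_sensitivity >= sensitivity_threshold
    if mode == "either":   # each factor can independently trigger an obligation
        return severe or sensitive
    if mode == "both":     # some rules require both thresholds to be exceeded
        return severe and sensitive
    raise ValueError(f"unknown combination mode: {mode}")


# Example: severity alone triggers notification under an "either" style rule.
assert notification_obligation(1.3, 0.2, mode="either") is True
assert notification_obligation(1.3, 0.2, mode="both") is False
```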
  • the risk assessment generator 215 may also determine and apply exceptions that exist in a state or federal statute during the generation of a risk assessment. These exceptions may be noted and included in the risk assessment.
  • the risk assessment generator 215 may create a visual indicator such as a risk level or heat map that assists the entrusted entity in determining if a data incident is relatively severe or is relatively benign.
  • This visual indicator may be included in the risk assessment.
  • a risk assessment may include a risk level that includes a visual indicator such as a colored object.
  • a hue of the object is associated with the severity of the data incident where red may indicate a severe risk and green may indicate a benign risk, with orange or yellow hues falling somewhere therebetween. Examples of heat maps and risk level indicators are illustrated in FIG. 7.
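  • As a hedged illustration of the color-coded risk level, the small helper below maps a combined risk score onto the green/yellow/orange/red scale described above; the numeric score bands are invented for the example and are not prescribed by the patent.

```python
# Illustrative only: mapping a combined risk score to the hue scale described above.
def risk_color(risk_score: float) -> str:
    if risk_score < 0.25:
        return "green"    # relatively benign
    if risk_score < 0.5:
        return "yellow"
    if risk_score < 0.75:
        return "orange"
    return "red"          # severe

print(risk_color(0.9))  # -> red
```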
  • the risk assessment generator 215 may generate an outline of key information about the state statute that was utilized to generate the state specific risk assessment. This outline may be displayed to the entrusted entity via a user interface.
  • the notification module 220 may be executed to generate a notification schedule.
  • the notification schedule may be generated based upon a date associated with the data incident. That is, the statute may specify when notification is to occur, relative to the date that PII was exposed.
  • the notification schedule informs the entrusted entity as to what types of information are to be provided, along with the regulatory bodies to which the information should be provided.
  • the notification schedule may be generated from the statute itself.
  • a statute may specify that the data incident data (or a portion of the data incident data) collected by the input module 210 should be provided to a particular state agency within a predetermined period of time.
  • the notification schedule may include notification dates for each state agency.
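  • A minimal sketch of the deadline computation follows: given a discovery date and a per-statute notification window, it produces the kind of per-agency schedule described above. The agencies and day counts are made up for illustration and are not real statutory deadlines; the discovery date reuses the FIG. 3 example.

```python
# Hypothetical example: building a notification schedule from statutory windows.
# Agencies and day counts are illustrative assumptions, not real statutory deadlines.
from datetime import date, timedelta

statutory_windows = {
    "State A Attorney General": 30,      # days after discovery
    "State B Consumer Protection": 45,
    "Federal regulator": 60,
}


def notification_schedule(discovery_date: date, windows: dict) -> list:
    schedule = [(agency, discovery_date + timedelta(days=days))
                for agency, days in windows.items()]
    return sorted(schedule, key=lambda item: item[1])  # earliest deadline first


for agency, deadline in notification_schedule(date(2012, 1, 16), statutory_windows):
    print(f"{deadline.isoformat()}  notify {agency}")
```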
  • the reporting module 225 may be executed to gather pertinent documents or other information from the entrusted entity and transmit these documents to the required reporting authorities.
  • the reporting module 225 may prompt the entrusted entity to attach documents via a user interface. Once attached, these documents/data may be stored in a secured repository for submission to a regulatory agency. In other instances, the entrusted entity may transmit required information directly to the regulatory agency.
  • reporting module 225 may provide required notifications to affected individuals, such as the individuals associated with the PII/PHI that was compromised.
  • FIGS. 3-13 illustrate various exemplary graphical user interfaces (GUI) that are generated by the user interface module 205 .
  • FIG. 3 illustrates an exemplary GUI in the form of a data incident summary page.
  • the summary page 300 includes a plurality of received answers to questions that were provided to the entrusted entity. Responses that were received indicate that the data incident involved the loss of a cellular telephone, an incident date of Jan. 2, 2012, an incident discovery date of Jan. 16, 2012, and other pertinent data incident data.
  • FIG. 4 illustrates an exemplary GUI in the form of a data incident dashboard page 400 .
  • the page 400 includes listing of pending and completed risk assessments for a plurality of data incidents. Each entry may include a risk indicator having a particular color to help the entrusted entity in quickly determining data incidents that are high risk.
  • a risk indicator may be associated with a particular privacy rule. For example, a risk indicator for an Employee Snooping data incident indicates that a moderately high risk is associated with the data incident relative to HITECH rules (e.g., rules associated with the compromise of PHI). This moderately high risk is indicated by a yellow dot placed within a row of a “HITECH Status” column. Additionally, a severe risk is associated with a state privacy rule. This severe risk is indicated by a red dot placed within a row of a “State Impact” column.
  • FIG. 5 illustrates an exemplary GUI in the form of a state specific selection and notification page 500 .
  • the notification page is shown as comprising an image that informs the entrusted entity that six states have been affected by the data incident. To view a risk assessment for each state, the entrusted entity may click on any of the states listed in the leftmost frame.
  • FIG. 6 illustrates an exemplary GUI in the form of a data sensitivity level evaluation page 600 .
  • the page includes a plurality of data sensitivity indicators showing the sensitivity of the different types of PII/PHI that were compromised by the data incident.
  • medical record numbers are shown in red as being highly sensitive.
  • medical record numbers may pose financial, reputational, and medical harm, which are just some of the dimensions of potential harm caused by compromise of PII/PHI.
  • the data incident also compromised individuals' dates of birth. As determined by the entrusted entity, that type of PII/PHI is not considered highly sensitive and thus has been depicted in green.
  • FIG. 7 illustrates an exemplary GUI in the form of a risk assessment page 700 .
  • the risk assessment page 700 includes a heat map 705 and corresponding risk level indicator 715 , which is placed within the heat map 705 .
  • the heat map 705 includes a grid where vertical placement indicates data sensitivity level and horizontal placement indicates severity level. As is shown, as the sensitivity and severity levels increase, so do the odds that the data incident may trigger an obligation to notify affected parties. In this instance, the risk level is high because the sensitivity level is high and the severity level is extreme.
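  • To make the grid placement concrete, the sketch below maps qualitative sensitivity and severity levels to a cell in a small heat map and reads off an overall risk level; the level names and the lookup table are illustrative assumptions rather than values taken from the patent.

```python
# Illustrative sketch of the heat map placement: vertical axis = data sensitivity,
# horizontal axis = severity. The 3x4 grid and its labels are assumptions.
SENSITIVITY_LEVELS = ["low", "medium", "high"]            # rows (bottom to top)
SEVERITY_LEVELS = ["low", "moderate", "high", "extreme"]  # columns (left to right)

# risk level for each (sensitivity row, severity column)
HEAT_MAP = [
    ["low",      "low",      "moderate", "moderate"],  # low sensitivity
    ["low",      "moderate", "high",     "high"],      # medium sensitivity
    ["moderate", "high",     "high",     "high"],      # high sensitivity
]


def risk_level(sensitivity: str, severity: str) -> str:
    row = SENSITIVITY_LEVELS.index(sensitivity)
    col = SEVERITY_LEVELS.index(severity)
    return HEAT_MAP[row][col]


print(risk_level("high", "extreme"))  # -> high, matching the example in the text
```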
  • Positioned below the heat map 705 is a notification schedule that includes not only the obligations for the entrusted entity, but also the expected notification dates. Again, this schedule may be based upon requirements included in the violated statute.
  • FIG. 8 illustrates an exemplary GUI in the form of a state specific risk assessment page 800 .
  • the page 800 includes a risk assessment for the State of California. The state impact is shown as high and a summary of the types of PII/PHI that were exposed are summarized below the state impact indicator.
  • a notification schedule is included on the state specific risk assessment page 800. It is noteworthy that a state specific risk assessment page may be generated for each affected state (such as the affected states listed on the state specific selection and notification page 500 of FIG. 5).
  • FIG. 9 illustrates an exemplary GUI in the form of a statute summary page 900 .
  • the statute summary page 900 includes a copy (or a portion) of the privacy statutes (California Civil Code 1798.29 & 1798.82; California Health and Safety Code 1280.15) that were utilized to generate the state specific risk assessment that was provided in FIG. 8.
  • the summary also includes whether the state statutes include a harm test and exceptions, which are flagged by the risk assessment generator 215 according to the specific privacy statutes.
  • FIG. 10 illustrates an exemplary GUI in the form of an aggregated notification page 1000 .
  • the page 1000 includes a notification schedule for each affected privacy statute (e.g., federal and state(s)) relative to one or more data incidents.
  • a list of notification events is provided and the end user may utilize the check boxes to select which states (or federal) risk assessment notification schedules are displayed.
  • FIGS. 11-13 illustrate exemplary GUIS that are utilized to collect, store, and transmit pertinent documents or data.
  • FIG. 11 illustrates an attachments page 1100 that shows a plurality of documents that have been uploaded to the system such as media notification, attorney general notification, privacy policy, and corrective action plan. Positioned adjacent to the list of documents is a checklist that includes all the pertinent documentation that is to be provided to regulatory authorities, the media, and/or affected individuals. As the required data are uploaded, each required data category is noted with a green check mark. Missing elements can be easily determined and uploaded.
  • FIG. 12 illustrates an upload page 1200 that may be utilized by an entrusted entity to upload and categorize required compliance information (e.g., documents shown in FIG. 11 ).
  • Files may be tagged with metadata linking them to the related federal and states risk assessments before they are stored in a content repository or transmitted to an appropriate party.
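  • The sketch below illustrates one way such metadata tagging might look: each uploaded file is wrapped in a record that links it to the risk assessments it supports before storage or transmission. The structure and field names are assumptions made for illustration.

```python
# Hedged sketch: tagging an uploaded compliance document with metadata that links
# it to related risk assessments. Structure and field names are illustrative.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class ComplianceDocument:
    filename: str
    category: str                      # e.g. "attorney general notification"
    linked_assessments: list           # IDs of the federal/state risk assessments
    uploaded_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())


repository = []


def upload(filename: str, category: str, linked_assessments: list) -> ComplianceDocument:
    doc = ComplianceDocument(filename, category, list(linked_assessments))
    repository.append(doc)             # stand-in for a secured content repository
    return doc


upload("media_notification.pdf", "media notification", ["HITECH-2012-001", "CA-2012-001"])
print(repository[0])
```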
  • FIG. 13 illustrates an exemplary time stamped notation and actions page 1300 that displays notes entered into the system by a particular end user.
  • Actions may include a note that a particular employee is to be retrained and certified. Any type of related action such as a remedial action, uploading of a file, or other notification and/or compliance related action may be noted and associated with a particular risk assessment.
  • FIG. 14 illustrates a flowchart of an exemplary method for managing a data incident.
  • the method may include a step 1405 of receiving data incident data.
  • the data incident data may include information that pertains or corresponds to the data incident.
  • the method may include a step 1410 of automatically generating a risk assessment from a comparison of data incident data to privacy rules.
  • the privacy rules may comprise at least one federal rule and at least one state rule, with each of the rules defining requirements associated with data incident notification laws. Additionally, the comparison may include modeling the data incident data against the privacy rules.
  • the method may include a step 1415 of providing the risk assessment to a display device that selectively couples with a risk assessment server. It is noteworthy to mention that the risk assessment may include a visual representation of the risk associated with a data incident relative to the privacy rules.
  • the method may include a step 1420 of generating a notification schedule for the data incident, along with an optional step 1425 of transmitting notification information to a regulatory agency and/or affected individuals (e.g., those whose PII/PHI has been compromised).
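  • Tying the steps of FIG. 14 together, here is a hedged end-to-end sketch of the flow (receive data, generate assessments, provide results, build a schedule, transmit notifications). Every function below is a simplified stand-in for the corresponding step, not the actual server code, and the rules and thresholds are invented for the example.

```python
# Simplified stand-ins for steps 1405-1425; not the actual server implementation.
def receive_data_incident_data(form: dict) -> dict:                    # step 1405
    return dict(form)


def generate_risk_assessments(incident: dict, rules: list) -> list:    # step 1410
    # In the described system, each privacy rule is modeled against the incident data.
    return [{"rule": rule["name"],
             "risk": "high" if incident["records"] >= rule["threshold"] else "low"}
            for rule in rules]


def provide_to_display(assessments: list) -> None:                     # step 1415
    for a in assessments:
        print(f"{a['rule']}: {a['risk']} risk")


def generate_notification_schedule(assessments: list) -> list:         # step 1420
    return [a["rule"] for a in assessments if a["risk"] == "high"]


def transmit_notifications(recipients: list) -> None:                  # step 1425 (optional)
    for r in recipients:
        print(f"notification sent under {r}")


incident = receive_data_incident_data({"records": 1200, "type": "lost laptop"})
rules = [{"name": "Hypothetical federal rule", "threshold": 500},
         {"name": "Hypothetical state rule", "threshold": 2000}]
assessments = generate_risk_assessments(incident, rules)
provide_to_display(assessments)
transmit_notifications(generate_notification_schedule(assessments))
```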
  • FIG. 15 illustrates an exemplary computing device 1500 that may be used to implement an embodiment of the present technology.
  • the computing device 1500 of FIG. 15 (or portions thereof) may be implemented in the context of system 105 ( FIG. 1 ).
  • the computing device 1500 of FIG. 15 includes one or more processors 1510 and main memory 1520 .
  • Main memory 1520 stores, in part, instructions and data for execution by processor 1510 .
  • Main memory 1520 may store the executable code when in operation.
  • the system 1500 of FIG. 15 further includes a mass storage device 1530 , portable storage medium drive(s) 1540 , output devices 1550 , user input devices 1560 , a graphics display 1570 , and peripheral devices 1580 .
  • The components shown in FIG. 15 are depicted as being connected via a single bus 1590.
  • the components may be connected through one or more data transport means.
  • Processor unit 1510 and main memory 1520 may be connected via a local microprocessor bus, and the mass storage device 1530 , peripheral device(s) 1580 , portable storage device 1540 , and display system 1570 may be connected via one or more input/output (I/O) buses.
  • Mass storage device 1530 which may be implemented with a magnetic disk drive or an optical disk drive, is a non-volatile storage device for storing data and instructions for use by processor unit 1510 . Mass storage device 1530 may store the system software for implementing embodiments of the present invention for purposes of loading that software into main memory 1520 .
  • Portable storage device 1540 operates in conjunction with a portable non-volatile storage medium, such as a floppy disk, compact disk, digital video disc, or USB storage device, to input and output data and code to and from the computing device 1500 of FIG. 15 .
  • the system software for implementing embodiments of the present invention may be stored on such a portable medium and input to the computer device 1500 via the portable storage device 1540 .
  • Input devices 1560 provide a portion of a user interface.
  • Input devices 1560 may include an alphanumeric keypad, such as a keyboard, for inputting alpha-numeric and other information, or a pointing device, such as a mouse, a trackball, stylus, or cursor direction keys.
  • the computing device 1500 as shown in FIG. 15 includes output devices 1550 . Suitable output devices include speakers, printers, network interfaces, and monitors.
  • Display system 1570 may include a liquid crystal display (LCD) or other suitable display device.
  • Display system 1570 receives textual and graphical information, and processes the information for output to the display device.
  • Peripherals 1580 may include any type of computer support device to add additional functionality to the computer system.
  • Peripheral device(s) 1580 may include a modem or a router.
  • the components provided in the computing device 1500 of FIG. 15 are those typically found in computer systems that may be suitable for use with embodiments of the present invention and are intended to represent a broad category of such computer components that are well known in the art.
  • the computing device 1500 of FIG. 15 may be a personal computer, hand held computing device, telephone, mobile computing device, workstation, server, minicomputer, mainframe computer, or any other computing device.
  • the computer may also include different bus configurations, networked platforms, multi-processor platforms, etc.
  • Various operating systems may be used including Unix, Linux, Windows, Macintosh OS, Palm OS, Android, iPhone OS and other suitable operating systems.
  • the computing device 1500 may also utilize web browser applications that display the web-based graphical user interfaces described herein.
  • Exemplary web browser applications may include, but are not limited to, Internet Explorer, Firefox, Safari, Chrome, and other web browser applications that would be known to one of ordinary skill in the art with the present disclosure before them.
  • when the computing device 1500 is a mobile computing device, the computing device 1500 may likewise include mobile web browser applications.
  • Computer-readable storage media refer to any medium or media that participate in providing instructions to a central processing unit (CPU), a processor, a microcontroller, or the like. Such media may take forms including, but not limited to, non-volatile and volatile media such as optical or magnetic disks and dynamic memory, respectively. Common forms of computer-readable storage media include a floppy disk, a flexible disk, a hard disk, magnetic tape, any other magnetic storage medium, a CD-ROM disk, digital video disk (DVD), any other optical storage medium, RAM, PROM, EPROM, a FLASH EPROM, or any other memory chip or cartridge.

Abstract

Systems and methods for managing a data incident are provided herein. Exemplary methods may include receiving data breach data that comprises information corresponding to the data breach, automatically generating a risk assessment from a comparison of data breach data to privacy rules, the privacy rules comprising at least one federal rule and at least one state rule, each of the rules defining requirements associated with data breach notification laws, and providing the risk assessment to a display device that selectively couples with the risk assessment server.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • The present application is a continuation of U.S. Non-Provisional patent application Ser. No. 13/691,661 filed on Nov. 30, 2012 titled “Systems and Methods for Managing Data Incidents”, which is a continuation of U.S. Non-Provisional patent application Ser. No. 13/396,558 filed on Feb. 14, 2012 titled “Systems and Methods for Managing Data Incidents”, which are hereby incorporated by reference.
  • FIELD OF THE TECHNOLOGY
  • Embodiments of the disclosure relate to information privacy. More specifically, but not by way of limitation, the present technology relates to the management of data incidents. The management of a data incident may comprise conducting an analysis of data incident data relative to federal and state privacy rules and generating a risk assessment and incident response plan for the data incident. Additionally, the present technology may generate notification schedules and gather/transmit notification information for data incidents having a risk assessment that is indicative of a high level of risk.
  • BACKGROUND OF THE DISCLOSURE
  • Data incidents involve the exposure of sensitive information such as personally identifiable information and protected health information to third parties. Data incidents may comprise data breaches, privacy breaches, privacy or security incidents, and other similar events that result in the exposure of sensitive information to third parties. Some of these exposures may be subject to numerous state and federal statutes that delineate requirements that are to be imposed upon the party that was entrusted to protect the data. Personally identifiable information (hereinafter "PII") and protected health information (PHI), which regards healthcare-related information for individuals that is maintained by a covered entity (e.g., an entity that has been entrusted with the PHI such as a hospital, clinic, health plan, and so forth), may include, but is not limited to, healthcare, financial, political, criminal justice, biological, location, and/or ethnicity information. For purposes of brevity, although each of these types of PII and PHI may have distinct nomenclature, all of the aforementioned types of information will be referred to herein as PII/PHI.
  • SUMMARY OF THE DISCLOSURE
  • According to some embodiments, the present technology may be directed to methods for managing a data incident. The methods may comprise: (a) receiving, via a risk assessment server, data incident data that comprises information corresponding to the data incident; (b) automatically generating, via the risk assessment server, a risk assessment from a comparison of data incident data to privacy rules, the privacy rules comprising at least one federal rule and at least one state rule, each of the rules defining requirements associated with data incident notification laws; and (c) providing, via the risk assessment server, the risk assessment to a display device that selectively couples with the risk assessment server.
  • According to other embodiments, the present technology is directed to a risk assessment server for managing a data incident. In some instances, the risk assessment server may comprise: (a) a memory for storing executable instructions; (b) a processor for executing the instructions; (c) an input module stored in memory and executable by the processor to receive data incident data, the data incident data comprising information corresponding to the data incident; (d) a risk assessment generator stored in memory and executable by the processor to generate a risk assessment from a comparison of the data incident data to privacy rules, the privacy rules comprising at least one federal rule and at least one state rule, each of the rules defining requirements associated with data incident notification laws; and (e) a notification module stored in memory and executable by the processor to provide the risk assessment to a display device that selectively couples with the risk assessment server.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The accompanying drawings, where like reference numerals refer to identical or functionally similar elements throughout the separate views, together with the detailed description below, are incorporated in and form part of the specification, and serve to further illustrate embodiments of concepts that include the claimed disclosure, and explain various principles and advantages of those embodiments.
  • The methods and systems disclosed herein have been represented where appropriate by conventional symbols in the drawings, showing only those specific details that are pertinent to understanding the embodiments of the present disclosure so as not to obscure the disclosure with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.
  • FIG. 1 illustrates an exemplary system for practicing aspects of the present technology;
  • FIG. 2 illustrates an exemplary conversion application for managing data incidents;
  • FIG. 3 illustrates an exemplary GUI in the form of a data incident details page;
  • FIG. 4 illustrates an exemplary GUI in the form of a data incident dashboard;
  • FIG. 5 illustrates an exemplary GUI in the form of a state specific risk assessment selection and notification page;
  • FIG. 6 illustrates an exemplary GUI in the form of a data sensitivity level evaluation and selected federal and state specific risk assessments page;
  • FIG. 7 illustrates an exemplary GUI in the form of a federal risk assessment page;
  • FIG. 8 illustrates an exemplary GUI in the form of a state specific risk assessment page;
  • FIG. 9 illustrates an exemplary GUI in the form of a statute summary page;
  • FIG. 10 illustrates an exemplary GUI in the form of an aggregated notification schedules page;
  • FIGS. 11-13 illustrate exemplary GUIS that are utilized to collect, store, and transmit pertinent documents or data;
  • FIG. 14 is a flowchart of an exemplary method for managing a data incident; and
  • FIG. 15 illustrates an exemplary computing device that may be used to implement embodiments according to the present technology.
  • DETAILED DESCRIPTION
  • In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the disclosure. It will be apparent, however, to one skilled in the art, that the disclosure may be practiced without these specific details. In other instances, structures and devices are shown in block diagram form only in order to avoid obscuring the disclosure.
  • Generally speaking, the present technology may be directed to managing data incidents. It will be understood that the term "data incident" encompasses privacy incidents, security incidents, privacy breaches, data breaches, data leaks, information breaches, data spills, or other similarly related events related to the intentional or unintentional release of protected information to an untrusted environment. This protected information may be referred to as personally identifiable information or protected health information (hereinafter, collectively, "PII/PHI"); protected health information is healthcare-related information maintained by a covered entity (e.g., an entity that has been entrusted with the PHI such as a hospital, clinic, or health plan).
  • PII/PHI may encompass a wide variety of information types, but non-limiting examples of PII comprise an individual's full name, a date of birth, a birthplace, genetic information, biometric information (face, finger, handwriting, etc.), national identification number (e.g., social security), vehicle registration information, driver's license numbers, credit card numbers, digital identities, and Internet Protocol addresses.
  • Other types of information may, in some instances, be categorized as PII/PHI, such as an individual's first or last name (separately), age, residence information (city, state, county, etc.), gender, ethnicity, employment (salary, employer, job description, etc.), and criminal records, just to name a few. It is noteworthy to mention that the types of information that are regarded as PII are subject to change and therefore may include more or fewer types of information than those listed above. Additionally, what constitutes PII/PHI may be specifically defined by local, state, federal, or international data privacy laws.
  • While entities that are subject to these privacy laws may be referred to in a variety of ways, for consistency and clarity an entity (either individual or corporate) that is entrusted with PII/PHI will hereinafter be referred to as an “entrusted entity.”
  • It will be understood that the privacy laws contemplated herein may comprise details regarding not only how an entrusted entity determines if a data incident violates the law, but also when the provision of notification to one or more privacy agencies and/or the customers of the entrusted entity is warranted.
  • According to some embodiments, the present technology is directed to generating risk assessments for data incidents. These risk assessments provide specific information to the entrusted entity regarding the severity of the data incident relative to a state or federal rule. Additionally, the risk assessment provides information regarding the data sensitivity for the data incident. That is, the risk assessment may determine if the type of data that was exposed is highly sensitive information. As mentioned before, some PII/PHI may be considered more sensitive than others. For example, a social security number may be more sensitive than a gender description, although the relative sensitivity of different categories of PII/PHI is typically delineated in the privacy rules and may require delineation in the context of each data incident.
  • The present technology may determine the severity and/or data sensitivity for a data incident by collecting data incident data from an entrusted entity. This data incident data may be compared against one or more selected privacy rules to determine the severity and/or data sensitivity for the data incident. In some instances, the present technology may model the data incident data to the one or more privacy rules.
  • According to some embodiments, the privacy rules described herein may comprise the content of a state and/or federal statute. In other embodiments, the privacy rules may comprise abstracted or mathematically expressed rules that have been generated from the text of the state and/or federal statute. Applying a privacy rule to the data incident data may yield values for the severity and/or the data sensitivity of the data incident.
  • In some embodiments, the risk assessment may provide an indication to the entrusted entity that an obligation has been triggered. More specifically, if the severity of the data incident and/or the data sensitivity of the data incident when compared to the privacy rules indicates that the data incident has violated at least one of the privacy rules, the risk assessment may include an indication that an obligation has been created. An obligation may require the entrusted entity to notify affected individuals that their PII/PHI has been potentially exposed. The obligation may also require that notification be provided to a regulating authority such as the Department of Health and Human Services (HHS), the Office for Civil Rights (OCR), the Federal Trade Commission, a state agency, or any agency that regulates data incident notification.
  • The present technology allows entrusted entities to model data incident data to privacy rules which include at least one state rule and at least one federal rule. In some instances, entrusted entities may model data incidents to the rules of several states to generate risk assessments of each of the states. This is particularly helpful when entrusted entities service customers in many states. Moreover, each of these states may have differing notification requirements, along with different metrics for determining when a data incident requires notification.
  • In some embodiments, the risk assessment may include a risk level that is associated with a color. More specifically, a hue of the color is associated with the severity of the data incident as determined by the comparison or modeling of the data incident data.
  • According to the present disclosure, the present technology may generate a notification schedule for an entrusted entity along with mechanisms that aid the entrusted entity in gathering pertinent information that is to be provided to the customer and/or one or more regulatory agencies.
  • These and other advantages of the present technology will be described in greater detail with reference to the collective FIGS. 1-15.
  • FIG. 1 illustrates an exemplary system 100 for practicing aspects of the present technology. The system 100 may include a risk assessment system, hereinafter “system 105” that may be implemented in a cloud-based computing environment, or as a web server that is particularly purposed to manage data incidents.
  • In general, a cloud-based computing environment is a resource that typically combines the computational power of a large grouping of processors and/or that combines the storage capacity of a large grouping of computer memories or storage devices. For example, systems that provide a cloud resource may be utilized exclusively by their owners; or such systems may be accessible to outside users who deploy applications within the computing infrastructure to obtain the benefit of large computational or storage resources.
  • The cloud may be formed, for example, by a network of web servers, with each web server (or at least a plurality thereof) providing processor and/or storage resources. These servers may manage workloads provided by multiple users (e.g., cloud resource customers or other users). Typically, each user places workload demands upon the cloud that vary in real-time, sometimes dramatically. The nature and extent of these variations typically depend on the type of business associated with the user.
  • In other embodiments, the system 105 may include a distributed group of computing devices such as web servers that do not share computing resources or workload. Additionally, the system 105 may include a single computing device, such as a web server, that has been provisioned with one or more programs that are utilized to manage data incidents.
  • End users may access and interact with the system 105 via the client device 110 through a web-based interface, as will be discussed in greater detail infra. Alternatively, end users may access and interact with the system 105 via a downloadable program that executes on the client device 110. The system 105 may selectively and communicatively couple with a client device 110 via a network connection 115. The network connection 115 may include any one of a number of private and public communications mediums such as the Internet.
  • Additionally, the system 105 may collect and transmit pertinent information to regulatory agencies, such as regulatory agency 120, as will be discussed in greater detail infra. In some instances, notification may also be provided to affected individuals 125.
  • The system 105 may be generally described as a mechanism for managing data incidents. The system 105 may manage a data incident by collecting data incident data for the data incident and then modeling the data incident data to privacy rules. As mentioned previously, the privacy rules may include at least one state rule and at least one federal rule. The modeling of the data incident data may be utilized to generate a risk assessment for the data incident. The risk assessment may be utilized by an entrusted entity to determine how best to respond to the data incident. The system 105 is provided with a risk assessment application 200 that will be described in greater detail with reference to FIG. 2.
  • FIG. 2 illustrates a risk assessment application, hereinafter referred to as application 200. In accordance with the present disclosure, the application 200 may generally include a user interface module 205, an input module 210, a risk assessment generator 215, a notification module 220, and a reporting module 225. It is noteworthy that the application 200 may include additional modules, engines, or components, and still fall within the scope of the present technology. Moreover, the functionalities of two or more modules, engines, generators, or other components may be combined into a single component.
  • As used herein, the terms "module," "generator," and "engine" may also refer to any of an application-specific integrated circuit ("ASIC"), an electronic circuit, a processor (shared, dedicated, or group) that executes one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality. In other embodiments, individual modules of the application 200 may include separately configured web servers. Also, the application 200 may be provisioned within a cloud.
  • Generally described, the application 200 allows entrusted entities to input data incident data, have one or more risk assessments generated, and receive the one or more risk assessments, along with notification schedules, as required.
  • An entrusted entity may interact with the application 200 via a graphical user interface that is provisioned as a web-based interface. The web-based interface may be generated by the user interface module 205. It will be understood that the user interface module 205 may generate a plurality of different graphical user interfaces that allow individuals associated with the entrusted entity (e.g., privacy officer, compliance officer, security officer, attorney, employee, agent, etc.) to interact with the application 200. Examples of graphical user interfaces that are generated by the user interface module 205 are provided in FIGS. 3-13, which will be described in greater detail infra.
  • Upon the occurrence of a data incident, the input module 210 may be executed to receive data incident data from the entrusted entity. It is noteworthy that the user interface module 205 may generate different types of graphical user interfaces that are tailored to obtain specific types of data incident data from the entrusted entity.
  • Initially, it may be desirous for the entrusted entity to establish a profile that may be utilized to determine if the entity that is using the application 200 is, in fact, an entrusted entity. It is noteworthy to mention that the determination of what entities are entrusted entities depends upon the privacy rule. For example, an entity may be considered to be an entrusted entity under a particular federal statute, but may not be labeled an entrusted entity under one or more state statutes. Likewise, different states may have discrepant methods for determining who constitutes an entrusted entity.
  • Therefore, it may be advantageous to determine information about the entity, such as what types of information they collect and where they conduct business. The input module 210 may be executed to solicit pertinent information from the entity that may be utilized to determine if the entity is an entrusted entity. Again, the entity may specify a plurality of states in which they conduct business, or the states of residence/domicile for the customers with whom they conduct business.
  • If it is determined that the entity is an entrusted entity, the input module may further solicit data incident data for one or more data incidents. Pertinent data incident data may include the type of data that was compromised, the date of compromise, the amount of data that was compromised, whether security measures were in place (e.g., encryption, redaction, etc.), whether the incident was intentional or unintentional, whether the incident was malicious or non-malicious, how the data was compromised (e.g., theft of a laptop, a database security failure, lost storage media, a hacked application, or a hacked computing device such as a web server, email server, or content repository), and other types of information that assist in determining a risk level for the data incident as well as any notification obligations.
  • In some instances, rather than soliciting generalized data incident data from the entrusted entity, the input module 210 may select questions that solicit data that is particularly relevant to the privacy rules to which the entrusted entity is subject. For example, if a privacy rule specifies that a threshold amount of records must be exposed in order to create an obligation, the end user may be asked if their amount of exposed records meets or exceeds that threshold amount. This type of tailored questioning narrows the analysis performed on the data incident data and improves the efficiency of the risk assessment process.
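  • As a non-limiting illustration of such tailored questioning, intake questions may be keyed to the thresholds of the rules that apply to the entrusted entity. The sketch below reuses the hypothetical PrivacyRule fields introduced earlier; the question wording is an assumption made for exposition, not language from any statute.

        def tailored_questions(rules):
            """Build incident-intake questions keyed to the thresholds in each rule."""
            questions = []
            for rule in rules:
                questions.append(
                    f"Were {rule.record_threshold} or more records exposed "
                    f"(the threshold under the {rule.jurisdiction} rule)?"
                )
                if rule.has_harm_test:
                    questions.append(
                        f"Is there a significant risk of harm to affected individuals "
                        f"under the {rule.jurisdiction} harm test?"
                    )
            return questions

        # Example: tailored_questions([ca_rule]) yields two California-specific questions.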
  • Once the data incident data has been received, the input module 210 may generate a summary of the data incident data (or at least a portion of the data) that is provided to the entrusted entity via a graphical user interface generated by the user interface module 205.
  • The input module 210 may be configured to solicit confirmation from the entrusted entity that the data incident data in the summary is correct. If the data is incorrect, the entrusted entity may go back and correct the errant data.
  • As mentioned briefly above, the input module 210 may solicit and receive one or more selections of one or more states from the entrusted entity. Using the selections, the input module 210 may select one or more state statutes based upon the one or more selections. Also, the input module 210 may generate at least one state rule for each selected state statute. Additionally, one or more federal rules may be selected and generated as well.
  • The input module 210 may generate a state or federal privacy rule by evaluating the state/federal statute and creating a plurality of qualifications from the statutes. Qualifications for a statute may include, for example, thresholds or formulas that are used to determine if the data incident data of a data incident violates the statute. Stated otherwise, these qualifications may be used as a mathematical model of a statute. Data incident data may be evaluated in light of the model. The resultant modeling may be used to generate a risk assessment for the data incident.
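  • The following sketch shows one way such qualifications might be evaluated as predicates over the data incident data; the dictionary keys ("records_exposed", "data_types", "encrypted") and the exception handling are hypothetical assumptions made for exposition and do not reflect the terms of any actual statute.

        def violates_rule(incident, rule):
            """Return True if the incident data satisfies the rule's qualifications."""
            # A statutory exception (e.g., the data was encrypted) may defeat the obligation.
            if incident.get("encrypted") and "data was encrypted" in rule.exceptions:
                return False
            # No covered PII/PHI categories were exposed.
            exposed = set(incident.get("data_types", []))
            if not exposed & rule.covered_data:
                return False
            # Threshold qualification: enough records must have been exposed.
            return incident.get("records_exposed", 0) >= rule.record_threshold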
  • The risk assessment generator 215 may be executed to generate one or more risk assessments for the data incident. The risk assessment generator 215 may model the data incident data to the selected or determined privacy rules to determine if an obligation has been triggered under a privacy rule.
  • Again, risk assessments may be generated by modeling the data incident data to at least one state rule and at least one federal rule. The risk assessment may combine risk levels for each rule into a single risk assessment, or individual risk assessments may be generated for each rule.
  • Modeling of the data incident data to a privacy rule (either state or federal) by the risk assessment generator 215 may result in the generation of a severity value and a data sensitivity value for the data incident. The severity value may represent the extent to which PII/PHI has been compromised, while the data sensitivity value may represent the relative sensitivity of the PII/PHI that was compromised. These two factors may independently or dependently serve as the basis for determining if a notification obligation exists. For example, if the severity value meets or exceeds a threshold amount, a notification obligation may exist. If the data sensitivity value meets or exceeds a threshold amount, a notification obligation may exist. In some instances, a notification obligation may only exist if the severity value and the data sensitivity value both exceed threshold amounts. Again, the threshold amounts are specified by the particular privacy rule that is being applied to the data incident data.
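  • As a non-limiting illustration, the sketch below derives a severity value and a data sensitivity value from the data incident data and compares both against thresholds, following the variant described above in which an obligation exists only when both values meet their thresholds. The sensitivity weights, the record-count scaling, and the default thresholds are hypothetical assumptions made for exposition.

        # Illustrative weights only; actual sensitivities would follow the privacy rules.
        SENSITIVITY_WEIGHTS = {
            "ssn": 1.0,
            "medical_record_number": 0.9,
            "date_of_birth": 0.4,
            "gender": 0.1,
        }

        def assess(incident, severity_threshold=0.5, sensitivity_threshold=0.5):
            """Derive severity and data sensitivity values and test them against thresholds."""
            data_types = incident.get("data_types", [])
            # Data sensitivity is driven by the most sensitive category that was exposed.
            sensitivity = max(
                (SENSITIVITY_WEIGHTS.get(t, 0.2) for t in data_types), default=0.0
            )
            # Severity grows with the number of records exposed, capped at 1.0.
            severity = min(incident.get("records_exposed", 0) / 10000.0, 1.0)
            obligation = severity >= severity_threshold and sensitivity >= sensitivity_threshold
            return {"severity": severity, "sensitivity": sensitivity, "obligation": obligation}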
  • The risk assessment generator 215 may also determine and apply exceptions that exist in a state or federal statute during the generation of a risk assessment. These exceptions may be noted and included in the risk assessment.
  • The risk assessment generator 215 may create a visual indicator such as a risk level or heat map that assists the entrusted entity in determining if a data incident is relatively severe or is relatively benign. This visual indicator may be included in the risk assessment. For example, a risk assessment may include a risk level that includes a visual indicator such as a colored object. In some embodiments, a hue of the object is associated with the severity of the data incident, where red may indicate a severe risk and green may indicate a benign risk, with orange or yellow hues falling somewhere therebetween. Examples of heat maps and risk level indicators are illustrated in FIG. 7.
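  • A minimal sketch of one possible mapping from the assessed values to a color hue, consistent with the red/yellow/green scheme described above, follows; the cut-off values are hypothetical assumptions made for exposition.

        def risk_color(severity, sensitivity):
            """Map the assessed values to a hue for the risk level indicator."""
            level = max(severity, sensitivity)
            if level >= 0.75:
                return "red"      # severe risk
            if level >= 0.5:
                return "orange"
            if level >= 0.25:
                return "yellow"
            return "green"        # relatively benign risk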
  • Included in the risk assessment, in some instances, is a summary of sections of the state or federal privacy statute. For example, with regard to a state specific assessment, the risk assessment generator 215 may generate an outline of key information about the state statute that was utilized to generate the state specific risk assessment. This outline may be displayed to the entrusted entity via a user interface.
  • If the risk assessment generator 215 determines that the data incident violates one or more statutes (e.g., high severity value, PII/PHI is very sensitive, etc.), the notification module 220 may be executed to generate a notification schedule. The notification schedule may be generated based upon a date associated with the data incident. That is, the statute may specify when notification is to occur, relative to the date that PII was exposed.
  • Additionally, the notification schedule informs the entrusted entity as to what types of information are to be provided, along with the regulatory bodies to which the information should be provided. Again, the notification schedule may be generated from the statute itself. For example, a statute may specify that the data incident data (or a portion of the data incident data) collected by the input module 210 should be provided to a particular state agency within a predetermined period of time. Again, if a plurality of states have been designated or selected, the notification schedule may include notification dates for each state agency.
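  • As a non-limiting illustration, a notification schedule may be computed from the incident discovery date and each rule's statutory deadline. The sketch below assumes the hypothetical notify_within_days field introduced earlier; the dates shown are illustrative only.

        from datetime import date, timedelta

        def notification_schedule(discovery_date, rules):
            """Return a per-jurisdiction notification deadline for an incident."""
            return {
                rule.jurisdiction: discovery_date + timedelta(days=rule.notify_within_days)
                for rule in rules
            }

        # Example usage with the hypothetical California rule defined earlier:
        # notification_schedule(date(2012, 1, 16), [ca_rule]) -> {"CA": date(2012, 3, 1)}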
  • To assist the entrusted entity in meeting their notification obligations, the reporting module 225 may be executed to gather pertinent documents or other information from the entrusted entity and transmit these documents to the required reporting authorities. The reporting module 225 may prompt the entrusted entity to attach documents via a user interface. Once attached, these documents/data may be stored in a secured repository for submission to a regulatory agency. In other instances, the entrusted entity may transmit required information directly to the regulatory agency.
  • Additionally, the reporting module 225 may provide required notifications to affected individuals, such as the individuals associated with the PII/PHI that was compromised.
  • FIGS. 3-13 illustrate various exemplary graphical user interfaces (GUIs) that are generated by the user interface module 205. Each of the exemplary user interfaces will be described below.
  • FIG. 3 illustrates an exemplary GUI in the form of a data incident summary page. The summary page 300 includes a plurality of received answers to questions that were provided to the entrusted entity. Responses that were received indicate that the data incident involved the loss of a cellular telephone, an incident date of Jan. 2, 2012, an incident discovery date of Jan. 16, 2012, and other pertinent data incident data.
  • FIG. 4 illustrates an exemplary GUI in the form of a data incident dashboard page 400. The page 400 includes a listing of pending and completed risk assessments for a plurality of data incidents. Each entry may include a risk indicator having a particular color to help the entrusted entity quickly identify data incidents that are high risk. A risk indicator may be associated with a particular privacy rule. For example, a risk indicator for an Employee Snooping data incident indicates that a moderately high risk is associated with the data incident relative to HITECH rules (e.g., rules associated with the compromise of PHI). This moderately high risk is indicated by a yellow dot placed within a row of a "HITECH Status" column. Additionally, a severe risk is associated with a state privacy rule. This severe risk is indicated by a red dot placed within a row of a "State Impact" column.
  • FIG. 5 illustrates an exemplary GUI in the form of a state specific selection and notification page 500. The notification page is shown as comprising an image that informs the entrusted entity that six states have been affected by the data incident. To view a risk assessment for each state, the entrusted entity may click on any of the states listed in the leftmost frame.
  • FIG. 6 illustrates an exemplary GUI in the form of a data sensitivity level evaluation page 600. The page includes a plurality of data sensitivity indicators that indicate the sensitivity of different types of PII/PHI that were compromised by the data incident. For example, medical record numbers are shown in red as being highly sensitive. Moreover, medical record numbers may pose financial, reputational, and medical harm, which are just some of the dimensions of potential harm caused by compromise of PII/PHI. In contrast, the data incident also compromised individuals' dates of birth. As determined by the entrusted entity, that type of PII/PHI is not considered highly sensitive and thus has been depicted in green.
  • FIG. 7 illustrates an exemplary GUI in the form of a risk assessment page 700. The risk assessment page 700 includes a heat map 705 and a corresponding risk level indicator 715, which is placed within the heat map 705. The heat map 705 includes a grid where vertical placement indicates data sensitivity level and horizontal placement indicates severity level. As is shown, as the sensitivity and severity levels increase, so do the odds that the data incident may trigger an obligation to notify affected parties. In this instance, the risk level is high because the sensitivity level is high and the severity level is extreme.
  • Positioned below the heat map 705 is a notification schedule that includes not only the obligations for the entrusted entity, but also the expected notification dates. Again, this schedule may be based upon requirements included in the violated statute.
  • FIG. 8 illustrates an exemplary GUI in the form of a state specific risk assessment page 800. The page 800 includes a risk assessment for the State of California. The state impact is shown as high, and a summary of the types of PII/PHI that were exposed is provided below the state impact indicator. Similarly to the risk assessment page 700 of FIG. 7, a notification schedule is included on the state specific risk assessment page 800. It is noteworthy that a state specific risk assessment page may be generated for each affected state (such as the affected states listed on the state specific selection and notification page 500 of FIG. 5).
  • FIG. 9 illustrates an exemplary GUI in the form of a statute summary page 900. The statute summary page 900 includes a copy (or a portion) of the privacy statutes (California Civil Code 1798.29 & 1798.82; California Health and Safety Code 1280.15) that were utilized to generate the state specific risk assessment that was provided in FIG. 8. Note that the summary also includes whether the state statutes include a harm test and exceptions, which are flagged by the risk assessment generator 215 according to the specific privacy statutes.
  • FIG. 10 illustrates an exemplary GUI in the form of an aggregated notification page 1000. The page 1000 includes a notification schedule for each affected privacy statute (e.g., federal and state(s)) relative to one or more data incidents. A list of notification events is provided, and the end user may utilize the check boxes to select which state (or federal) risk assessment notification schedules are displayed.
  • FIGS. 11-13 illustrate exemplary GUIs that are utilized to collect, store, and transmit pertinent documents or data. FIG. 11 illustrates an attachments page 1100 that shows a plurality of documents that have been uploaded to the system, such as a media notification, an attorney general notification, a privacy policy, and a corrective action plan. Positioned adjacent to the list of documents is a checklist that includes all the pertinent documentation that is to be provided to regulatory authorities, the media, and/or affected individuals. As the required data are uploaded, each required data category is noted with a green check mark. Missing elements can be easily determined and uploaded.
  • It is noteworthy to mention that the on-time reporting of required incident data may be paramount in determining compliance and good faith on the part of an entrusted entity. Consequently, failure to meet required notification deadlines may result in fines and other regulatory punishment.
  • FIG. 12 illustrates an upload page 1200 that may be utilized by an entrusted entity to upload and categorize required compliance information (e.g., documents shown in FIG. 11). Files may be tagged with metadata linking them to the related federal and states risk assessments before they are stored in a content repository or transmitted to an appropriate party.
  • FIG. 13 illustrates an exemplary time stamped notation and actions page 1300 that displays notes entered into the system by a particular end user. Actions may include a note that a particular employee is to be retrained and certified. Any type of related action such as a remedial action, uploading of a file, or other notification and/or compliance related action may be noted and associated with a particular risk assessment.
  • FIG. 14 illustrates a flowchart of an exemplary method for managing a data incident. The method may include a step 1405 of receiving data incident data. The data incident data may include information that pertains or corresponds to the data incident. Also, the method may include a step 1410 of automatically generating a risk assessment from a comparison of the data incident data to privacy rules. The privacy rules may comprise at least one federal rule and at least one state rule, each of the rules defining requirements associated with data incident notification laws. Additionally, the comparison may include modeling the data incident data against the privacy rules. Also, the method may include a step 1415 of providing the risk assessment to a display device that selectively couples with a risk assessment server. It is noteworthy to mention that the risk assessment may include a visual representation of the risk associated with a data incident relative to the privacy rules.
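  • For exposition only, the hypothetical helpers sketched above may be composed into an end-to-end routine that mirrors the steps of FIG. 14; the structure of the incident record and all names are assumptions, not a definitive implementation of the claimed method.

        def manage_data_incident(incident, rules):
            """Compose the hypothetical helpers above into the flow of FIG. 14."""
            assessments = {}
            for rule in rules:
                result = assess(incident)                  # step 1410: severity/sensitivity values
                result["violates"] = violates_rule(incident, rule)
                result["color"] = risk_color(result["severity"], result["sensitivity"])
                assessments[rule.jurisdiction] = result    # step 1415: provide the assessment
            violated = [rule for rule in rules if assessments[rule.jurisdiction]["violates"]]
            # Step 1420: generate a notification schedule only for the rules that were violated.
            schedule = (
                notification_schedule(incident["discovery_date"], violated) if violated else None
            )
            return assessments, schedule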
  • Additionally, for data incidents that violate a privacy rule (either state or federal), the method may include a step 1420 of generating a notification schedule for the data incident, along with an optional step 1425 of transmitting notification information to a regulatory agency and/or affected individuals (e.g., those whose PII/PHI has been compromised).
  • FIG. 15 illustrates an exemplary computing device 1500 that may be used to implement an embodiment of the present technology. The computing device 1500 of FIG. 15 (or portions thereof) may be implemented in the context of system 105 (FIG. 1). The computing device 1500 of FIG. 15 includes one or more processors 1510 and main memory 1520. Main memory 1520 stores, in part, instructions and data for execution by processor 1510. Main memory 1520 may store the executable code when in operation. The system 1500 of FIG. 15 further includes a mass storage device 1530, portable storage medium drive(s) 1540, output devices 1550, user input devices 1560, a graphics display 1570, and peripheral devices 1580.
  • The components shown in FIG. 15 are depicted as being connected via a single bus 1590. The components may be connected through one or more data transport means. Processor unit 1510 and main memory 1520 may be connected via a local microprocessor bus, and the mass storage device 1530, peripheral device(s) 1580, portable storage device 1540, and display system 1570 may be connected via one or more input/output (I/O) buses.
  • Mass storage device 1530, which may be implemented with a magnetic disk drive or an optical disk drive, is a non-volatile storage device for storing data and instructions for use by processor unit 1510. Mass storage device 1530 may store the system software for implementing embodiments of the present invention for purposes of loading that software into main memory 1520.
  • Portable storage device 1540 operates in conjunction with a portable non-volatile storage medium, such as a floppy disk, compact disk, digital video disc, or USB storage device, to input and output data and code to and from the computing device 1500 of FIG. 15. The system software for implementing embodiments of the present invention may be stored on such a portable medium and input to the computer device 1500 via the portable storage device 1540.
  • Input devices 1560 provide a portion of a user interface. Input devices 1560 may include an alphanumeric keypad, such as a keyboard, for inputting alpha-numeric and other information, or a pointing device, such as a mouse, a trackball, stylus, or cursor direction keys. Additionally, the computing device 1500 as shown in FIG. 15 includes output devices 1550. Suitable output devices include speakers, printers, network interfaces, and monitors.
  • Display system 1570 may include a liquid crystal display (LCD) or other suitable display device. Display system 1570 receives textual and graphical information, and processes the information for output to the display device.
  • Peripherals 1580 may include any type of computer support device to add additional functionality to the computer system. Peripheral device(s) 1580 may include a modem or a router.
  • The components provided in the computing device 1500 of FIG. 15 are those typically found in computer systems that may be suitable for use with embodiments of the present invention and are intended to represent a broad category of such computer components that are well known in the art. Thus, the computing device 1500 of FIG. 15 may be a personal computer, hand held computing device, telephone, mobile computing device, workstation, server, minicomputer, mainframe computer, or any other computing device. The computer may also include different bus configurations, networked platforms, multi-processor platforms, etc. Various operating systems may be used including Unix, Linux, Windows, Macintosh OS, Palm OS, Android, iPhone OS and other suitable operating systems. The computing device 1500 may also utilize web browser applications that display the web-based graphical user interfaces described herein. Exemplary web browser applications may include, but are not limited to, Internet Explorer, Firefox, Safari, Chrome, and other web browser applications that would be known to one of ordinary skill in the art with the present disclosure before them. Moreover, when the computing device 1500 is a mobile computing device, the computing device 1500 may likewise include mobile web browser applications.
  • It is noteworthy that any hardware platform suitable for performing the processing described herein is suitable for use with the technology. Computer-readable storage media refer to any medium or media that participate in providing instructions to a central processing unit (CPU), a processor, a microcontroller, or the like. Such media may take forms including, but not limited to, non-volatile and volatile media such as optical or magnetic disks and dynamic memory, respectively. Common forms of computer-readable storage media include a floppy disk, a flexible disk, a hard disk, magnetic tape, any other magnetic storage medium, a CD-ROM disk, digital video disk (DVD), any other optical storage medium, RAM, PROM, EPROM, a FLASHEPROM, any other memory chip or cartridge.
  • While various embodiments have been described above, it should be understood that they have been presented by way of example only, and not limitation. The descriptions are not intended to limit the scope of the technology to the particular forms set forth herein. Thus, the breadth and scope of a preferred embodiment should not be limited by any of the above-described exemplary embodiments. It should be understood that the above description is illustrative and not restrictive. To the contrary, the present descriptions are intended to cover such alternatives, modifications, and equivalents as may be included within the spirit and scope of the technology as defined by the appended claims and otherwise appreciated by one of ordinary skill in the art. The scope of the technology should, therefore, be determined not with reference to the above description, but instead should be determined with reference to the appended claims along with their full scope of equivalents.

Claims (1)

What is claimed is:
1. A method for managing a data incident, comprising:
receiving, via a risk assessment server, data incident data that comprises information corresponding to the data incident; and
automatically generating, via the risk assessment server, a risk assessment from a comparison of the data incident data to privacy rules, the privacy rules comprising at least one federal rule and at least one state rule, each of the rules defining requirements associated with data incident notification laws.
US14/311,253 2012-02-14 2014-06-21 Systems and Methods for Managing Data Incidents Abandoned US20140304822A1 (en)

Priority Applications (8)

Application Number Priority Date Filing Date Title
US14/311,253 US20140304822A1 (en) 2012-02-14 2014-06-21 Systems and Methods for Managing Data Incidents
US14/588,159 US9483650B2 (en) 2012-02-14 2014-12-31 Systems and methods for managing data incidents
US14/868,311 US9781147B2 (en) 2012-02-14 2015-09-28 Systems and methods for managing data incidents
US15/339,786 US10204238B2 (en) 2012-02-14 2016-10-31 Systems and methods for managing data incidents
US15/786,538 US10331904B2 (en) 2012-02-14 2017-10-17 Systems and methods for managing multifaceted data incidents
US16/235,872 US10445508B2 (en) 2012-02-14 2018-12-28 Systems and methods for managing multi-region data incidents
US16/559,513 US11023592B2 (en) 2012-02-14 2019-09-03 Systems and methods for managing data incidents
US17/221,624 US20210224402A1 (en) 2012-02-14 2021-04-02 Systems and methods for managing data incidents having dimensions

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US13/396,558 US8707445B2 (en) 2012-02-14 2012-02-14 Systems and methods for managing data incidents
US13/691,661 US8763133B2 (en) 2012-02-14 2012-11-30 Systems and methods for managing data incidents
US14/311,253 US20140304822A1 (en) 2012-02-14 2014-06-21 Systems and Methods for Managing Data Incidents

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US13/691,661 Continuation US8763133B2 (en) 2012-02-14 2012-11-30 Systems and methods for managing data incidents

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US14/588,159 Continuation-In-Part US9483650B2 (en) 2012-02-14 2014-12-31 Systems and methods for managing data incidents
US14/588,159 Continuation US9483650B2 (en) 2012-02-14 2014-12-31 Systems and methods for managing data incidents

Publications (1)

Publication Number Publication Date
US20140304822A1 true US20140304822A1 (en) 2014-10-09

Family

ID=48946786

Family Applications (4)

Application Number Title Priority Date Filing Date
US13/396,558 Active US8707445B2 (en) 2012-02-14 2012-02-14 Systems and methods for managing data incidents
US13/691,661 Active US8763133B2 (en) 2012-02-14 2012-11-30 Systems and methods for managing data incidents
US14/311,253 Abandoned US20140304822A1 (en) 2012-02-14 2014-06-21 Systems and Methods for Managing Data Incidents
US14/588,159 Active US9483650B2 (en) 2012-02-14 2014-12-31 Systems and methods for managing data incidents

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US13/396,558 Active US8707445B2 (en) 2012-02-14 2012-02-14 Systems and methods for managing data incidents
US13/691,661 Active US8763133B2 (en) 2012-02-14 2012-11-30 Systems and methods for managing data incidents

Family Applications After (1)

Application Number Title Priority Date Filing Date
US14/588,159 Active US9483650B2 (en) 2012-02-14 2014-12-31 Systems and methods for managing data incidents

Country Status (1)

Country Link
US (4) US8707445B2 (en)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9483650B2 (en) 2012-02-14 2016-11-01 Radar, Inc. Systems and methods for managing data incidents
US9727919B2 (en) 2011-11-14 2017-08-08 Identity Theft Guard Solutions, Inc. Systems and methods for reducing medical claims fraud
US9781147B2 (en) 2012-02-14 2017-10-03 Radar, Inc. Systems and methods for managing data incidents
US9832219B2 (en) 2014-09-05 2017-11-28 International Business Machines Corporation System for tracking data security threats and method for same
US10204238B2 (en) 2012-02-14 2019-02-12 Radar, Inc. Systems and methods for managing data incidents
US10331904B2 (en) 2012-02-14 2019-06-25 Radar, Llc Systems and methods for managing multifaceted data incidents
US10348754B2 (en) 2015-12-28 2019-07-09 International Business Machines Corporation Data security incident correlation and dissemination system and method
US10367828B2 (en) 2014-10-30 2019-07-30 International Business Machines Corporation Action response framework for data security incidents
US10425447B2 (en) 2015-08-28 2019-09-24 International Business Machines Corporation Incident response bus for data security incidents
US11023592B2 (en) 2012-02-14 2021-06-01 Radar, Llc Systems and methods for managing data incidents
US11087024B2 (en) 2016-01-29 2021-08-10 Samsung Electronics Co., Ltd. System and method to enable privacy-preserving real time services against inference attacks
US11151468B1 (en) 2015-07-02 2021-10-19 Experian Information Solutions, Inc. Behavior analysis using distributed representations of event data
US11157650B1 (en) 2017-09-28 2021-10-26 Csidentity Corporation Identity security architecture systems and methods
US11436606B1 (en) 2014-10-31 2022-09-06 Experian Information Solutions, Inc. System and architecture for electronic fraud detection
US11568348B1 (en) 2011-10-31 2023-01-31 Consumerinfo.Com, Inc. Pre-data breach monitoring
US11785052B2 (en) 2016-06-21 2023-10-10 International Business Machines Corporation Incident response plan based on indicators of compromise

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150154520A1 (en) * 2012-03-30 2015-06-04 Csr Professional Services, Inc. Automated Data Breach Notification
US8966575B2 (en) * 2012-12-14 2015-02-24 Nymity Inc. Methods, software, and devices for automatically scoring privacy protection measures
US9507814B2 (en) 2013-12-10 2016-11-29 Vertafore, Inc. Bit level comparator systems and methods
US20150193720A1 (en) * 2014-01-08 2015-07-09 Bank Of America Corporation Assessing technology issues
US9411963B2 (en) * 2014-07-07 2016-08-09 International Business Machines Corporation Visual display of risk-identifying metadata for identity management access requests
US9747556B2 (en) 2014-08-20 2017-08-29 Vertafore, Inc. Automated customized web portal template generation systems and methods
US9407656B1 (en) * 2015-01-09 2016-08-02 International Business Machines Corporation Determining a risk level for server health check processing
US9600400B1 (en) 2015-10-29 2017-03-21 Vertafore, Inc. Performance testing of web application components using image differentiation
CN108322427A (en) * 2017-01-18 2018-07-24 阿里巴巴集团控股有限公司 A kind of method and apparatus carrying out air control to access request
US10776892B2 (en) 2017-12-19 2020-09-15 Motorola Solutions, Inc. Device, system and method for screening of personally identifiable information
US11776176B2 (en) 2019-04-19 2023-10-03 Microsoft Technology Licensing, Llc Visual representation of directional correlation of service health
US20220012750A1 (en) * 2020-07-10 2022-01-13 Venminder, Inc. Systems and methods for vendor exchange management
US20220198044A1 (en) * 2020-12-18 2022-06-23 Paypal, Inc. Governance management relating to data lifecycle discovery and management
CN113239404B (en) * 2021-06-04 2022-07-19 南开大学 Federal learning method based on differential privacy and chaotic encryption
CN114553516A (en) * 2022-02-18 2022-05-27 支付宝(杭州)信息技术有限公司 Data processing method, device and equipment
CN114676190B (en) * 2022-05-27 2022-10-11 太平金融科技服务(上海)有限公司深圳分公司 Data display method and device, computer equipment and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120331567A1 (en) * 2010-12-22 2012-12-27 Private Access, Inc. System and method for controlling communication of private information over a network
US8707445B2 (en) * 2012-02-14 2014-04-22 Identity Theft Guard Solutions, Llc Systems and methods for managing data incidents

Family Cites Families (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9607041B2 (en) 1999-07-15 2017-03-28 Gula Consulting Limited Liability Company System and method for efficiently accessing internet resources
US7813944B1 (en) 1999-08-12 2010-10-12 Fair Isaac Corporation Detection of insurance premium fraud or abuse using a predictive software system
US20090313049A1 (en) 1999-12-18 2009-12-17 Raymond Anthony Joao Apparatus and Method for Processing and/or Providing Healthcare Information and/or Healthcare-Related Information
US20100042440A1 (en) 1999-12-18 2010-02-18 Raymond Anthony Joao Apparatus and method for processing and/or for providing healthcare information and/or healthcare-related information
AU2001276991A1 (en) 2000-07-20 2002-02-05 J. Alexander Marchosky Patient-controlled automated medical record, diagnosis, and treatment system andmethod
US7072842B2 (en) 2001-01-08 2006-07-04 P5, Inc. Payment of health care insurance claims using short-term loans
US20020120477A1 (en) * 2001-02-09 2002-08-29 Robert Jefferson Jinnett System and method for supporting legally-compliant automated regulated services and/or products in connection with multi-jurisdictional transactions
US6985922B1 (en) * 2001-12-21 2006-01-10 S.J. Bashen, Inc. Method, apparatus and system for processing compliance actions over a wide area network
US20030135397A1 (en) 2002-01-11 2003-07-17 Halow George M. Medical billing system to prevent fraud
US20030225690A1 (en) 2002-05-29 2003-12-04 Xerox Corporation Billing process and system
US7234065B2 (en) * 2002-09-17 2007-06-19 Jpmorgan Chase Bank System and method for managing data privacy
US7739132B2 (en) 2002-10-17 2010-06-15 Navicure, Inc. Correcting and monitoring status of health care claims
US8201256B2 (en) * 2003-03-28 2012-06-12 Trustwave Holdings, Inc. Methods and systems for assessing and advising on electronic compliance
US8725524B2 (en) 2003-08-13 2014-05-13 Accenture Global Services Limited Fraud detection method and system
US8600769B2 (en) 2004-05-19 2013-12-03 Fairpay Solutions, Inc. Medical bill analysis and review
US20080162496A1 (en) 2004-06-02 2008-07-03 Richard Postrel System and method for centralized management and monitoring of healthcare services
US7779457B2 (en) 2004-06-09 2010-08-17 Identifid, Inc Identity verification system
US20060020495A1 (en) 2004-07-20 2006-01-26 Baker Michael S Healthcare Claims Processing Mechanism for a Transaction System
US7904305B2 (en) 2005-04-29 2011-03-08 Suringa Dirk W R System and method for verifying the accurate processing of medical insurance claims
US20060277071A1 (en) 2005-06-03 2006-12-07 Shufeldt John J Patient receiving method
US20070038484A1 (en) 2005-08-15 2007-02-15 Hoffner Ronald M Methods and systems for health insurance claims submission and processing
US20070078668A1 (en) 2005-09-30 2007-04-05 Dimpy Pathria Authentication ID interview method and apparatus
US20070136814A1 (en) * 2005-12-12 2007-06-14 Michael Lee Critical function monitoring and compliance auditing system
US20080005778A1 (en) 2006-07-03 2008-01-03 Weifeng Chen System and method for privacy protection using identifiability risk assessment
CA2668289C (en) 2006-08-30 2014-01-28 Care Partners Plus Patient-interactive healthcare management
US20080177760A1 (en) 2007-01-18 2008-07-24 Edward Dennis Fein Methods and systems for contacting physicians
US20090210256A1 (en) 2008-02-15 2009-08-20 Aetna Inc. System For Real-Time Online Health Care Insurance Underwriting
US7996374B1 (en) 2008-03-28 2011-08-09 Symantec Corporation Method and apparatus for automatically correlating related incidents of policy violations
US20100114607A1 (en) 2008-11-04 2010-05-06 Sdi Health Llc Method and system for providing reports and segmentation of physician activities
US8185931B1 (en) 2008-12-19 2012-05-22 Quantcast Corporation Method and system for preserving privacy related to networked media consumption activities
US8707407B2 (en) 2009-02-04 2014-04-22 Microsoft Corporation Account hijacking counter-measures
US8484352B2 (en) 2009-03-30 2013-07-09 Rave Wireless, Inc. Emergency information services
US9727919B2 (en) 2011-11-14 2017-08-08 Identity Theft Guard Solutions, Inc. Systems and methods for reducing medical claims fraud
US9781147B2 (en) 2012-02-14 2017-10-03 Radar, Inc. Systems and methods for managing data incidents

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120331567A1 (en) * 2010-12-22 2012-12-27 Private Access, Inc. System and method for controlling communication of private information over a network
US8707445B2 (en) * 2012-02-14 2014-04-22 Identity Theft Guard Solutions, Llc Systems and methods for managing data incidents
US8763133B2 (en) * 2012-02-14 2014-06-24 Identity Theft Guard Solutions, Llc Systems and methods for managing data incidents

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11568348B1 (en) 2011-10-31 2023-01-31 Consumerinfo.Com, Inc. Pre-data breach monitoring
US9727919B2 (en) 2011-11-14 2017-08-08 Identity Theft Guard Solutions, Inc. Systems and methods for reducing medical claims fraud
US9781147B2 (en) 2012-02-14 2017-10-03 Radar, Inc. Systems and methods for managing data incidents
US10204238B2 (en) 2012-02-14 2019-02-12 Radar, Inc. Systems and methods for managing data incidents
US10331904B2 (en) 2012-02-14 2019-06-25 Radar, Llc Systems and methods for managing multifaceted data incidents
US9483650B2 (en) 2012-02-14 2016-11-01 Radar, Inc. Systems and methods for managing data incidents
US11023592B2 (en) 2012-02-14 2021-06-01 Radar, Llc Systems and methods for managing data incidents
US9832219B2 (en) 2014-09-05 2017-11-28 International Business Machines Corporation System for tracking data security threats and method for same
US10367828B2 (en) 2014-10-30 2019-07-30 International Business Machines Corporation Action response framework for data security incidents
US11436606B1 (en) 2014-10-31 2022-09-06 Experian Information Solutions, Inc. System and architecture for electronic fraud detection
US11941635B1 (en) 2014-10-31 2024-03-26 Experian Information Solutions, Inc. System and architecture for electronic fraud detection
US11151468B1 (en) 2015-07-02 2021-10-19 Experian Information Solutions, Inc. Behavior analysis using distributed representations of event data
US10425447B2 (en) 2015-08-28 2019-09-24 International Business Machines Corporation Incident response bus for data security incidents
US10348754B2 (en) 2015-12-28 2019-07-09 International Business Machines Corporation Data security incident correlation and dissemination system and method
US11087024B2 (en) 2016-01-29 2021-08-10 Samsung Electronics Co., Ltd. System and method to enable privacy-preserving real time services against inference attacks
US11785052B2 (en) 2016-06-21 2023-10-10 International Business Machines Corporation Incident response plan based on indicators of compromise
US11157650B1 (en) 2017-09-28 2021-10-26 Csidentity Corporation Identity security architecture systems and methods
US11580259B1 (en) 2017-09-28 2023-02-14 Csidentity Corporation Identity security architecture systems and methods

Also Published As

Publication number Publication date
US20150113663A1 (en) 2015-04-23
US20130212683A1 (en) 2013-08-15
US8707445B2 (en) 2014-04-22
US20130212692A1 (en) 2013-08-15
US9483650B2 (en) 2016-11-01
US8763133B2 (en) 2014-06-24

Similar Documents

Publication Publication Date Title
US8763133B2 (en) Systems and methods for managing data incidents
US11023592B2 (en) Systems and methods for managing data incidents
US20210224402A1 (en) Systems and methods for managing data incidents having dimensions
US10331904B2 (en) Systems and methods for managing multifaceted data incidents
US9781147B2 (en) Systems and methods for managing data incidents
US10204238B2 (en) Systems and methods for managing data incidents
US10963571B2 (en) Privacy risk assessments
Redmiles et al. I think they're trying to tell me something: Advice sources and selection for digital security
US11025675B2 (en) Data processing systems and methods for performing privacy assessments and monitoring of new versions of computer code for privacy compliance
Avin et al. Filling gaps in trustworthy development of AI
US20220286482A1 (en) Data processing systems and methods for performing assessments and monitoring of new versions of computer code for compliance
Sun et al. Defining security requirements with the common criteria: Applications, adoptions, and challenges
Paquet Consumer security perceptions and the perceived influence on adopting cloud computing: A quantitative study using the technology acceptance model
US11343284B2 (en) Data processing systems and methods for performing privacy assessments and monitoring of new versions of computer code for privacy compliance
Bottomley Data and algorithms in the workplace: an overview of current public policy strategies
US20200021496A1 (en) Method, apparatus, and computer-readable medium for data breach simulation and impact analysis in a computer network
Feeney et al. Using administrative data for randomized evaluations
Balebako et al. Is notice enough: Mitigating the risks of smartphone data sharing
Davidoff Data breaches: crisis and opportunity
Hyson Factors influencing the adoption of cloud computing by medical facility managers
Filkins New Threats Drive Improved Practices: State of Cybersecurity in Health Care Organizations
Stats Post-enumeration Survey: Privacy impact assessment
Charles Regulatory compliance considerations for blockchain in life sciences research
Bays Reactions to Ransomware Variants Among Internet Users: Measuring Payment Evocation
Hemann Mitigating It Security Risk in United States Healthcare: a Qualitative Examination of Best Practices

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: IDENTITY THEFT GUARD SOLUTIONS, INC., OREGON

Free format text: CHANGE OF NAME;ASSIGNOR:IDENTITY THEFT GUARD SOLUTIONS, LLC;REEL/FRAME:038978/0750

Effective date: 20130131

AS Assignment

Owner name: IDENTITY THEFT GUARD SOLUTIONS, LLC, OREGON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SHER-JAN, MAHMOOD;ROOK, SUSAN M.;KOTKA, GREG L.;SIGNING DATES FROM 20120504 TO 20120516;REEL/FRAME:038920/0071

AS Assignment

Owner name: RADAR, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:IDENTITY THEFT GUARD SOLUTIONS, INC.;REEL/FRAME:039884/0305

Effective date: 20160802

AS Assignment

Owner name: RADAR, INC., OREGON

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE ASSIGNEE STATE ADDRESS PREVIOUSLY RECORDED AT REEL: 039884 FRAME: 0305. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:IDENTITY THEFT GUARD SOLUTIONS, INC.;REEL/FRAME:040209/0185

Effective date: 20160802

AS Assignment

Owner name: RADAR, LLC, OREGON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:RADAR, INC.;REEL/FRAME:048885/0133

Effective date: 20190415