US20160164898A1 - Simulated phishing result anonymization and attribution prevention - Google Patents
- Publication number
- US20160164898A1 (application Ser. No. 15/044,099)
- Authority
- US
- United States
- Prior art keywords
- phishing
- individual
- individuals
- messages
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L63/00—Network architectures or network communication protocols for network security
- H04L63/14—Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic
- H04L63/1441—Countermeasures against malicious traffic
- H04L63/1483—Countermeasures against malicious traffic service impersonation, e.g. phishing, pharming or web spoofing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L63/00—Network architectures or network communication protocols for network security
- H04L63/14—Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic
- H04L63/1408—Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic by monitoring network traffic
- H04L63/1416—Event detection, e.g. attack signature detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/50—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
- G06F21/55—Detecting local intrusion or implementing counter-measures
Definitions
- An e-mail alias is a forwarding e-mail address (i.e., messages sent to an e-mail alias of an individual are forwarded to the primary e-mail address of the individual).
- An e-mail alias of an individual may be established after the individual's primary e-mail address has been established, but this is not always so.
- an e-mail alias may be quite similar to a primary e-mail address.
- an e-mail alias may incorporate one or more of the individual's first name, last name, nickname and other identifier of the individual.
- an e-mail alias could be in use for a long period of time.
- An e-mail alias, in accordance with embodiments of the present invention, may be constructed in a more restrictive and/or limited fashion than e-mail aliases currently in use.
- An e-mail alias of an individual, in accordance with embodiments of the present invention, does not incorporate any characteristic that may be associated with the identity of the individual (e.g., it does not include the individual's first or last name, initials, nickname, birthday, etc.) and/or any other characteristic that could be used by an attacker to determine the identity/contact information of the individual.
- the local part of an e-mail alias may include a randomly generated sequence of alpha-numeric characters (e.g., “aa039js”).
- the local part of an e-mail alias may also include special characters (e.g., !, #, $, etc.) in addition to alpha-numeric characters, although there may be restrictions on the use of these special characters. Such details may be found in RFC 5322 and RFC 6531 and will not be discussed in further detail herein.
- FIG. 1 provides several example e-mail aliases which are suitable for protecting the identity/contact information of an individual. For instance, without the knowledge of mapping 16 , there really would be no way for an attacker (or anyone for that matter) to ascertain the primary e-mail address associated with the e-mail alias asp98r ⁇ at> company ⁇ dot> com.
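The identity-free alias construction described above can be sketched in a few lines. This is a minimal illustration, not the patented implementation: the function name `make_alias`, the six-character length, and the lowercase-plus-digits alphabet are assumptions chosen to match the style of the example aliases (e.g., asp98r), and a cryptographically secure random source is used so the local part reveals nothing about the individual.

```python
import secrets
import string

# Alphanumeric-only alphabet, per the restrictive construction described
# above (no names, initials, nicknames, or birthdays in the local part).
ALIAS_ALPHABET = string.ascii_lowercase + string.digits

def make_alias(domain: str, length: int = 6) -> str:
    """Return an e-mail alias whose local part is a cryptographically
    random alphanumeric string, unrelated to the individual's identity."""
    local_part = "".join(secrets.choice(ALIAS_ALPHABET) for _ in range(length))
    return f"{local_part}@{domain}"
```

Because `secrets` draws from the operating system's secure random source, an attacker cannot infer the primary address from the alias even with knowledge of how aliases are generated.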
- An e-mail alias, in accordance with embodiments of the present invention, is active (e.g., able to send/receive messages) for a limited duration of time (e.g., 1 hour, 1 day, etc.).
- While an e-mail alias is active, any messages sent to the e-mail alias of an individual may be forwarded to the primary e-mail address of the individual.
- While an e-mail alias is inactive, any messages sent to the e-mail alias may not be forwarded to the associated primary e-mail address.
- the duration of time that an e-mail alias is active may correspond to the time during which a phishing simulation is being conducted.
- At the start of a phishing simulation, an e-mail alias may be created for and assigned to an individual.
- phishing simulations and/or training material may be sent to the individual via the individual's e-mail alias. Any responses from the individual may also be received via the e-mail alias. More specifically, the individual may use his/her primary e-mail address to send a message (e.g., reply to a phishing simulation).
- Such message may then be forwarded from the primary e-mail address to the e-mail alias, so that the training program receives any response from the individual via his/her e-mail alias rather than via his/her primary e-mail address.
- Such technique decouples the training program from any primary e-mail addresses of individuals of the training program, precluding any information collected by the training program from being used to mount an attack on the individuals.
- At the conclusion of the phishing simulation, the e-mail alias may be made inactive.
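The alias lifecycle just described (assign, forward while active, deactivate, possibly reactivate later) can be sketched as follows. The class name `AliasRegistry` and its method names are hypothetical; the patent does not prescribe a data structure, and a real deployment would sit inside a mail server rather than an in-memory dictionary.

```python
from datetime import datetime, timedelta

class AliasRegistry:
    """Sketch: aliases forward only while active; deactivating an alias
    removes its route without deleting it, so it can be reactivated
    for a subsequent phishing simulation."""

    def __init__(self):
        self._routes = {}   # alias -> primary e-mail address
        self._active = {}   # alias -> expiry datetime, or None if inactive

    def assign(self, alias, primary, active_for=timedelta(days=1)):
        self._routes[alias] = primary
        self._active[alias] = datetime.now() + active_for

    def deactivate(self, alias):
        self._active[alias] = None

    def reactivate(self, alias, active_for=timedelta(days=1)):
        self._active[alias] = datetime.now() + active_for

    def forward(self, alias):
        """Return the primary address, or None if the alias is inactive
        (inactive aliases silently drop messages, as described above)."""
        expiry = self._active.get(alias)
        if expiry is None or datetime.now() >= expiry:
            return None
        return self._routes[alias]
```

Note the design choice: deactivation clears only the activity flag, not the route, mirroring the later observation that aliases "need not be permanently deactivated."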
- FIG. 2 depicts system diagram 20 of components used in the administration of phishing simulations to individuals, according to one embodiment of the invention.
- Phishing simulation records 12 (of FIG. 2 ) are a more compact representation of the collection of phishing simulation records 12 (of FIG. 1 ).
- a phishing simulation record may comprise a measure of an individual's susceptibility to phishing attacks.
- the measure may include numbers from 1 to 10, with 1 indicating low susceptibility and 10 indicating high susceptibility.
- the measure may include a percentage from 0% to 100%, with 0% indicating that an individual has fallen victim to none of the phishing simulations and 100% indicating that the individual has fallen victim to all of the phishing simulations.
- a phishing simulation record may comprise the number of phishing simulations that an individual has fallen victim to.
- a phishing simulation record may indicate whether an individual has received and/or has reviewed training materials provided by the training program.
- phishing simulation records 12 may be communicatively coupled to record selector 22 .
- Record selector 22 may determine which of the phishing simulation records satisfies a criterion. For example, record selector 22 may determine which of the phishing simulation records has a measure of phishing susceptibility that exceeds a certain threshold. As another example, record selector 22 may determine which of the phishing simulation records contain a record of individuals falling victim to more than ten phishing simulations. Record selector 22 may then select at least one of the phishing simulation records that satisfies the criterion.
- record selector 22 may select all of the phishing simulation records that satisfy the criterion. As a specific example, record selector 22 may select “Phishing Simulation Record 1 ” and “Phishing Simulation Record 4 ”, as depicted in process 50 of FIG. 3 .
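The selection behavior of record selector 22 can be sketched as a simple filter. The record layout (dicts with `id` and `susceptibility` fields) and the threshold value are illustrative assumptions; the patent leaves the storage format open. The sample data is chosen so that records 1 and 4 are selected, matching the FIG. 3 example.

```python
def select_records(records, threshold):
    """Sketch of record selector 22: select every phishing simulation
    record whose susceptibility measure exceeds the given threshold.
    'id' and 'susceptibility' are hypothetical field names."""
    return [r for r in records if r["susceptibility"] > threshold]

# Four records, mirroring the four individuals depicted in FIG. 1.
records = [
    {"id": 1, "susceptibility": 9},
    {"id": 2, "susceptibility": 2},
    {"id": 3, "susceptibility": 4},
    {"id": 4, "susceptibility": 8},
]
```

With a threshold of 7, only records 1 and 4 satisfy the criterion, so those are the individuals who would receive further simulations and/or training materials.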
- record selector 22 may be communicatively coupled to phishing simulator 24 .
- phishing simulator 24 may be instructed to provide phishing simulations and/or training materials to individuals associated with certain phishing simulation records.
- Phishing simulator 24 may retrieve specific phishing simulations and/or training materials from phishing simulation data store 26 , those simulations and/or materials retrieved being properly matched to an individual associated with a selected phishing simulation record. For example, based on information from a phishing simulation record that an individual consistently fails to recognize phishing simulations with personalized salutations, phishing simulator 24 may provide that individual with training materials designed to increase his/her awareness of phishing simulations with personalized salutations.
- Phishing simulator 24 may access data store 14 which stores a mapping from phishing simulation records to e-mail aliases in order to determine an e-mail address through which an individual associated with a phishing simulation record can be contacted.
- phishing simulator 24 may access data store 14 to determine that e-mail alias asp98r ⁇ at> company ⁇ dot> com is associated with phishing simulation record 1 , and e-mail alias k8fne9 ⁇ at> company ⁇ dot> com is associated with phishing simulation record 4 .
- phishing simulator 24 may send messages (e.g., phishing simulations and/or training materials) to certain e-mail aliases via network 28 .
- phishing simulator 24 may send a simulated attack to asp98r ⁇ at> company ⁇ dot> com and a simulated attack to k8fne9 ⁇ at> company ⁇ dot> com.
- forwarding device 30 may detect that one or more messages have been sent to an individual's e-mail alias. Relying upon a mapping from e-mail aliases to primary e-mail addresses provided in data store 16 , forwarding device 30 may forward the one or more messages to a primary e-mail address of the individual. More specifically, the one or more messages may be forwarded to an e-mail inbox of the individual, as identified by the primary e-mail address of the individual, via network 32 and the individual's client machine.
- a simulated attack sent to asp98r ⁇ at> company ⁇ dot> com may be forwarded to mary ⁇ at> company ⁇ dot> com
- a simulated attack sent to k8fne9 ⁇ at> company ⁇ dot> com may be forwarded to john ⁇ at> company ⁇ dot> com, in accordance with the mapping provided in data store 16 .
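The behavior of forwarding device 30 can be sketched as a lookup against data store 16 followed by delivery. This is an illustrative sketch: the function name `forward_message` and the `deliver` callable are assumptions, and plain `@`-form addresses stand in for the obfuscated examples above. The key point reproduced here is that the forwarding device consults only the alias-to-primary mapping, never the record-to-alias mapping.

```python
# Data store 16: alias -> primary address. Held separately from the
# record-to-alias mapping (data store 14) so that a breach of one store
# does not reveal the full chain from record to individual.
ALIAS_TO_PRIMARY = {
    "asp98r@company.com": "mary@company.com",
    "k8fne9@company.com": "john@company.com",
}

def forward_message(alias, message, deliver):
    """Sketch of forwarding device 30: look up the primary address for
    the alias and hand the message to a delivery callable. Returns
    False (dropping the message) if the alias has no mapping, i.e.,
    the alias has been rendered invalid."""
    primary = ALIAS_TO_PRIMARY.get(alias)
    if primary is None:
        return False
    deliver(primary, message)
    return True
```

A simulated attack addressed to asp98r@company.com is thus delivered to mary@company.com, while a message to a deactivated alias never reaches anyone.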
- one or more of the e-mail aliases may become invalid, preventing those individuals whose e-mail aliases have become invalid (or deactivated) from receiving any further messages from their respective e-mail aliases while their respective e-mail aliases are invalid.
- An e-mail alias may be rendered invalid by removing certain associations from the mapping provided in data store 16 .
- e-mail aliases need not be permanently deactivated. Instead, they could be deactivated at the end of one phishing simulation and reactivated during a subsequent phishing simulation.
- a primary reason for using e-mail aliases and rendering them inactive after a certain period of time is to thwart an attacker's attempt to exploit phishing simulation records (in the event that the attacker gains access to same).
- Even if the attacker has knowledge that an individual is highly susceptible to phishing attacks, such knowledge is of little use if the attacker has no way of contacting the individual (e.g., the attacker could attempt to send a phishing attack to an e-mail alias, but such attack would fail to reach the intended individual in the event that the e-mail alias has been rendered inactive).
- data store 14 is separate from data store 16 such that even if an attacker were to gain access to data store 14 , the attacker does not automatically also gain access to data store 16 .
- data store 14 may be physically separated from data store 16 (e.g., data store 14 and data store 16 may be separate devices and/or may be separated by network 28 ).
- phishing simulator 24 may be directly coupled to forwarding device 30 (i.e., network 28 is not present).
- the mapping present in data store 14 and the mapping present in data store 16 may be stored in a common data storage device.
- the mapping from e-mail aliases to primary e-mail addresses may be stored in an encrypted manner. As such, even if an attacker were to gain access to the phishing simulation records, the attacker would be unable to contact individuals associated with the phishing simulation records (assuming that the e-mail aliases have been rendered invalid).
- The training program may include one or more of the components of FIG. 2 : phishing simulation records 12 , record selector 22 , phishing simulator 24 , phishing simulation data store 26 and simulation record to alias mapping 14 .
- Forwarding device 30 and e-mail alias to primary e-mail mapping 16 may be present in a mail server which is coupled to the training program via network 28 .
- FIG. 4 depicts flow diagram 70 of a process to administer phishing simulations to individuals via e-mail aliases, according to one embodiment of the invention.
- For each individual, a phishing simulation record of the individual may be associated with an e-mail alias of the individual. Such association may be recorded in data store 14 , as described above.
- record selector 22 may determine which of the phishing simulation records satisfies a criterion.
- record selector 22 may select at least one of the phishing simulation records which satisfies the criterion.
- phishing simulator 24 may, for each of the selected phishing simulation records, send one or more messages to the individual associated with the selected phishing simulation record via that individual's e-mail alias.
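The FIG. 4 flow can be sketched end to end. This is a hedged illustration, not the claimed method: the function name, the dict-based record format, and the threshold criterion are assumptions. What the sketch preserves is the attribution-prevention property — the simulator addresses messages only to aliases (via data store 14), and primary addresses never enter this code path.

```python
def administer_simulation(records, record_to_alias, threshold, send):
    """Sketch of flow diagram 70: given records already associated with
    aliases (step 1, data store 14), determine which records satisfy
    the criterion (step 2), select them (step 3), and send messages to
    the associated individuals via their aliases only (step 4)."""
    sent = []
    for record in records:
        if record["susceptibility"] > threshold:        # steps 2-3
            alias = record_to_alias[record["id"]]        # data store 14
            send(alias, "simulated phishing message")    # step 4
            sent.append(alias)
    return sent
```

The `send` callable would hand messages to the network; the forwarding device, holding data store 16, completes delivery to the primary addresses.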
- FIG. 5 provides an example of computer system 100 that is representative of any of the client/server devices discussed herein. Further, computer system 100 is representative of a device that performs the process depicted in FIG. 4 . Note, not all of the various devices discussed herein may have all of the features of computer system 100 . For example, certain devices discussed above may not include a display inasmuch as the display function may be provided by a client computer communicatively coupled to computer system 100 or a display function may be unnecessary. Such details are not critical to the present invention.
- Computer system 100 includes a bus 102 or other communication mechanism for communicating information, and a processor 104 coupled with the bus 102 for processing information.
- Computer system 100 also includes a main memory 106 , such as a random access memory (RAM) or other dynamic storage device, coupled to the bus 102 for storing information and instructions to be executed by processor 104 .
- Main memory 106 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 104 .
- Computer system 100 further includes a read only memory (ROM) 108 or other static storage device coupled to the bus 102 for storing static information and instructions for the processor 104 .
- a storage device 110 which may be one or more of a floppy disk, a flexible disk, a hard disk, flash memory-based storage medium, magnetic tape or other magnetic storage medium, a compact disk (CD)-ROM, a digital versatile disk (DVD)-ROM, or other optical storage medium, or any other storage medium from which processor 104 can read, is provided and coupled to the bus 102 for storing information and instructions (e.g., operating systems, applications programs and the like).
- Computer system 100 may be coupled via the bus 102 to a display 112 , such as a flat panel display, for displaying information to a computer user.
- An input device 114 is coupled to the bus 102 for communicating information and command selections to the processor 104 .
- Another type of user input device is cursor control device 116 , such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 104 and for controlling cursor movement on the display 112 .
- Other user interface devices, such as microphones, speakers, etc. are not shown in detail but may be involved with the receipt of user input and/or presentation of output.
- The methods of the present invention may be implemented by processor 104 executing appropriate sequences of computer-readable instructions contained in main memory 106 . Such instructions may be read into main memory 106 from another computer-readable medium, such as storage device 110 , and execution of the sequences of instructions contained in the main memory 106 causes the processor 104 to perform the associated actions.
- In alternative embodiments, hard-wired circuitry or firmware-controlled processing units (e.g., field programmable gate arrays) may be used in place of or in combination with processor 104 and its associated computer software instructions to implement the invention.
- the computer-readable instructions may be rendered in any computer language including, without limitation, C#, C/C++, Fortran, COBOL, PASCAL, assembly language, markup languages (e.g., HTML, SGML, XML, VoXML), and the like, as well as object-oriented environments such as the Common Object Request Broker Architecture (CORBA), JavaTM and the like.
- Computer system 100 also includes a communication interface 118 coupled to the bus 102 .
- Communication interface 118 provides a two-way data communication channel with a computer network, which provides connectivity to and among the various computer systems discussed above.
- communication interface 118 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN, which itself is communicatively coupled to the Internet through one or more Internet service provider networks.
- The precise details of such communication paths are not critical to the present invention. What is important is that computer system 100 can send and receive messages and data through the communication interface 118 and in that way communicate with hosts accessible via the Internet.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Security & Cryptography (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Computing Systems (AREA)
- Theoretical Computer Science (AREA)
- Software Systems (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Information Transfer Between Computers (AREA)
- Data Exchanges In Wide-Area Networks (AREA)
- Health & Medical Sciences (AREA)
- Bioethics (AREA)
- General Health & Medical Sciences (AREA)
Abstract
Description
- This application is a continuation of application Ser. No. 14/160,443, filed Jan. 21, 2014, the entire contents of which are incorporated herein by reference.
- The present invention relates to methods, network devices and machine-readable media for preventing the malicious use of phishing simulation records, and more particularly relates to techniques for decoupling phishing simulation records from the contact information of individuals by means of an e-mail alias.
- In a phishing attack, an individual (e.g., a person, an employee of a company, a user of a computing device) receives a message, commonly in the form of an e-mail or other electronic communication, directing the individual to perform an action, such as opening an e-mail attachment or following (e.g., using a cursor controlled device or touch screen) an embedded link. If such message were from a trusted source (e.g., co-worker, bank, utility company or other well-known and trusted entity), such action might carry little risk. Nevertheless, in a phishing attack, such message is from an attacker (e.g., an individual using a computing device to perform a malicious act on another computer device user) disguised as a trusted source, and an unsuspecting individual, for example, opening an attachment to view a “friend's photograph” might in fact install spyware, a virus, and/or other malware (i.e., malicious computer software) on his/her computer. Similarly, an unsuspecting individual directed to a webpage made to look like an official banking webpage might be deceived into submitting his/her username, password, banking account number, etc. to an attacker.
- While there are computer programs designed to detect and block phishing e-mails, phishing attack methods are constantly being modified by attackers to evade such forms of detection. More recently, training programs have been developed to train users to recognize phishing attacks, such training involving simulated phishing attacks. While such training is beneficial, training programs may accumulate certain information about the users, which, if exploited by an attacker (e.g., were an attacker able to gain access to same), could cause great harm to the participants of the training programs. The present invention addresses such potential vulnerabilities of training programs.
- The inventors have realized that training programs (e.g., providing employees of a company with simulated phishing attacks, followed by training materials), may collect certain information that could be exploited by an attacker. For example, training programs may maintain a measure of each individual's susceptibility to simulated phishing attacks. If an attacker were to gain access to such information, the attacker could specifically target those individuals determined to be highly susceptible to phishing attacks. Indeed, it would be ironic, but nevertheless detrimental, that a program designed to protect individuals from phishing attacks could be exploited by an attacker to more effectively attack the individuals.
- One approach to addressing such vulnerability is to decouple any phishing simulation record of an individual from his/her personal information (e.g., name, birth date, age, gender, etc.) and/or contact information (e.g., mailing address, telephone number, mobile number, e-mail address, etc.). That way, even if an attacker were to gain access to phishing simulation records (e.g., records of the number of phishing simulations an individual falls victim to, which types of phishing simulations an individual falls victim to, a measure of an individual's susceptibility to phishing attacks), the attacker would not be able to utilize such information in a manner that harms the individuals associated with the phishing simulation records.
- At the same time, a training program is posed with the conflicting need to associate such phishing simulation records of individuals with those individuals' contact information. Upon identifying those individuals most susceptible to phishing attacks, a training program would ideally be able to provide those individuals with targeted and/or additional training materials.
- To satisfy both goals of protecting simulation records from being exploited by an attacker and allowing a training program to provide individuals with targeted and/or additional training materials, the inventors propose, in one embodiment of the invention, to associate each phishing simulation record of an individual with an e-mail alias of the individual. Any messages (e.g., simulated attacks, training materials) sent to the e-mail alias would be forwarded to a primary e-mail address of the individual, enabling the proper operation of a training program. Such e-mail alias, however, would be rendered invalid after a certain time period (e.g., after a simulation program has been completed) so that even if an attacker were to gain access to the phishing simulation records, the attacker would not be able to exploit same.
- These and further embodiments of the present invention are discussed herein.
- The present invention is illustrated by way of example, and not limitation, in the figures of the accompanying drawings in which:
- FIG. 1 depicts a schematic illustrating a mapping from phishing simulation records to e-mail aliases and a mapping from e-mail aliases to primary e-mail addresses, according to one embodiment of the invention;
- FIG. 2 depicts a system diagram of components used in the administration of phishing simulations to individuals, according to one embodiment of the invention;
- FIG. 3 depicts a specific example of how phishing simulations are administered to individuals via their e-mail aliases, according to one embodiment of the invention;
- FIG. 4 depicts a flow diagram of a process to administer phishing simulations to individuals via their e-mail aliases, according to one embodiment of the invention; and
- FIG. 5 depicts components of a computer system in which computer readable instructions instantiating the methods of the present invention may be stored and executed.
- In the following detailed description of the preferred embodiments, reference is made to the accompanying drawings that form a part hereof, and in which are shown by way of illustration specific embodiments in which the invention may be practiced. It is understood that other embodiments may be utilized and structural changes may be made without departing from the scope of the present invention.
- As depicted in schematic 10 of
FIG. 1, phishing simulation records 12 may be associated with e-mail aliases (e.g., asp98r <at> company <dot> com, 983jas <at> company <dot> com, etc.) via mapping 14. In the example of FIG. 1, phishing simulation records of four individuals are depicted, and each of the phishing simulation records is associated with an e-mail alias of the corresponding individual. While four phishing simulation records are depicted for ease of discussion, any number of phishing simulation records may be present. For example, phishing simulation record 1 is associated with the e-mail alias asp98r <at> company <dot> com. In turn, each of the e-mail aliases is associated with a primary e-mail address via mapping 16. For example, the e-mail alias asp98r <at> company <dot> com is associated with the primary e-mail address mary <at> company <dot> com. It is noted that, in the example provided in FIG. 1, the domain names of the e-mail aliases and the primary e-mail addresses are the same (i.e., both are company <dot> com), but this is not necessarily so. In another example, an e-mail alias could be asp98r <at> company <dot> com, while the primary e-mail address associated with such e-mail alias could be mary <at> gmail <dot> com. - The phrase “primary e-mail address”, in embodiments of the invention, refers to a more permanent e-mail address of an individual. This could be a company e-mail address, a personal e-mail address, etc. The primary e-mail address often incorporates one or more of the individual's first name, last name, nickname or other identifier of the individual in the local-part of the e-mail address (i.e., where the “local-part” of an e-mail address is the sequence of characters before the “<at>” symbol), but this is not always so. In practice, a person could have more than one primary e-mail address. For instance, a person could have a Gmail address for personal use and a company e-mail address for professional use. 
Either (or both) of these e-mail addresses could be considered a primary e-mail address.
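As a concrete (hypothetical) sketch, the two mappings of FIG. 1 can be represented as simple lookup tables. The record identifiers are illustrative; the alias and primary addresses are the ones used as examples in the figures, written in ordinary e-mail syntax:

```python
record_to_alias = {            # mapping 14: simulation record -> e-mail alias
    "record-1": "asp98r@company.com",
    "record-4": "k8fne9@company.com",
}
alias_to_primary = {           # mapping 16: e-mail alias -> primary address
    "asp98r@company.com": "mary@company.com",
    "k8fne9@company.com": "john@company.com",
}

def primary_for_record(record_id: str) -> str:
    """Resolve a phishing simulation record to a primary e-mail address
    by composing mapping 14 with mapping 16."""
    return alias_to_primary[record_to_alias[record_id]]
```

Note that a party holding only a record and its alias learns nothing about the individual's primary address without access to mapping 16.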
- An e-mail alias is a forwarding e-mail address (i.e., messages sent to an e-mail alias of an individual are forwarded to the primary e-mail address of the individual). An e-mail alias of an individual may be established after the individual's primary e-mail address has been established, but this is not always so. Other than these functional and/or temporal distinctions, an e-mail alias may be quite similar to a primary e-mail address. Like a primary e-mail address, an e-mail alias may incorporate one or more of the individual's first name, last name, nickname and other identifier of the individual. Like a primary e-mail address, an e-mail alias could be in use for a long period of time.
- However, e-mail aliases, in accordance with embodiments of the present invention, may be constructed in a more restrictive and/or limited fashion than e-mail aliases currently in use. Typically, an e-mail alias of an individual, in accordance with embodiments of the present invention, does not incorporate any characteristic that may be associated with the identity of the individual (e.g., does not include the individual's first or last name, initials, nickname, birthday, etc.) and/or any other characteristic that could be used by an attacker to determine the identity/contact information of the individual. In practice, the local part of an e-mail alias may include a randomly generated sequence of alpha-numeric characters (e.g., “aa039js”). The local part of an e-mail alias may also include special characters (e.g., !, #, $, etc.) in addition to alpha-numeric characters, although there may be restrictions on the use of these special characters. Such details may be found in RFC 5322 and RFC 6531 and will not be discussed in further detail herein. 
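A minimal sketch of alias generation under these constraints might look as follows. The six-character length and the lowercase-letters-and-digits alphabet are illustrative choices (a conservative subset of the characters permitted by RFC 5322), not requirements stated above:

```python
import secrets
import string

def make_alias(domain: str, length: int = 6) -> str:
    """Generate an e-mail alias whose local part is a randomly generated
    sequence of alpha-numeric characters, carrying no characteristic
    that could be associated with the individual's identity."""
    alphabet = string.ascii_lowercase + string.digits
    local_part = "".join(secrets.choice(alphabet) for _ in range(length))
    return local_part + "@" + domain
```

The `secrets` module is used rather than `random` so that the local part is not predictable from a seeded pseudo-random generator.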
FIG. 1 provides several example e-mail aliases which are suitable for protecting the identity/contact information of an individual. For instance, without the knowledge of mapping 16, there really would be no way for an attacker (or anyone for that matter) to ascertain the primary e-mail address associated with the e-mail alias asp98r <at> company <dot> com. - Typically, an e-mail alias, in accordance with embodiments of the present invention, is active (e.g., able to send/receive messages) for a limited duration of time (e.g., 1 hour, 1 day, etc.). When an e-mail alias is active, any messages sent to the e-mail alias of an individual may be forwarded to the primary e-mail address of the individual. When an e-mail alias is inactive, any messages sent to the e-mail alias may not be forwarded to the associated primary e-mail address.
- More particularly, the duration of time that an e-mail alias is active may correspond to the time during which a phishing simulation is being conducted. Before a phishing simulation begins, an e-mail alias may be created for and assigned to an individual. During the phishing simulation, phishing simulations and/or training material may be sent to the individual via the individual's e-mail alias. Any responses from the individual may also be received via the e-mail alias. More specifically, the individual may use his/her primary e-mail address to send a message (e.g., reply to a phishing simulation). Such message may then be forwarded from the primary e-mail address to the e-mail alias, so that the training program receives any response from the individual via his/her e-mail alias rather than via his/her primary e-mail address. Such technique decouples the training program from any primary e-mail addresses of individuals of the training program, precluding any information collected by the training program from being used to mount an attack on the individuals. When the phishing simulation concludes, the e-mail alias may be made inactive.
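The activation window described above might be tracked as in the following sketch. The class design is an assumption, and the one-hour lifetime is simply the example duration mentioned in the text:

```python
from datetime import datetime, timedelta

class AliasWindow:
    """Tracks whether an e-mail alias is active. Messages sent to the
    alias would be forwarded only while the current time falls inside
    the window; outside it, the alias is treated as inactive."""
    def __init__(self, created: datetime, lifetime: timedelta):
        self.created = created
        self.expires = created + lifetime

    def is_active(self, now: datetime) -> bool:
        return self.created <= now < self.expires

# Illustrative: an alias created at the start of a simulation,
# active for one hour.
window = AliasWindow(datetime(2014, 1, 21, 9, 0), timedelta(hours=1))
```

A forwarding component could consult `is_active` before delivering, yielding the behavior described above: forward while active, drop while inactive.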
-
FIG. 2 depicts system diagram 20 of components used in the administration of phishing simulations to individuals, according to one embodiment of the invention. Phishing simulation records 12 (of FIG. 2) is a more compact representation of the collection of phishing simulation records 12 (of FIG. 1). More specifically, a phishing simulation record may comprise a measure of an individual's susceptibility to phishing attacks. The measure may be a number from 1 to 10, with 1 indicating low susceptibility and 10 indicating high susceptibility. Alternatively, the measure may be a percentage from 0% to 100%, with 0% indicating that an individual has fallen victim to none of the phishing simulations and 100% indicating that the individual has fallen victim to all of the phishing simulations. Alternatively and/or in addition, a phishing simulation record may comprise the number of phishing simulations that an individual has fallen victim to. Alternatively and/or in addition, a phishing simulation record may indicate whether an individual has received and/or has reviewed training materials provided by the training program. - As depicted in
FIG. 2, phishing simulation records 12 may be communicatively coupled to record selector 22. Record selector 22, in one embodiment of the invention, may determine which of the phishing simulation records satisfies a criterion. For example, record selector 22 may determine which of the phishing simulation records has a measure of phishing susceptibility that exceeds a certain threshold. As another example, record selector 22 may determine which of the phishing simulation records contain a record of individuals falling victim to more than ten phishing simulations. Record selector 22 may then select at least one of the phishing simulation records that satisfies the criterion. In one instance, record selector 22 may select all of the phishing simulation records that satisfy the criterion. As a specific example, record selector 22 may select “Phishing Simulation Record 1” and “Phishing Simulation Record 4”, as depicted in process 50 of FIG. 3. - As depicted in
FIG. 2, record selector 22 may be communicatively coupled to phishing simulator 24. Based on information provided by record selector 22, phishing simulator 24 may be instructed to provide phishing simulations and/or training materials to individuals associated with certain phishing simulation records. Phishing simulator 24 may retrieve specific phishing simulations and/or training materials from phishing simulation data store 26, those simulations and/or materials retrieved being properly matched to an individual associated with a selected phishing simulation record. For example, based on information from a phishing simulation record that an individual consistently fails to recognize phishing simulations with personalized salutations, phishing simulator 24 may provide that individual with training materials designed to increase his/her awareness of phishing simulations with personalized salutations. -
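The percentage measure and the selection criterion can be combined in a small sketch. The record layout and the 50% threshold are illustrative assumptions, not values mandated above:

```python
def susceptibility_percent(fell_victim: int, total: int) -> float:
    """Percentage measure: 0% means the individual fell victim to none
    of the phishing simulations, 100% means all of them."""
    return 100.0 * fell_victim / total if total else 0.0

def select_records(records, threshold: float):
    """Sketch of record selector 22: keep the records whose measure of
    phishing susceptibility exceeds the threshold."""
    return [r for r in records
            if susceptibility_percent(r["victim"], r["total"]) > threshold]

records = [
    {"id": "record-1", "victim": 8, "total": 10},   # 80%
    {"id": "record-2", "victim": 1, "total": 10},   # 10%
    {"id": "record-4", "victim": 7, "total": 10},   # 70%
]
selected = select_records(records, threshold=50.0)
# record-1 and record-4 are selected, matching the example of FIG. 3
```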
Phishing simulator 24 may access data store 14, which stores a mapping from phishing simulation records to e-mail aliases, in order to determine an e-mail address through which an individual associated with a phishing simulation record can be contacted. As a specific example, phishing simulator 24 may access data store 14 to determine that e-mail alias asp98r <at> company <dot> com is associated with phishing simulation record 1, and e-mail alias k8fne9 <at> company <dot> com is associated with phishing simulation record 4. Based on information from record selector 22, phishing simulation data store 26 and simulation-record-to-alias mapping data store 14, phishing simulator 24 may send messages (e.g., phishing simulations and/or training materials) to certain e-mail aliases via network 28. Continuing with the specific example provided in FIG. 3, phishing simulator 24 may send a simulated attack to asp98r <at> company <dot> com and a simulated attack to k8fne9 <at> company <dot> com. - Subsequently, forwarding
device 30 may detect that one or more messages have been sent to an individual's e-mail alias. Relying upon a mapping from e-mail aliases to primary e-mail addresses provided in data store 16, forwarding device 30 may forward the one or more messages to a primary e-mail address of the individual. More specifically, the one or more messages may be forwarded to an e-mail inbox of the individual, as identified by the primary e-mail address of the individual, via network 32 and the individual's client machine. Returning to the specific example of FIG. 3, a simulated attack sent to asp98r <at> company <dot> com may be forwarded to mary <at> company <dot> com, and a simulated attack sent to k8fne9 <at> company <dot> com may be forwarded to john <at> company <dot> com, in accordance with the mapping provided in data store 16. - After phishing simulations have concluded (or after a certain time duration has elapsed from the instant the e-mail aliases were created), one or more of the e-mail aliases may become invalid, preventing the affected individuals from receiving any further messages via their respective e-mail aliases while those aliases are invalid. An e-mail alias may be rendered invalid by removing certain associations from the mapping provided in
data store 16. For instance, to render the e-mail alias asp98r <at> company <dot> com invalid, one may remove the association between asp98r <at> company <dot> com and mary <at> company <dot> com. Alternatively, such association from e-mail alias to primary e-mail address could be preserved in data store 16, but forwarding device 30 could be instructed to (temporarily) stop forwarding any messages from asp98r <at> company <dot> com to mary <at> company <dot> com. Indeed, e-mail aliases need not be permanently deactivated. Instead, they could be deactivated at the end of one phishing simulation and reactivated during a subsequent phishing simulation. - As discussed above, a primary reason for using e-mail aliases and rendering them inactive after a certain period of time is to thwart an attacker's attempt to exploit phishing simulation records (in the event that the attacker gains access to same). In accordance with techniques of one embodiment of the invention, even if the attacker has knowledge that an individual is highly susceptible to phishing attacks, such knowledge is of little use if the attacker has no way of contacting the individual (e.g., the attacker could attempt to send a phishing attack to an e-mail alias, but such attack would fail to reach the intended individual in the event that the e-mail alias has been rendered inactive).
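The second option, suspending forwarding while preserving the association in data store 16, might look like the following sketch; the class and method names are assumptions for illustration:

```python
class ForwardingDevice:
    """Sketch of forwarding device 30: forwards a message sent to an
    alias to the mapped primary address. A suspended alias keeps its
    entry in the mapping (data store 16) but is treated as invalid."""
    def __init__(self, alias_to_primary):
        self.alias_to_primary = dict(alias_to_primary)  # mapping 16
        self.suspended = set()

    def deactivate(self, alias):
        """Temporarily stop forwarding for this alias."""
        self.suspended.add(alias)

    def reactivate(self, alias):
        """Resume forwarding, e.g., for a subsequent simulation."""
        self.suspended.discard(alias)

    def forward(self, alias, message):
        """Return (primary address, message), or None when the alias is
        unknown or currently suspended (message is dropped)."""
        primary = self.alias_to_primary.get(alias)
        if primary is None or alias in self.suspended:
            return None
        return (primary, message)

device = ForwardingDevice({"asp98r@company.com": "mary@company.com"})
device.deactivate("asp98r@company.com")
# while suspended, a simulated (or real) attack sent to the alias
# never reaches mary@company.com
```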
- An underlying assumption in
FIG. 2 is that data store 14 is separate from data store 16 such that even if an attacker were to gain access to data store 14, the attacker does not automatically also gain access to data store 16. In one embodiment of the invention, data store 14 may be physically separated from data store 16 (e.g., data store 14 and data store 16 may be separate devices and/or may be separated by network 28). - In a variation of
FIG. 2 ,phishing simulator 24 may be directly coupled to forwarding device 30 (i.e.,network 28 is not present). In such embodiment, the mapping present indata store 14 and the mapping present indata store 16 may be stored in a common data storage device. To thwart an attacker from gaining knowledge of the association between phishing simulation records and primary e-mail addresses (and subsequently attacking individuals who participate in the training program), the mapping from e-mail aliases to primary e-mail addresses may be stored in an encrypted manner. As such, even if in attacker were to gain access to the phishing simulation records, the attacker will be unable to contact individuals associated with the phishing simulation records (assuming that the e-mail aliases have been rendered invalid). - In the discussion above, references have been made to a “training program”. Such “training program” may include one or more of the components of
FIG. 2: phishing simulation records 12, record selector 22, phishing simulator 24, phishing simulations 26 and simulation record to alias mapping 14. Forwarding device 30 and e-mail alias to primary e-mail mapping 16 may be present in a mail server which is coupled to the training program via network 28. -
FIG. 4 depicts flow diagram 70 of a process to administer phishing simulations to individuals via e-mail aliases, according to one embodiment of the invention. At step 72, for each individual, a phishing simulation record of the individual may be associated with an e-mail alias of the individual. Such association may be recorded in data store 14, as described above. At step 74, record selector 22 may determine which of the phishing simulation records satisfies a criterion. At step 76, record selector 22 may select at least one of the phishing simulation records which satisfies the criterion. Finally, at step 78, phishing simulator 24 may, for each of the selected phishing simulation records, send one or more messages to the individual associated with the selected phishing simulation record via that individual's e-mail alias. - While embodiments of the present invention have been described in the context of preventing an attacker from maliciously using phishing simulation records, there may be other contexts for which decoupling a phishing simulation record from an individual's personal/contact information using an e-mail alias would be beneficial. For instance, privacy laws or a company's Chief Privacy Officer may want to preclude phishing susceptibility attribution. That is, a company's objective is typically to reduce its employees' susceptibility to phishing attacks, and not necessarily to specifically know who is most susceptible.
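Steps 72 through 78 can be sketched end to end as follows. The criterion, record layout, and `send` callback are illustrative assumptions; in the system described above, `send` would hand the message to phishing simulator 24 for delivery over network 28:

```python
def run_campaign(records, record_to_alias, criterion, send):
    """Sketch of FIG. 4: records are already associated with aliases
    (step 72, the record_to_alias mapping of data store 14); the records
    satisfying the criterion are determined and selected (steps 74-76);
    each selected individual is then messaged via his/her e-mail alias
    (step 78). Returns the IDs of the selected records."""
    selected = [r for r in records if criterion(r)]
    for record in selected:
        send(record_to_alias[record["id"]], "simulated phishing message")
    return [r["id"] for r in selected]

sent = []
run_campaign(
    records=[{"id": "record-1", "susceptibility": 9},
             {"id": "record-2", "susceptibility": 2}],
    record_to_alias={"record-1": "asp98r@company.com",
                     "record-2": "983jas@company.com"},
    criterion=lambda r: r["susceptibility"] > 7,
    send=lambda alias, message: sent.append((alias, message)),
)
```

Note that `run_campaign` never touches a primary e-mail address: only the forwarding side (mapping 16) can resolve an alias to an individual.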
- As is apparent from the foregoing discussion, aspects of the present invention involve the use of various computer systems and computer readable storage media having computer-readable instructions stored thereon.
FIG. 5 provides an example of computer system 100 that is representative of any of the client/server devices discussed herein. Further, computer system 100 is representative of a device that performs the process depicted in FIG. 4. Note that not all of the various devices discussed herein may have all of the features of computer system 100. For example, certain devices discussed above may not include a display inasmuch as the display function may be provided by a client computer communicatively coupled to computer system 100, or a display function may be unnecessary. Such details are not critical to the present invention. -
Computer system 100 includes a bus 102 or other communication mechanism for communicating information, and a processor 104 coupled with the bus 102 for processing information. Computer system 100 also includes a main memory 106, such as a random access memory (RAM) or other dynamic storage device, coupled to the bus 102 for storing information and instructions to be executed by processor 104. Main memory 106 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 104. Computer system 100 further includes a read only memory (ROM) 108 or other static storage device coupled to the bus 102 for storing static information and instructions for the processor 104. A storage device 110, which may be one or more of a floppy disk, a flexible disk, a hard disk, flash memory-based storage medium, magnetic tape or other magnetic storage medium, a compact disk (CD)-ROM, a digital versatile disk (DVD)-ROM, or other optical storage medium, or any other storage medium from which processor 104 can read, is provided and coupled to the bus 102 for storing information and instructions (e.g., operating systems, applications programs and the like). -
Computer system 100 may be coupled via the bus 102 to a display 112, such as a flat panel display, for displaying information to a computer user. An input device 114, such as a keyboard including alphanumeric and other keys, is coupled to the bus 102 for communicating information and command selections to the processor 104. Another type of user input device is cursor control device 116, such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 104 and for controlling cursor movement on the display 112. Other user interface devices, such as microphones, speakers, etc. are not shown in detail but may be involved with the receipt of user input and/or presentation of output. - The processes referred to herein may be implemented by
processor 104 executing appropriate sequences of computer-readable instructions contained in main memory 106. Such instructions may be read into main memory 106 from another computer-readable medium, such as storage device 110, and execution of the sequences of instructions contained in the main memory 106 causes the processor 104 to perform the associated actions. In alternative embodiments, hard-wired circuitry or firmware-controlled processing units (e.g., field programmable gate arrays) may be used in place of or in combination with processor 104 and its associated computer software instructions to implement the invention. The computer-readable instructions may be rendered in any computer language including, without limitation, C#, C/C++, Fortran, COBOL, PASCAL, assembly language, markup languages (e.g., HTML, SGML, XML, VoXML), and the like, as well as object-oriented environments such as the Common Object Request Broker Architecture (CORBA), Java™ and the like. In general, all of the aforementioned terms are meant to encompass any series of logical steps performed in a sequence to accomplish a given purpose, which is the hallmark of any computer-executable application. Unless specifically stated otherwise, it should be appreciated that throughout the description of the present invention, use of terms such as “processing”, “computing”, “calculating”, “determining”, “displaying” or the like, refer to the action and processes of an appropriately programmed computer system, such as computer system 100 or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within its registers and memories into other data similarly represented as physical quantities within its memories or registers or other such information storage, transmission or display devices. -
Computer system 100 also includes a communication interface 118 coupled to the bus 102. Communication interface 118 provides a two-way data communication channel with a computer network, which provides connectivity to and among the various computer systems discussed above. For example, communication interface 118 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN, which itself is communicatively coupled to the Internet through one or more Internet service provider networks. The precise details of such communication paths are not critical to the present invention. What is important is that computer system 100 can send and receive messages and data through the communication interface 118 and in that way communicate with hosts accessible via the Internet. - Thus, methods, network devices and machine-readable media for preventing malicious use of phishing simulation records have been described. It is to be understood that the above description is intended to be illustrative, and not restrictive. Many other embodiments will be apparent to those of skill in the art upon reviewing the above description. The scope of the invention should, therefore, be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.
Claims (24)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/044,099 US20160164898A1 (en) | 2014-01-21 | 2016-02-15 | Simulated phishing result anonymization and attribution prevention |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/160,443 US9262629B2 (en) | 2014-01-21 | 2014-01-21 | Methods and systems for preventing malicious use of phishing simulation records |
US15/044,099 US20160164898A1 (en) | 2014-01-21 | 2016-02-15 | Simulated phishing result anonymization and attribution prevention |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/160,443 Continuation US9262629B2 (en) | 2014-01-21 | 2014-01-21 | Methods and systems for preventing malicious use of phishing simulation records |
Publications (1)
Publication Number | Publication Date |
---|---|
US20160164898A1 true US20160164898A1 (en) | 2016-06-09 |
Family
ID=52001846
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/160,443 Active 2034-05-10 US9262629B2 (en) | 2014-01-21 | 2014-01-21 | Methods and systems for preventing malicious use of phishing simulation records |
US14/223,820 Active - Reinstated US8910287B1 (en) | 2014-01-21 | 2014-03-24 | Methods and systems for preventing malicious use of phishing simulation records |
US15/044,099 Abandoned US20160164898A1 (en) | 2014-01-21 | 2016-02-15 | Simulated phishing result anonymization and attribution prevention |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/160,443 Active 2034-05-10 US9262629B2 (en) | 2014-01-21 | 2014-01-21 | Methods and systems for preventing malicious use of phishing simulation records |
US14/223,820 Active - Reinstated US8910287B1 (en) | 2014-01-21 | 2014-03-24 | Methods and systems for preventing malicious use of phishing simulation records |
Country Status (1)
Country | Link |
---|---|
US (3) | US9262629B2 (en) |
Families Citing this family (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9824609B2 (en) | 2011-04-08 | 2017-11-21 | Wombat Security Technologies, Inc. | Mock attack cybersecurity training system and methods |
US9558677B2 (en) | 2011-04-08 | 2017-01-31 | Wombat Security Technologies, Inc. | Mock attack cybersecurity training system and methods |
US10749887B2 (en) | 2011-04-08 | 2020-08-18 | Proofpoint, Inc. | Assessing security risks of users in a computing network |
US8966637B2 (en) | 2013-02-08 | 2015-02-24 | PhishMe, Inc. | Performance benchmarking for simulated phishing attacks |
US9253207B2 (en) | 2013-02-08 | 2016-02-02 | PhishMe, Inc. | Collaborative phishing attack detection |
US9356948B2 (en) | 2013-02-08 | 2016-05-31 | PhishMe, Inc. | Collaborative phishing attack detection |
US9398038B2 (en) | 2013-02-08 | 2016-07-19 | PhishMe, Inc. | Collaborative phishing attack detection |
US9262629B2 (en) | 2014-01-21 | 2016-02-16 | PhishMe, Inc. | Methods and systems for preventing malicious use of phishing simulation records |
US9882932B2 (en) * | 2014-04-02 | 2018-01-30 | Deep Detection, Llc | Automated spear phishing system |
EP3254258A4 (en) * | 2015-02-05 | 2018-09-19 | Phishline, LLC | Social engineering simulation workflow appliance |
AU2016215226A1 (en) | 2015-02-05 | 2017-08-17 | Phishline, Llc | Social engineering simulation workflow appliance |
US9906539B2 (en) | 2015-04-10 | 2018-02-27 | PhishMe, Inc. | Suspicious message processing and incident response |
US9635052B2 (en) * | 2015-05-05 | 2017-04-25 | Christopher J. HADNAGY | Phishing as-a-service (PHaas) used to increase corporate security awareness |
CN106200891B (en) | 2015-05-08 | 2019-09-06 | Alibaba Group Holding Ltd. | Show the method, apparatus and system of user interface |
US10110623B2 (en) | 2015-07-22 | 2018-10-23 | Bank Of America Corporation | Delaying phishing communication |
US9749359B2 (en) | 2015-07-22 | 2017-08-29 | Bank Of America Corporation | Phishing campaign ranker |
US9942249B2 (en) | 2015-07-22 | 2018-04-10 | Bank Of America Corporation | Phishing training tool |
US9729573B2 (en) | 2015-07-22 | 2017-08-08 | Bank Of America Corporation | Phishing campaign ranker |
US9825974B2 (en) | 2015-07-22 | 2017-11-21 | Bank Of America Corporation | Phishing warning tool |
US10110628B2 (en) | 2015-07-22 | 2018-10-23 | Bank Of America Corporation | Phishing source tool |
US11010717B2 (en) * | 2016-06-21 | 2021-05-18 | The Prudential Insurance Company Of America | Tool for improving network security |
GB2553427B (en) * | 2016-08-02 | 2021-09-15 | Sophos Ltd | Identifying and remediating phishing security weaknesses |
US9774626B1 (en) | 2016-08-17 | 2017-09-26 | Wombat Security Technologies, Inc. | Method and system for assessing and classifying reported potentially malicious messages in a cybersecurity system |
US9781149B1 (en) | 2016-08-17 | 2017-10-03 | Wombat Security Technologies, Inc. | Method and system for reducing reporting of non-malicious electronic messages in a cybersecurity system |
US9912687B1 (en) | 2016-08-17 | 2018-03-06 | Wombat Security Technologies, Inc. | Advanced processing of electronic messages with attachments in a cybersecurity system |
US9876753B1 (en) | 2016-12-22 | 2018-01-23 | Wombat Security Technologies, Inc. | Automated message security scanner detection system |
US10708297B2 (en) | 2017-08-25 | 2020-07-07 | Ecrime Management Strategies, Inc. | Security system for detection and mitigation of malicious communications |
US10924517B2 (en) | 2018-02-07 | 2021-02-16 | Sophos Limited | Processing network traffic based on assessed security weaknesses |
CN110119433B (en) * | 2019-05-13 | 2021-06-08 | Shanghai Lianshang Network Technology Co., Ltd. | Method and apparatus for predicting gender |
US11240272B2 (en) | 2019-07-24 | 2022-02-01 | Bank Of America Corporation | User responses to cyber security threats |
CN111770086B (en) * | 2020-06-28 | 2023-10-10 | Shenzhen Qianhai WeBank Co., Ltd. | Phishing user simulation collection method, device, system and computer-readable storage medium |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060218000A1 (en) * | 2005-03-24 | 2006-09-28 | Smith Gregory P | System and method for providing collaboration communities in a computer portal environment |
US20140222928A1 (en) * | 2013-02-06 | 2014-08-07 | Msc Intellectual Properties B.V. | System and method for authorship disambiguation and alias resolution in electronic data |
Family Cites Families (102)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5892903A (en) | 1996-09-12 | 1999-04-06 | Internet Security Systems, Inc. | Method and apparatus for detecting and identifying security vulnerabilities in an open network computer communication system |
US6954858B1 (en) | 1999-12-22 | 2005-10-11 | Kimberly Joyce Welborn | Computer virus avoidance system and mechanism |
US7281031B1 (en) | 2000-03-22 | 2007-10-09 | Emc Corporation | Method and apparatus for providing additional resources for a host computer |
US20110238855A1 (en) | 2000-09-25 | 2011-09-29 | Yevgeny Korsunsky | Processing data flows with a data flow processor |
US20020091940A1 (en) | 2001-01-05 | 2002-07-11 | Welborn Christopher Michael | E-mail user behavior modification system and mechanism for computer virus avoidance |
US7603709B2 (en) | 2001-05-03 | 2009-10-13 | Computer Associates Think, Inc. | Method and apparatus for predicting and preventing attacks in communications networks |
US7325252B2 (en) | 2001-05-18 | 2008-01-29 | Achilles Guard Inc. | Network security testing |
US7509675B2 (en) | 2002-05-29 | 2009-03-24 | At&T Intellectual Property I, L.P. | Non-invasive monitoring of the effectiveness of electronic security services |
US8407798B1 (en) | 2002-10-01 | 2013-03-26 | Skybox Security Inc. | Method for simulation aided security event management |
US7152244B2 (en) | 2002-12-31 | 2006-12-19 | America Online, Inc. | Techniques for detecting and preventing unintentional disclosures of sensitive data |
US7685631B1 (en) | 2003-02-05 | 2010-03-23 | Microsoft Corporation | Authentication of a server by a client to prevent fraudulent user interfaces |
US7451487B2 (en) | 2003-09-08 | 2008-11-11 | Sonicwall, Inc. | Fraudulent message detection |
US7373385B2 (en) | 2003-11-03 | 2008-05-13 | Cloudmark, Inc. | Method and apparatus to block spam based on spam reports from a community of users |
US9076132B2 (en) | 2003-11-07 | 2015-07-07 | Emc Corporation | System and method of addressing email and electronic communication fraud |
US20050132225A1 (en) | 2003-12-16 | 2005-06-16 | Glenn Gearhart | Method and system for cyber-security vulnerability detection and compliance measurement (CDCM) |
US20050183143A1 (en) | 2004-02-13 | 2005-08-18 | Anderholm Eric J. | Methods and systems for monitoring user, application or device activity |
US7971246B1 (en) | 2004-04-29 | 2011-06-28 | James A. Roskind | Identity theft countermeasures |
US8041769B2 (en) | 2004-05-02 | 2011-10-18 | Markmonitor Inc. | Generating phish messages |
US20070107053A1 (en) | 2004-05-02 | 2007-05-10 | Markmonitor, Inc. | Enhanced responses to online fraud |
US7457823B2 (en) | 2004-05-02 | 2008-11-25 | Markmonitor Inc. | Methods and systems for analyzing data related to possible online fraud |
US7748040B2 (en) | 2004-07-12 | 2010-06-29 | Architecture Technology Corporation | Attack correlation using marked information |
US7490356B2 (en) | 2004-07-20 | 2009-02-10 | Reflectent Software, Inc. | End user risk management |
WO2006036763A2 (en) | 2004-09-22 | 2006-04-06 | Cyberdefender Corporation | System for distributing information using a secure peer-to-peer network |
US20060168066A1 (en) | 2004-11-10 | 2006-07-27 | David Helsper | Email anti-phishing inspector |
US8291065B2 (en) | 2004-12-02 | 2012-10-16 | Microsoft Corporation | Phishing detection, prevention, and notification |
US7634810B2 (en) | 2004-12-02 | 2009-12-15 | Microsoft Corporation | Phishing detection, prevention, and notification |
ATE548841T1 (en) | 2005-01-14 | 2012-03-15 | Bae Systems Plc | NETWORK BASED SECURITY SYSTEM |
US7617532B1 (en) | 2005-01-24 | 2009-11-10 | Symantec Corporation | Protection of sensitive data from malicious e-mail |
US20060174119A1 (en) | 2005-02-03 | 2006-08-03 | Xin Xu | Authenticating destinations of sensitive data in web browsing |
US7904518B2 (en) | 2005-02-15 | 2011-03-08 | Gytheion Networks Llc | Apparatus and method for analyzing and filtering email and for providing web related services |
JP2006285844A (en) | 2005-04-04 | 2006-10-19 | Katsuyoshi Nagashima | Phishing fraud prevention system |
US7841003B1 (en) | 2005-05-04 | 2010-11-23 | Capital One Financial Corporation | Phishing solution method |
US7788723B2 (en) | 2005-05-17 | 2010-08-31 | Computer Associates Think, Inc. | Method and apparatus for identifying computer vulnerabilities using exploit probes and remote scanning |
US20060271631A1 (en) | 2005-05-25 | 2006-11-30 | Microsoft Corporation | Categorizing mails by safety level |
US7681234B2 (en) | 2005-06-30 | 2010-03-16 | Microsoft Corporation | Preventing phishing attacks |
US8181232B2 (en) | 2005-07-29 | 2012-05-15 | Citicorp Development Center, Inc. | Methods and systems for secure user authentication |
US8726353B2 (en) | 2005-11-01 | 2014-05-13 | Qinetiq Limited | Secure computer use system |
US20070136806A1 (en) | 2005-12-14 | 2007-06-14 | Aladdin Knowledge Systems Ltd. | Method and system for blocking phishing scams |
US8839418B2 (en) | 2006-01-18 | 2014-09-16 | Microsoft Corporation | Finding phishing sites |
US20080288303A1 (en) | 2006-03-17 | 2008-11-20 | Claria Corporation | Method for Detecting and Preventing Fraudulent Internet Advertising Activity |
US20070245422A1 (en) | 2006-04-18 | 2007-10-18 | Softrun, Inc. | Phishing-Prevention Method Through Analysis of Internet Website to be Accessed and Storage Medium Storing Computer Program Source for Executing the Same |
US8775919B2 (en) | 2006-04-25 | 2014-07-08 | Adobe Systems Incorporated | Independent actionscript analytics tools and techniques |
US7668921B2 (en) | 2006-05-30 | 2010-02-23 | Xerox Corporation | Method and system for phishing detection |
FR2902546B1 (en) | 2006-06-16 | 2008-12-26 | Olfeo Sarl | METHOD AND SYSTEM FOR PROCESSING SECURITY DATA OF A COMPUTER NETWORK. |
US20080047017A1 (en) | 2006-06-23 | 2008-02-21 | Martin Renaud | System and method for dynamically assessing security risks attributed to a computer user's behavior |
US8220047B1 (en) | 2006-08-09 | 2012-07-10 | Google Inc. | Anti-phishing system and method |
US20080037791A1 (en) | 2006-08-09 | 2008-02-14 | Jakobsson Bjorn M | Method and apparatus for evaluating actions performed on a client device |
US20080037583A1 (en) * | 2006-08-09 | 2008-02-14 | Postini, Inc. | Unified management policy for multiple format electronic communications |
US9177314B2 (en) * | 2006-08-14 | 2015-11-03 | Chijioke Chukwuemeka UZO | Method of making secure electronic payments using communications devices and biometric data |
US7987495B2 (en) | 2006-12-26 | 2011-07-26 | Computer Associates Think, Inc. | System and method for multi-context policy management |
US8468244B2 (en) | 2007-01-05 | 2013-06-18 | Digital Doors, Inc. | Digital information infrastructure and method for security designated data and with granular data stores |
US8209381B2 (en) | 2007-01-19 | 2012-06-26 | Yahoo! Inc. | Dynamic combatting of SPAM and phishing attacks |
US8327421B2 (en) * | 2007-01-30 | 2012-12-04 | Imprivata, Inc. | System and method for identity consolidation |
US20080288330A1 (en) | 2007-05-14 | 2008-11-20 | Sailpoint Technologies, Inc. | System and method for user access risk scoring |
US8464346B2 (en) | 2007-05-24 | 2013-06-11 | Iviz Techno Solutions Pvt. Ltd | Method and system simulating a hacking attack on a network |
US8528064B2 (en) | 2007-06-22 | 2013-09-03 | Springo Incorporated | Web based system that allows users to log into websites without entering username and password information |
US8849909B2 (en) | 2007-07-06 | 2014-09-30 | Yahoo! Inc. | Real-time asynchronous event aggregation systems |
US20090089859A1 (en) | 2007-09-28 | 2009-04-02 | Cook Debra L | Method and apparatus for detecting phishing attempts solicited by electronic mail |
US8286243B2 (en) | 2007-10-23 | 2012-10-09 | International Business Machines Corporation | Blocking intrusion attacks at an offending host |
US8608487B2 (en) | 2007-11-29 | 2013-12-17 | Bank Of America Corporation | Phishing redirect for consumer education: fraud detection |
US8332918B2 (en) | 2007-12-06 | 2012-12-11 | Novell, Inc. | Techniques for real-time adaptive password policies |
US8645516B2 (en) | 2008-02-22 | 2014-02-04 | Accenture Global Services Limited | System for analyzing user activity in a collaborative environment |
US8171559B2 (en) | 2008-03-13 | 2012-05-01 | International Business Machines Corporation | Detecting a phishing entity in a virtual universe |
US8365246B2 (en) | 2008-03-18 | 2013-01-29 | International Business Machines Corporation | Protecting confidential information on network sites based on security awareness |
US9130986B2 (en) | 2008-03-19 | 2015-09-08 | Websense, Inc. | Method and system for protection against information stealing software |
US20090259725A1 (en) | 2008-04-14 | 2009-10-15 | Case Western Reserve University | Email consumer reputation |
US8321934B1 (en) | 2008-05-05 | 2012-11-27 | Symantec Corporation | Anti-phishing early warning system based on end user data submission statistics |
US9123027B2 (en) | 2010-10-19 | 2015-09-01 | QinetiQ North America, Inc. | Social engineering protection appliance |
US20090282112A1 (en) | 2008-05-12 | 2009-11-12 | Cloudmark, Inc. | Spam identification system |
US8423483B2 (en) | 2008-05-16 | 2013-04-16 | Carnegie Mellon University | User-controllable learning of policies |
US20090319247A1 (en) | 2008-06-18 | 2009-12-24 | Eads Na Defense Security And Systems Solutions Inc | Systems and Methods for A Simulated Network Environment and Operation Thereof |
US20090328208A1 (en) | 2008-06-30 | 2009-12-31 | International Business Machines | Method and apparatus for preventing phishing attacks |
US20100042687A1 (en) | 2008-08-12 | 2010-02-18 | Yahoo! Inc. | System and method for combating phishing |
US8321516B2 (en) | 2008-09-30 | 2012-11-27 | Aol Inc. | Systems and methods for creating and updating reputation records |
US20100125911A1 (en) | 2008-11-17 | 2010-05-20 | Prakash Bhaskaran | Risk Scoring Based On Endpoint User Activities |
US20100154055A1 (en) | 2008-12-12 | 2010-06-17 | At&T Intellectual Property I, L.P. | Prefix Domain Matching for Anti-Phishing Pattern Matching |
US20100205014A1 (en) * | 2009-02-06 | 2010-08-12 | Cary Sholer | Method and system for providing response services |
US20100211641A1 (en) | 2009-02-16 | 2010-08-19 | Microsoft Corporation | Personalized email filtering |
US8429751B2 (en) | 2009-03-13 | 2013-04-23 | Trustwave Holdings, Inc. | Method and apparatus for phishing and leeching vulnerability detection |
US8296376B2 (en) | 2009-03-26 | 2012-10-23 | International Business Machines Corporation | Utilizing E-mail response time statistics for more efficient and effective user communication |
US8769695B2 (en) | 2009-04-30 | 2014-07-01 | Bank Of America Corporation | Phish probability scoring model |
US8356001B2 (en) | 2009-05-19 | 2013-01-15 | Xybersecure, Inc. | Systems and methods for application-level security |
US8438642B2 (en) | 2009-06-05 | 2013-05-07 | At&T Intellectual Property I, L.P. | Method of detecting potential phishing by analyzing universal resource locators |
US20110030059A1 (en) | 2009-07-30 | 2011-02-03 | Greenwald Lloyd G | Method for testing the security posture of a system |
US20110035317A1 (en) | 2009-08-07 | 2011-02-10 | Mark Carlson | Seedless anti phishing authentication using transaction history |
GB2461422B (en) | 2009-09-01 | 2010-12-08 | Postalguard Ltd | Method for Detecting and Blocking Phishing Attacks |
US9742778B2 (en) | 2009-09-09 | 2017-08-22 | International Business Machines Corporation | Differential security policies in email systems |
US10157280B2 (en) | 2009-09-23 | 2018-12-18 | F5 Networks, Inc. | System and method for identifying security breach attempts of a website |
US20110093546A1 (en) | 2009-10-15 | 2011-04-21 | Bryan Rubingh | Method and system for sorting electronic communications |
US8271007B2 (en) | 2010-01-06 | 2012-09-18 | Alcatel Lucent | Managing SMS spoofing using SMPP protocol |
US9038187B2 (en) | 2010-01-26 | 2015-05-19 | Bank Of America Corporation | Insider threat correlation tool |
US8910279B2 (en) | 2010-03-10 | 2014-12-09 | Sonicwall, Inc. | Reputation-based threat protection |
US8793799B2 (en) * | 2010-11-16 | 2014-07-29 | Booz, Allen & Hamilton | Systems and methods for identifying and mitigating information security risks |
US9547998B2 (en) * | 2011-04-08 | 2017-01-17 | Wombat Security Technologies, Inc. | Context-aware training systems, apparatuses, and methods |
JP5691853B2 (en) | 2011-06-02 | 2015-04-01 | Fujitsu Ltd. | Access monitoring program, information processing apparatus, and access monitoring method |
CN102902917A (en) | 2011-07-29 | 2013-01-30 | 国际商业机器公司 | Method and system for preventing phishing attacks |
US8484741B1 (en) * | 2012-01-27 | 2013-07-09 | Chapman Technology Group, Inc. | Software service to facilitate organizational testing of employees to determine their potential susceptibility to phishing scams |
RU2510982C2 (en) | 2012-04-06 | 2014-04-10 | Kaspersky Lab ZAO | User evaluation system and method for message filtering |
US9253207B2 (en) | 2013-02-08 | 2016-02-02 | PhishMe, Inc. | Collaborative phishing attack detection |
US9053326B2 (en) * | 2013-02-08 | 2015-06-09 | PhishMe, Inc. | Simulated phishing attack with sequential messages |
US8966637B2 (en) | 2013-02-08 | 2015-02-24 | PhishMe, Inc. | Performance benchmarking for simulated phishing attacks |
US9262629B2 (en) | 2014-01-21 | 2016-02-16 | PhishMe, Inc. | Methods and systems for preventing malicious use of phishing simulation records |
- 2014
  - 2014-01-21 US US14/160,443 patent/US9262629B2/en active Active
  - 2014-03-24 US US14/223,820 patent/US8910287B1/en active Active - Reinstated
- 2016
  - 2016-02-15 US US15/044,099 patent/US20160164898A1/en not_active Abandoned
Cited By (131)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9813454B2 (en) | 2014-08-01 | 2017-11-07 | Wombat Security Technologies, Inc. | Cybersecurity training system with automated application of branded content |
US10469519B2 (en) | 2016-02-26 | 2019-11-05 | KnowBe4, Inc | Systems and methods for performing of creating simulated phishing attacks and phishing attack campaigns |
US11777977B2 (en) | 2016-02-26 | 2023-10-03 | KnowBe4, Inc. | Systems and methods for performing or creating simulated phishing attacks and phishing attack campaigns |
US10021126B2 (en) | 2016-02-26 | 2018-07-10 | KnowBe4, Inc. | Systems and methods for creating and running heterogeneous phishing attack campaigns |
US10855716B2 (en) | 2016-02-26 | 2020-12-01 | KnowBe4, Inc. | Systems and methods for performing or creating simulated phishing attacks and phishing attack campaigns |
US10291649B2 (en) | 2016-06-28 | 2019-05-14 | KnowBe4, Inc. | Systems and methods for performing a simulated phishing attack |
US10826937B2 (en) | 2016-06-28 | 2020-11-03 | KnowBe4, Inc. | Systems and methods for performing a simulated phishing attack |
US11552991B2 (en) | 2016-06-28 | 2023-01-10 | KnowBe4, Inc. | Systems and methods for performing a simulated phishing attack |
US11431747B2 (en) | 2016-10-31 | 2022-08-30 | KnowBe4, Inc. | Systems and methods for an artificial intelligence driven agent |
US10855714B2 (en) | 2016-10-31 | 2020-12-01 | KnowBe4, Inc. | Systems and methods for an artificial intelligence driven agent |
US11616801B2 (en) | 2016-10-31 | 2023-03-28 | KnowBe4, Inc. | Systems and methods for an artificial intelligence driven smart template |
US20180159888A1 (en) * | 2016-10-31 | 2018-06-07 | KnowBe4, Inc. | Systems and methods for an artificial intelligence driven smart template |
US11632387B2 (en) | 2016-10-31 | 2023-04-18 | KnowBe4, Inc. | Systems and methods for an artificial intelligence driven smart template |
US11075943B2 (en) | 2016-10-31 | 2021-07-27 | KnowBe4, Inc. | Systems and methods for an artificial intelligence driven agent |
US10880325B2 (en) | 2016-10-31 | 2020-12-29 | KnowBe4, Inc. | Systems and methods for an artificial intelligence driven smart template |
US10764317B2 (en) * | 2016-10-31 | 2020-09-01 | KnowBe4, Inc. | Systems and methods for an artificial intelligence driven smart template |
US10165006B2 (en) | 2017-01-05 | 2018-12-25 | KnowBe4, Inc. | Systems and methods for performing simulated phishing attacks using social engineering indicators |
US11936688B2 (en) | 2017-01-05 | 2024-03-19 | KnowBe4, Inc. | Systems and methods for performing simulated phishing attacks using social engineering indicators |
US11070587B2 (en) | 2017-01-05 | 2021-07-20 | KnowBe4, Inc. | Systems and methods for performing simulated phishing attacks using social engineering indicators |
US11601470B2 (en) | 2017-01-05 | 2023-03-07 | KnowBe4, Inc. | Systems and methods for performing simulated phishing attacks using social engineering indicators |
US10581912B2 (en) | 2017-01-05 | 2020-03-03 | KnowBe4, Inc. | Systems and methods for performing simulated phishing attacks using social engineering indicators |
US11792225B2 (en) | 2017-04-06 | 2023-10-17 | KnowBe4, Inc. | Systems and methods for subscription management of specific classification groups based on user's actions |
US10581911B2 (en) | 2017-04-06 | 2020-03-03 | KnowBe4, Inc. | Systems and methods for subscription management of specific classification groups based on user's actions |
US10715551B1 (en) | 2017-04-06 | 2020-07-14 | KnowBe4, Inc. | Systems and methods for subscription management of specific classification groups based on user's actions |
US10158668B2 (en) | 2017-04-06 | 2018-12-18 | KnowBe4, Inc. | Systems and methods for subscription management of specific classification groups based on user's actions |
US11489869B2 (en) | 2017-04-06 | 2022-11-01 | KnowBe4, Inc. | Systems and methods for subscription management of specific classification groups based on user's actions |
US9906555B1 (en) | 2017-04-06 | 2018-02-27 | KnowBe4, Inc. | Systems and methods for subscription management of specific classification groups based on user's actions |
US10581868B2 (en) | 2017-04-21 | 2020-03-03 | KnowBe4, Inc. | Using smart groups for computer-based security awareness training systems |
US10812493B2 (en) | 2017-04-21 | 2020-10-20 | KnowBe4, Inc. | Using smart groups for computer-based security awareness training systems |
US11349849B2 (en) | 2017-04-21 | 2022-05-31 | KnowBe4, Inc. | Using smart groups for computer-based security awareness training systems |
US11122051B2 (en) | 2017-04-21 | 2021-09-14 | KnowBe4, Inc. | Using smart groups for computer-based security awareness training systems |
US11240261B2 (en) | 2017-05-08 | 2022-02-01 | KnowBe4, Inc. | Systems and methods for providing user interfaces based on actions associated with untrusted emails |
US11930028B2 (en) | 2017-05-08 | 2024-03-12 | KnowBe4, Inc. | Systems and methods for providing user interfaces based on actions associated with untrusted emails |
US10659487B2 (en) | 2017-05-08 | 2020-05-19 | KnowBe4, Inc. | Systems and methods for providing user interfaces based on actions associated with untrusted emails |
US10362047B2 (en) | 2017-05-08 | 2019-07-23 | KnowBe4, Inc. | Systems and methods for providing user interfaces based on actions associated with untrusted emails |
US10243904B1 (en) | 2017-05-26 | 2019-03-26 | Wombat Security Technologies, Inc. | Determining authenticity of reported user action in cybersecurity risk assessment |
US10778626B2 (en) | 2017-05-26 | 2020-09-15 | Proofpoint, Inc. | Determining authenticity of reported user action in cybersecurity risk assessment |
US11599838B2 (en) | 2017-06-20 | 2023-03-07 | KnowBe4, Inc. | Systems and methods for creating and commissioning a security awareness program |
US11343276B2 (en) | 2017-07-13 | 2022-05-24 | KnowBe4, Inc. | Systems and methods for discovering and alerting users of potentially hazardous messages |
US11295010B2 (en) | 2017-07-31 | 2022-04-05 | KnowBe4, Inc. | Systems and methods for using attribute data for system protection and security awareness training |
US11847208B2 (en) | 2017-07-31 | 2023-12-19 | KnowBe4, Inc. | Systems and methods for using attribute data for system protection and security awareness training |
US10657248B2 (en) | 2017-07-31 | 2020-05-19 | KnowBe4, Inc. | Systems and methods for using attribute data for system protection and security awareness training |
US10986125B2 (en) | 2017-12-01 | 2021-04-20 | KnowBe4, Inc. | Systems and methods for AIDA based A/B testing |
US10313387B1 (en) * | 2017-12-01 | 2019-06-04 | KnowBe4, Inc. | Time based triggering of dynamic templates |
US10616275B2 (en) | 2017-12-01 | 2020-04-07 | KnowBe4, Inc. | Systems and methods for situational localization of AIDA |
US11799909B2 (en) | 2017-12-01 | 2023-10-24 | KnowBe4, Inc. | Systems and methods for situational localization of AIDA |
US10581910B2 (en) | 2017-12-01 | 2020-03-03 | KnowBe4, Inc. | Systems and methods for AIDA based A/B testing |
US10893071B2 (en) | 2017-12-01 | 2021-01-12 | KnowBe4, Inc. | Systems and methods for AIDA based grouping |
US11799906B2 (en) | 2017-12-01 | 2023-10-24 | KnowBe4, Inc. | Systems and methods for artificial intelligence driven agent campaign controller |
US10917433B2 (en) | 2017-12-01 | 2021-02-09 | KnowBe4, Inc. | Systems and methods for artificial model building techniques |
US10917434B1 (en) | 2017-12-01 | 2021-02-09 | KnowBe4, Inc. | Systems and methods for AIDA based second chance |
US10917432B2 (en) | 2017-12-01 | 2021-02-09 | KnowBe4, Inc. | Systems and methods for artificial intelligence driven agent campaign controller |
US10826938B2 (en) | 2017-12-01 | 2020-11-03 | KnowBe4, Inc. | Systems and methods for aida based role models |
US11777986B2 (en) | 2017-12-01 | 2023-10-03 | KnowBe4, Inc. | Systems and methods for AIDA based exploit selection |
US10673895B2 (en) | 2017-12-01 | 2020-06-02 | KnowBe4, Inc. | Systems and methods for AIDA based grouping |
US10348761B2 (en) | 2017-12-01 | 2019-07-09 | KnowBe4, Inc. | Systems and methods for situational localization of AIDA |
US11736523B2 (en) | 2017-12-01 | 2023-08-22 | KnowBe4, Inc. | Systems and methods for aida based A/B testing |
US11677784B2 (en) | 2017-12-01 | 2023-06-13 | KnowBe4, Inc. | Systems and methods for AIDA based role models |
US11048804B2 (en) | 2017-12-01 | 2021-06-29 | KnowBe4, Inc. | Systems and methods for AIDA campaign controller intelligent records |
US10348762B2 (en) | 2017-12-01 | 2019-07-09 | KnowBe4, Inc. | Systems and methods for serving module |
US10332003B1 (en) | 2017-12-01 | 2019-06-25 | KnowBe4, Inc. | Systems and methods for AIDA based analytics and reporting |
US10839083B2 (en) | 2017-12-01 | 2020-11-17 | KnowBe4, Inc. | Systems and methods for AIDA campaign controller intelligent records |
US11627159B2 (en) | 2017-12-01 | 2023-04-11 | KnowBe4, Inc. | Systems and methods for AIDA based grouping |
US10264018B1 (en) | 2017-12-01 | 2019-04-16 | KnowBe4, Inc. | Systems and methods for artificial model building techniques |
US10257225B1 (en) | 2017-12-01 | 2019-04-09 | KnowBe4, Inc. | Systems and methods for artificial intelligence driven agent campaign controller |
US10812527B2 (en) | 2017-12-01 | 2020-10-20 | KnowBe4, Inc. | Systems and methods for aida based second chance |
US11140199B2 (en) | 2017-12-01 | 2021-10-05 | KnowBe4, Inc. | Systems and methods for AIDA based role models |
US11206288B2 (en) | 2017-12-01 | 2021-12-21 | KnowBe4, Inc. | Systems and methods for AIDA based grouping |
US11212311B2 (en) | 2017-12-01 | 2021-12-28 | KnowBe4, Inc. | Time based triggering of dynamic templates |
US10812529B2 (en) | 2017-12-01 | 2020-10-20 | KnowBe4, Inc. | Systems and methods for AIDA based A/B testing |
US11876828B2 (en) | 2017-12-01 | 2024-01-16 | KnowBe4, Inc. | Time based triggering of dynamic templates |
US10715549B2 (en) | 2017-12-01 | 2020-07-14 | KnowBe4, Inc. | Systems and methods for AIDA based role models |
US10679164B2 (en) | 2017-12-01 | 2020-06-09 | KnowBe4, Inc. | Systems and methods for using artificial intelligence driven agent to automate assessment of organizational vulnerabilities |
US10681077B2 (en) | 2017-12-01 | 2020-06-09 | KnowBe4, Inc. | Time based triggering of dynamic templates |
US11297102B2 (en) | 2017-12-01 | 2022-04-05 | KnowBe4, Inc. | Systems and methods for situational localization of AIDA |
US11552992B2 (en) | 2017-12-01 | 2023-01-10 | KnowBe4, Inc. | Systems and methods for artificial model building techniques |
US11494719B2 (en) | 2017-12-01 | 2022-11-08 | KnowBe4, Inc. | Systems and methods for using artificial intelligence driven agent to automate assessment of organizational vulnerabilities |
US10009375B1 (en) | 2017-12-01 | 2018-06-26 | KnowBe4, Inc. | Systems and methods for artificial model building techniques |
US11334673B2 (en) | 2017-12-01 | 2022-05-17 | KnowBe4, Inc. | Systems and methods for AIDA campaign controller intelligent records |
US10701106B2 (en) | 2018-03-20 | 2020-06-30 | KnowBe4, Inc. | System and methods for reverse vishing and point of failure remedial training |
US11457041B2 (en) | 2018-03-20 | 2022-09-27 | KnowBe4, Inc. | System and methods for reverse vishing and point of failure remedial training |
US10237302B1 (en) | 2018-03-20 | 2019-03-19 | KnowBe4, Inc. | System and methods for reverse vishing and point of failure remedial training |
US11108792B2 (en) | 2018-05-16 | 2021-08-31 | KnowBe4, Inc. | Systems and methods for determining individual and group risk scores |
US11503050B2 (en) | 2018-05-16 | 2022-11-15 | KnowBe4, Inc. | Systems and methods for determining individual and group risk scores |
US10868820B2 (en) | 2018-05-16 | 2020-12-15 | KnowBe4, Inc. | Systems and methods for determining individual and group risk scores |
US10673876B2 (en) | 2018-05-16 | 2020-06-02 | KnowBe4, Inc. | Systems and methods for determining individual and group risk scores |
US11349853B2 (en) | 2018-05-16 | 2022-05-31 | KnowBe4, Inc. | Systems and methods for determining individual and group risk scores |
US11677767B2 (en) | 2018-05-16 | 2023-06-13 | KnowBe4, Inc. | Systems and methods for determining individual and group risk scores |
US11036848B2 (en) | 2018-09-19 | 2021-06-15 | KnowBe4, Inc. | System and methods for minimizing organization risk from users associated with a password breach |
US11640457B2 (en) | 2018-09-19 | 2023-05-02 | KnowBe4, Inc. | System and methods for minimizing organization risk from users associated with a password breach |
US10540493B1 (en) | 2018-09-19 | 2020-01-21 | KnowBe4, Inc. | System and methods for minimizing organization risk from users associated with a password breach |
US11316892B2 (en) | 2018-09-26 | 2022-04-26 | KnowBe4, Inc. | System and methods for spoofed domain identification and user training |
US10673894B2 (en) | 2018-09-26 | 2020-06-02 | KnowBe4, Inc. | System and methods for spoofed domain identification and user training |
US11902324B2 (en) | 2018-09-26 | 2024-02-13 | KnowBe4, Inc. | System and methods for spoofed domain identification and user training |
US11729203B2 (en) | 2018-11-02 | 2023-08-15 | KnowBe4, Inc. | System and methods of cybersecurity attack simulation for incident response training and awareness |
US10979448B2 (en) | 2018-11-02 | 2021-04-13 | KnowBe4, Inc. | Systems and methods of cybersecurity attack simulation for incident response training and awareness |
US11108791B2 (en) | 2018-12-15 | 2021-08-31 | KnowBe4, Inc. | System and methods for efficient combining of malware detection rules |
US10812507B2 (en) | 2018-12-15 | 2020-10-20 | KnowBe4, Inc. | System and methods for efficient combining of malware detection rules |
US11902302B2 (en) | 2018-12-15 | 2024-02-13 | KnowBe4, Inc. | Systems and methods for efficient combining of characteristic detection rules |
US11108821B2 (en) | 2019-05-01 | 2021-08-31 | KnowBe4, Inc. | Systems and methods for use of address fields in a simulated phishing attack |
US11729212B2 (en) | 2019-05-01 | 2023-08-15 | KnowBe4, Inc. | Systems and methods for use of address fields in a simulated phishing attack |
US11856025B2 (en) | 2019-09-10 | 2023-12-26 | KnowBe4, Inc. | Systems and methods for simulated phishing attacks involving message threads |
US11108822B2 (en) | 2019-09-10 | 2021-08-31 | KnowBe4, Inc. | Systems and methods for simulated phishing attacks involving message threads |
US11418541B2 (en) | 2019-09-10 | 2022-08-16 | KnowBe4, Inc. | Systems and methods for simulated phishing attacks involving message threads |
US11870806B1 (en) * | 2019-10-31 | 2024-01-09 | Rapid7, Inc. | Phishing attack training systems and methods |
US11269994B2 (en) | 2020-02-07 | 2022-03-08 | KnowBe4, Inc. | Systems and methods for providing configurable responses to threat identification |
US11381541B2 (en) | 2020-02-07 | 2022-07-05 | KnowBe4, Inc. | Systems and methods for communication with a third-party email client plug-in |
US11500984B2 (en) | 2020-02-07 | 2022-11-15 | KnowBe4, Inc. | Systems and methods for providing configurable responses to threat identification |
US11625689B2 (en) | 2020-04-02 | 2023-04-11 | KnowBe4, Inc. | Systems and methods for human resources applications of security awareness testing |
US11641375B2 (en) | 2020-04-29 | 2023-05-02 | KnowBe4, Inc. | Systems and methods for reporting based simulated phishing campaign |
US11936687B2 (en) | 2020-05-22 | 2024-03-19 | KnowBe4, Inc. | Systems and methods for end-user security awareness training for calendar-based threats |
US11336688B2 (en) | 2020-05-22 | 2022-05-17 | KnowBe4, Inc. | Systems and methods for end-user security awareness training for calendar-based threats |
US11902317B2 (en) | 2020-06-19 | 2024-02-13 | KnowBe4, Inc. | Systems and methods for determining a job score from a job title |
US10992699B1 (en) | 2020-06-19 | 2021-04-27 | KnowBe4, Inc. | Systems and methods for determining a job score from a job title |
US11297093B2 (en) | 2020-06-19 | 2022-04-05 | KnowBe4, Inc. | Systems and methods for determining a job score from a job title |
US11496514B2 (en) | 2020-07-31 | 2022-11-08 | KnowBe4, Inc. | Systems and methods for security awareness using ad-based simulated phishing attacks |
US11729206B2 (en) | 2020-08-24 | 2023-08-15 | KnowBe4, Inc. | Systems and methods for effective delivery of simulated phishing campaigns |
US10917429B1 (en) * | 2020-08-24 | 2021-02-09 | KnowBe4, Inc. | Systems and methods for effective delivery of simulated phishing campaigns |
US11038914B1 (en) | 2020-08-24 | 2021-06-15 | KnowBe4, Inc. | Systems and methods for effective delivery of simulated phishing campaigns |
US11552982B2 (en) * | 2020-08-24 | 2023-01-10 | KnowBe4, Inc. | Systems and methods for effective delivery of simulated phishing campaigns |
US20220060495A1 (en) * | 2020-08-24 | 2022-02-24 | KnowBe4, Inc. | Systems and methods for effective delivery of simulated phishing campaigns |
US11356480B2 (en) | 2020-08-26 | 2022-06-07 | KnowBe4, Inc. | Systems and methods of simulated phishing campaign contextualization |
US11847579B2 (en) | 2020-08-28 | 2023-12-19 | KnowBe4, Inc. | Systems and methods for adaptation of SCORM packages at runtime with an extended LMS |
US10949760B1 (en) | 2020-08-28 | 2021-03-16 | KnowBe4, Inc. | Systems and methods for adaptation of SCORM packages at runtime with an extended LMS |
US11599810B2 (en) | 2020-08-28 | 2023-03-07 | KnowBe4, Inc. | Systems and methods for adaptation of SCORM packages at runtime with an extended LMS |
US11297095B1 (en) | 2020-10-30 | 2022-04-05 | KnowBe4, Inc. | Systems and methods for determination of level of security to apply to a group before display of user data |
US11503067B2 (en) | 2020-10-30 | 2022-11-15 | KnowBe4, Inc. | Systems and methods for determination of level of security to apply to a group before display of user data |
US11943253B2 (en) | 2020-10-30 | 2024-03-26 | KnowBe4, Inc. | Systems and methods for determination of level of security to apply to a group before display of user data |
US11552984B2 (en) | 2020-12-10 | 2023-01-10 | KnowBe4, Inc. | Systems and methods for improving assessment of security risk based on personal internet account data |
US11563767B1 (en) | 2021-09-02 | 2023-01-24 | KnowBe4, Inc. | Automated effective template generation |
US11997136B2 (en) | 2022-11-07 | 2024-05-28 | KnowBe4, Inc. | Systems and methods for security awareness using ad-based simulated phishing attacks |
Also Published As
Publication number | Publication date |
---|---|
US9262629B2 (en) | 2016-02-16 |
US8910287B1 (en) | 2014-12-09 |
US20150205953A1 (en) | 2015-07-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9262629B2 (en) | Methods and systems for preventing malicious use of phishing simulation records | |
Gupta et al. | Defending against phishing attacks: taxonomy of methods, current issues and future directions | |
Yang et al. | Use of phishing training to improve security warning compliance: Evidence from a field experiment | |
Chaudhry et al. | Phishing attacks and defenses | |
US9686297B2 (en) | Malicious message detection and processing | |
US11645943B2 (en) | Method and apparatus for training email recipients against phishing attacks using real threats in realtime | |
Al-Hamar et al. | Enterprise Credential Spear-phishing attack detection | |
US11563757B2 (en) | System and method for email account takeover detection and remediation utilizing AI models | |
US11665195B2 (en) | System and method for email account takeover detection and remediation utilizing anonymized datasets | |
Rutherford | The changing face of phishing | |
Alrabaee et al. | Boosting students and teachers cybersecurity awareness during COVID-19 pandemic | |
Kauer et al. | A comparison of American and German folk models of home computer security | |
Grobler et al. | Towards a cyber security aware rural community | |
Alghenaim et al. | Awareness of phishing attacks in the public sector: Review types and technical approaches | |
Bowles et al. | The first 10 years of the Trojan Horse defence | |
GB2550657A (en) | A method of protecting a user from messages with links to malicious websites | |
US11212245B1 (en) | Detection of forged e-mail messages at e-mail gateway | |
WO2016044065A1 (en) | Malicious message detection and processing | |
Baadel et al. | Avoiding the phishing bait: The need for conventional countermeasures for mobile users | |
Rowe et al. | Deception in cyber attacks | |
Singh et al. | Phishing: A computer security threat | |
Kshetri et al. | A review and analysis of online crime in pre and post COVID scenario with respective counter measures and security strategies |
Norris | Mitigating the effects of doxing | |
Shanker et al. | Cyber threat landscape in cyber space | |
Trivedi et al. | Analysis and impact of cyber threats on online social networks |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SILICON VALLEY BANK, CALIFORNIA Free format text: SECURITY INTEREST;ASSIGNOR:PHISHME INC.;REEL/FRAME:040222/0684 Effective date: 20121015 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
|
AS | Assignment |
Owner name: ORIX GROWTH CAPITAL, LLC, TEXAS Free format text: SECURITY INTEREST;ASSIGNOR:COFENSE INC.;REEL/FRAME:050478/0889 Effective date: 20181227 |
|
AS | Assignment |
Owner name: COFENSE, INC., VIRGINIA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:SILICON VALLEY BANK;REEL/FRAME:050616/0262 Effective date: 20190930 |