US20230164166A1 - Systems and methods for effective delivery of simulated phishing campaigns - Google Patents


Info

Publication number
US20230164166A1
Authority
US
United States
Prior art keywords
user
personal
risk score
personal information
organization
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US18/094,628
Inventor
Mark William Patton
Daniel Cormier
Greg Kras
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Knowbe4 Inc
Original Assignee
Knowbe4 Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Knowbe4 Inc filed Critical Knowbe4 Inc
Priority to US18/094,628 priority Critical patent/US20230164166A1/en
Assigned to KnowBe4, Inc. reassignment KnowBe4, Inc. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KRAS, GREG
Publication of US20230164166A1 publication Critical patent/US20230164166A1/en
Abandoned legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00 Network architectures or network communication protocols for network security
    • H04L63/14 Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic
    • H04L63/1433 Vulnerability analysis
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00 Network architectures or network communication protocols for network security
    • H04L63/14 Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic
    • H04L63/1441 Countermeasures against malicious traffic
    • H04L63/1483 Countermeasures against malicious traffic service impersonation, e.g. phishing, pharming or web spoofing
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/02 Protocols based on web technology, e.g. hypertext transfer protocol [HTTP]

Definitions

  • the present disclosure relates to systems and methods for improving assessment of security risk that users pose to an organization based on their personal internet account data.
  • Cybersecurity incidents such as phishing attacks may cost organizations the loss of confidential and/or important information, as well as the expense of mitigating losses due to breaches of confidential information. Such incidents can also cause customers to lose trust in the organizations.
  • the incidence of cybersecurity attacks and the cost of mitigating the damage they cause are increasing every year.
  • Organizations invest in cybersecurity tools such as antivirus, anti-ransomware, anti-phishing, and other platforms. Such cybersecurity tools may detect and intercept known cybersecurity attacks.
  • social engineering attacks or new threats may not be readily detectable by such tools, and the organizations may have to rely on their employees to recognize such threats.
  • a social engineering attack is an attack that exploits human behavior to gain access to an organization's systems through attack vectors such as phishing emails.
  • a phishing email may include content to be presented to a user, where the content is chosen to convince the user that the phishing email is genuine and that the user should interact with it.
  • among cybersecurity attacks, organizations have recognized phishing attacks and social engineering attacks as among the most prominent threats, capable of causing serious data breaches involving confidential information such as intellectual property, financial information, organizational information, and other important information.
  • Attackers who launch phishing attacks and social engineering attacks may attempt to evade an organization's security apparatuses and tools and target its employees.
  • the organizations may conduct security awareness training programs for their employees, along with other security measures. Through security awareness training programs, the organizations actively educate their employees on how to spot and report a suspected phishing attack.
  • These organizations may operate the security awareness training programs through their in-house cybersecurity teams or may utilize third-party entities who are experts in cybersecurity matters to conduct such training.
  • the employees may follow best practices of cybersecurity hygiene and comply with security regulations while working in offices. At times, for example while working remotely, the employees may not follow these best practices and may not comply with security regulations, partly because they do not feel watched.
  • An organization's security may also be affected by its employees' behavior outside the organization.
  • the employees' behavior when in their personal domain may directly or indirectly jeopardize the organization's security. Examples of such behaviors include using the organization's email address for personal purposes and password reuse between work and personal accounts.
  • the present disclosure generally relates to systems and methods for improving assessment of security risk that users pose to an organization based on their personal internet account data or other personal information.
  • a method of using personal information of a user for determining a personal risk score of the user of an organization includes receiving, by a security awareness system configured on one or more servers, personal information of a user or registration of personal information of a user of an organization, performing, by the security awareness system, at least one of an exposure check or a security audit of the personal information of the user, and adjusting, by the security awareness system, a personal risk score of the user based at least on a result of one of the exposure check or the security audit.
  • the method includes verifying, by the security awareness system, an email address identified by the personal information as used in a personal domain of the user.
  • the method includes storing, by the security awareness system, the email address used in the personal domain of the user in association with a profile of the user for the security awareness system.
  • the method includes registering the personal information with the security awareness system in response to the email address used in the personal domain of the user being verified.
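The verification-then-registration step above can be sketched with a token-based email confirmation flow. This is a minimal illustration under assumed mechanics (an HMAC token embedded in a verification link); the disclosure does not specify how the email address is verified, and all names here are hypothetical.

```python
import hashlib
import hmac
import secrets

# Per-deployment secret used to sign verification tokens (assumption).
SERVER_KEY = secrets.token_bytes(32)

def issue_verification_token(email: str) -> str:
    """Create a token to embed in a verification link sent to `email`."""
    nonce = secrets.token_hex(8)
    mac = hmac.new(SERVER_KEY, f"{email}:{nonce}".encode(), hashlib.sha256)
    return f"{nonce}.{mac.hexdigest()}"

def confirm_verification(email: str, token: str) -> bool:
    """Check that the token the user followed matches the one issued for `email`."""
    try:
        nonce, digest = token.split(".")
    except ValueError:
        return False
    expected = hmac.new(SERVER_KEY, f"{email}:{nonce}".encode(), hashlib.sha256)
    return hmac.compare_digest(expected.hexdigest(), digest)
```

On successful confirmation, the system would proceed to register the personal information against the user's profile.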
  • the method includes storing, by the security awareness system, the personal information in an obfuscated form.
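The patent does not specify the obfuscation scheme; one plausible sketch is storing a salted one-way digest of each personal identifier so the raw value never sits in the profile store. The salt handling and field names below are assumptions.

```python
import hashlib
import os

def obfuscate(value: str, salt: bytes) -> str:
    """Return a salted SHA-256 digest of a normalized personal identifier."""
    return hashlib.sha256(salt + value.lower().encode()).hexdigest()

# Example profile record holding only the obfuscated form (illustrative).
profile_salt = os.urandom(16)
record = {
    "user_id": 42,
    "personal_email": obfuscate("jane@example.com", profile_salt),
}
```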
  • the method includes performing an exposure check by searching using at least one of an email address or a username in the personal information for breached user information in one or more breach databases.
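The exposure check described above can be sketched as a search of breach databases for the user's identifiers. The in-memory databases here stand in for whatever breach-data service a real deployment would query; all data and names are illustrative.

```python
# Stand-in breach databases: breach name -> set of leaked identifiers.
BREACH_DATABASES = {
    "breach_2021": {"jane@example.com", "bob_h"},
    "breach_2022": {"alice@example.com"},
}

def exposure_check(email: str, username: str) -> list[str]:
    """Return the names of breaches containing either identifier."""
    identifiers = {email.lower(), username.lower()}
    return [name for name, entries in BREACH_DATABASES.items()
            if identifiers & entries]
```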
  • the method includes performing a security audit by assessing a strength of one or more registered personal passwords from the personal information and compliance with password requirements of the organization.
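A minimal sketch of such a security audit: a simple character-class strength heuristic checked against a hypothetical organization password policy. The policy values and the heuristic itself are assumptions, not details from the disclosure.

```python
import re

# Hypothetical organization password requirements.
ORG_POLICY = {"min_length": 12, "require_classes": 3}

def audit_password(password: str) -> dict:
    """Audit one registered personal password against the org policy."""
    classes = sum(bool(re.search(p, password))
                  for p in (r"[a-z]", r"[A-Z]", r"[0-9]", r"[^A-Za-z0-9]"))
    return {
        "length_ok": len(password) >= ORG_POLICY["min_length"],
        "classes_ok": classes >= ORG_POLICY["require_classes"],
    }
```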
  • the method includes adjusting the personal risk score of the user based on at least the user's registration of the personal information with the security awareness system or based on the security awareness system receiving the user's personal information.
  • the method includes determining by the security awareness system a risk score based at least on the personal risk score of the user.
  • the method includes performing by the security awareness system, based on at least the personal risk score of the user, one of a remedial training or a simulated phishing campaign directed to the user.
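The adjustment and action-selection steps above can be sketched as follows. The weights, threshold, and score scale are illustrative assumptions; the disclosure does not state how the personal risk score is computed or what triggers each action.

```python
def adjust_personal_risk_score(base: float, breaches: int,
                               weak_password: bool) -> float:
    """Raise the personal risk score for each breach hit and for a weak password."""
    score = base + 10.0 * breaches + (15.0 if weak_password else 0.0)
    return min(score, 100.0)  # cap at an assumed 0-100 scale

def choose_action(personal_risk_score: float) -> str:
    """Pick remedial training or a simulated phishing campaign (assumed threshold)."""
    return ("remedial_training" if personal_risk_score >= 50.0
            else "simulated_phishing_campaign")
```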
  • the security awareness system is configured to determine when a user has registered for a website in a personal domain using his or her organization login credentials based on monitoring a mailbox of the user. In an implementation, the security awareness system is configured to determine whether the user has registered for the website using current organization login credentials or previous organization login credentials. The security awareness system provides training to the user about safe use of the organization login credentials.
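One way the mailbox-monitoring detection above might look: flag account-confirmation messages from non-organization senders that were delivered to the organization address. The domain, subject keywords, and message shape are all hypothetical, not from the disclosure.

```python
import re

ORG_DOMAIN = "acme.example"  # hypothetical organization domain
SIGNUP_SUBJECTS = re.compile(r"(verify|confirm|welcome|activate)", re.I)

def detect_personal_registration(messages: list[dict]) -> list[str]:
    """Return sender domains of suspected personal-site registrations."""
    hits = []
    for msg in messages:
        sender_domain = msg["from"].rsplit("@", 1)[-1]
        if (sender_domain != ORG_DOMAIN
                and msg["to"].endswith("@" + ORG_DOMAIN)
                and SIGNUP_SUBJECTS.search(msg["subject"])):
            hits.append(sender_domain)
    return hits
```

A hit would then drive the credential-safety training mentioned above.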
  • FIG. 1 A is a block diagram depicting an embodiment of a network environment comprising client devices in communication with server devices, according to some embodiments;
  • FIG. 1 B is a block diagram depicting a cloud computing environment comprising client devices in communication with cloud service providers, according to some embodiments;
  • FIGS. 1 C and 1 D are block diagrams depicting embodiments of computing devices useful in connection with the methods and systems described herein, according to some embodiments;
  • FIG. 2 depicts an implementation of some of an architecture of a system for determining a personal risk score of a user of an organization based on personal information of the user, according to some embodiments;
  • FIG. 3 depicts a flowchart for detecting that the user has registered for a personal domain website using an organization email address, according to some embodiments;
  • FIG. 4 depicts a flowchart for using personal information for determining the personal risk score of the user of the organization, according to some embodiments; and
  • FIG. 5 depicts a flowchart for performing a remedial training or a simulated phishing campaign directed to the user based on the personal risk score of the user, according to some embodiments.
  • Section A describes a network environment and computing environment which may be useful for practicing embodiments described herein.
  • Section B describes embodiments of systems and methods for improving assessment of security risk that users pose to an organization based on their personal internet account data.
  • Referring to FIG. 1 A, an embodiment of a network environment is depicted.
  • the network environment includes one or more clients 102 a - 102 n (also generally referred to as local machines(s) 102 , client(s) 102 , client node(s) 102 , client machine(s) 102 , client computer(s) 102 , client device(s) 102 , endpoint(s) 102 , or endpoint node(s) 102 ) in communication with one or more servers 106 a - 106 n (also generally referred to as server(s) 106 , node(s) 106 , machine(s) 106 , or remote machine(s) 106 ) via one or more networks 104 .
  • client 102 has the capacity to function as both a client node seeking access to resources provided by a server and as a server providing access to hosted resources for other clients 102 a - 102 n.
  • although FIG. 1 A shows a network 104 between clients 102 and the servers 106 , clients 102 and servers 106 may be on the same network 104 .
  • network 104 ′ (not shown) may be a private network and a network 104 may be a public network.
  • network 104 may be a private network and a network 104 ′ may be a public network.
  • networks 104 and 104 ′ may both be private networks.
  • Wired links may include Digital Subscriber Line (DSL), coaxial cable lines, or optical fiber lines.
  • Wireless links may include Bluetooth®, Bluetooth Low Energy (BLE), ANT/ANT+, ZigBee, Z-Wave, Thread, Wi-Fi®, Worldwide Interoperability for Microwave Access (WiMAX®), mobile WiMAX®, WiMAX®-Advanced, NFC, SigFox, LoRa, Random Phase Multiple Access (RPMA), Weightless-N/P/W, an infrared channel, or a satellite band.
  • the wireless links may also include any cellular network standards to communicate among mobile devices, including standards that qualify as 1G, 2G, 3G, 4G, or 5G.
  • the network standards may qualify as one or more generations of mobile telecommunication standards by fulfilling a specification or standards such as the specifications maintained by the International Telecommunication Union.
  • the 3G standards may correspond to the International Mobile Telecommunications-2000 (IMT-2000) specification
  • the 4G standards may correspond to the International Mobile Telecommunication Advanced (IMT-Advanced) specification.
  • Examples of cellular network standards include AMPS, GSM, GPRS, UMTS, CDMA2000, CDMA-1xRTT, CDMA-EVDO, LTE, LTE-Advanced, LTE-M1, and Narrowband IoT (NB-IoT).
  • Wireless standards may use various channel access methods, e.g. FDMA, TDMA, CDMA, or SDMA.
  • different types of data may be transmitted via different links and standards.
  • the same types of data may be transmitted via different links and standards.
  • Network 104 may be any type and/or form of network.
  • the geographical scope of the network may vary widely and network 104 can be a body area network (BAN), a personal area network (PAN), a local-area network (LAN), e.g. Intranet, a metropolitan area network (MAN), a wide area network (WAN), or the Internet.
  • the topology of network 104 may be of any form and may include, e.g., any of the following: point-to-point, bus, star, ring, mesh, or tree.
  • Network 104 may be an overlay network which is virtual and sits on top of one or more layers of other networks 104 ′.
  • Network 104 may be of any such network topology as known to those ordinarily skilled in the art capable of supporting the operations described herein.
  • Network 104 may utilize different techniques and layers or stacks of protocols, including, e.g., the Ethernet protocol, the internet protocol suite (TCP/IP), the ATM (Asynchronous Transfer Mode) technique, the SONET (Synchronous Optical Networking) protocol, or the SDH (Synchronous Digital Hierarchy) protocol.
  • the TCP/IP internet protocol suite may include application layer, transport layer, internet layer (including, e.g., IPv4 and IPv6), or the link layer.
  • Network 104 may be a type of broadcast network, a telecommunications network, a data communication network, or a computer network.
  • the system may include multiple, logically grouped servers 106 .
  • the logical group of servers may be referred to as a server farm, a server cluster, or a machine farm.
  • servers 106 may be geographically dispersed.
  • a machine farm may be administered as a single entity.
  • the machine farm includes a plurality of machine farms.
  • Servers 106 within each machine farm can be heterogeneous—one or more of servers 106 or machines 106 can operate according to one type of operating system platform (e.g., Windows, manufactured by Microsoft Corp. of Redmond, Wash.), while one or more of the other servers 106 can operate according to another type of operating system platform (e.g., Unix, Linux, or Mac OSX).
  • servers 106 in the machine farm may be stored in high-density rack systems, along with associated storage systems, and located in an enterprise data center.
  • consolidating servers 106 in this way may improve system manageability, data security, the physical security of the system, and system performance by locating servers 106 and high-performance storage systems on localized high-performance networks.
  • Centralizing servers 106 and storage systems and coupling them with advanced system management tools allows more efficient use of server resources.
  • Servers 106 of each machine farm do not need to be physically proximate to another server 106 in the same machine farm.
  • the group of servers 106 logically grouped as a machine farm may be interconnected using a wide-area network (WAN) connection or a metropolitan-area network (MAN) connection.
  • a machine farm may include servers 106 physically located in different continents or different regions of a continent, country, state, city, campus, or room. Data transmission speeds between servers 106 in the machine farm can be increased if servers 106 are connected using a local-area network (LAN) connection or some form of direct connection.
  • a heterogeneous machine farm may include one or more servers 106 operating according to a type of operating system, while one or more other servers execute one or more types of hypervisors rather than operating systems.
  • hypervisors may be used to emulate virtual hardware, partition physical hardware, virtualize physical hardware, and execute virtual machines that provide access to computing environments, allowing multiple operating systems to run concurrently on a host computer.
  • Native hypervisors may run directly on the host computer.
  • Hypervisors may include VMware ESX/ESXi, manufactured by VMware, Inc., of Palo Alto, Calif.; the Xen hypervisor, an open source product whose development is overseen by Citrix Systems, Inc. of Fort Lauderdale, Fla.; the HYPER-V hypervisor provided by Microsoft; or others.
  • Hosted hypervisors may run within an operating system on a second software level. Examples of hosted hypervisors may include VMWare Workstation and VirtualBox, manufactured by Oracle Corporation of Redwood City, Calif. Additional layers of abstraction may include Container Virtualization and Management infrastructure. Container Virtualization isolates execution of a service to the container while relaying instructions to the machine through one operating system layer per host machine. Container infrastructure may include Docker, an open source product whose development is overseen by Docker, Inc. of San Francisco, Calif.
  • Management of the machine farm may be de-centralized.
  • one or more servers 106 may comprise components, subsystems, and modules to support one or more management services for the machine farm.
  • one or more servers 106 provide functionality for management of dynamic data, including techniques for handling failover, data replication, and increasing the robustness of the machine farm.
  • Each server 106 may communicate with a persistent store and, in some embodiments, with a dynamic store.
  • Server 106 may be a file server, application server, web server, proxy server, appliance, network appliance, gateway, gateway server, virtualization server, deployment server, SSL VPN server, or security system. In one embodiment, a plurality of servers 106 may be in the path between any two communicating servers 106 .
  • a cloud computing environment may provide client 102 with one or more resources provided by a network environment.
  • the cloud computing environment may include one or more clients 102 a - 102 n, in communication with cloud 108 over one or more networks 104 .
  • Clients 102 may include, e.g., thick clients, thin clients, and zero clients.
  • a thick client may provide at least some functionality even when disconnected from cloud 108 or servers 106 .
  • a thin client or zero client may depend on the connection to cloud 108 or server 106 to provide functionality.
  • a zero client may depend on cloud 108 or other networks 104 or servers 106 to retrieve operating system data for the client device 102 .
  • Cloud 108 may include back end platforms, e.g., servers 106 , storage, server farms or data centers.
  • Cloud 108 may be public, private, or hybrid.
  • Public clouds may include public servers 106 that are maintained by third parties to clients 102 or the owners of the clients. Servers 106 may be located off-site in remote geographical locations as disclosed above or otherwise.
  • Public clouds may be connected to servers 106 over a public network.
  • Private clouds may include private servers 106 that are physically maintained by clients 102 or owners of clients.
  • Private clouds may be connected to servers 106 over a private network 104 .
  • Hybrid clouds 109 may include both the private and public networks 104 and servers 106 .
  • Cloud 108 may also include a cloud-based delivery, e.g. Software as a Service (SaaS) 110 , Platform as a Service (PaaS) 112 , and Infrastructure as a Service (IaaS) 114 .
  • IaaS may refer to a user renting the use of infrastructure resources that are needed during a specified time period.
  • IaaS providers may offer storage, networking, servers, or virtualization resources from large pools, allowing users to quickly scale up by accessing more resources as needed. Examples of IaaS include Amazon Web Services (AWS) provided by Amazon, Inc. of Seattle, Wash., Rackspace Cloud provided by Rackspace Inc. of San Antonio, Texas, and Google Compute Engine provided by Google Inc.
  • PaaS providers may offer functionality provided by IaaS, including, e.g., storage, networking, servers, virtualization, or containerization, as well as additional resources, e.g., the operating system, middleware, or runtime resources. Examples of PaaS include Windows Azure provided by Microsoft Corporation of Redmond, Wash., Google App Engine provided by Google Inc., and Heroku provided by Heroku, Inc. of San Francisco, Calif. SaaS providers may offer the resources that PaaS provides, including storage, networking, servers, virtualization, operating system, middleware, or runtime resources. In some embodiments, SaaS providers may offer additional resources including, e.g., data and application resources.
  • SaaS examples include Google Apps provided by Google Inc., Salesforce provided by Salesforce.com Inc. of San Francisco, Calif., or Office365 provided by Microsoft Corporation. Examples of SaaS may also include storage providers, e.g. Dropbox provided by Dropbox Inc. of San Francisco, Calif., Microsoft OneDrive provided by Microsoft Corporation, Google Drive provided by Google Inc., or Apple iCloud provided by Apple Inc. of Cupertino, Calif.
  • Clients 102 may access IaaS resources with one or more IaaS standards, including, e.g., Amazon Elastic Compute Cloud (EC2), Open Cloud Computing Interface (OCCI), Cloud Infrastructure Management Interface (CIMI), or OpenStack standards.
  • IaaS standards may allow clients access to resources over a Hypertext Transfer Protocol (HTTP) and may use Representational State Transfer (REST) protocol or Simple Object Access Protocol (SOAP).
  • Clients 102 may access PaaS resources with different PaaS interfaces.
  • Some PaaS interfaces use HTTP packages, standard Java APIs, JavaMail API, Java Data Objects (JDO), Java Persistence API (JPA), Python APIs, web integration APIs for different programming languages including, e.g., Rack for Ruby, WSGI for Python, or PSGI for Perl, or other APIs that may be built on REST, HTTP, XML, or other protocols.
  • Clients 102 may access SaaS resources using web-based user interfaces, provided by a web browser (e.g. Google Chrome, Microsoft Internet Explorer, or Mozilla Firefox provided by Mozilla Foundation of Mountain View, Calif.).
  • Clients 102 may also access SaaS resources through smartphone or tablet applications, including e.g., Salesforce Sales Cloud, or Google Drive App.
  • Clients 102 may also access SaaS resources through the client operating system, including e.g. Windows file system for Dropbox.
  • access to IaaS, PaaS, or SaaS resources may be authenticated.
  • a server or authentication server may authenticate a user via security certificates, HTTPS, or API keys.
  • API keys may include various encryption standards such as, e.g., Advanced Encryption Standard (AES).
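As one common pattern for API-key authentication (an illustration only; not something any particular cloud provider or this disclosure requires), a client can prove possession of its secret key by HMAC-signing each request rather than transmitting the key itself:

```python
import hashlib
import hmac

def sign_request(secret_key: bytes, method: str, path: str, body: str) -> str:
    """Compute an HMAC-SHA256 signature over the canonical request."""
    message = f"{method}\n{path}\n{body}".encode()
    return hmac.new(secret_key, message, hashlib.sha256).hexdigest()

def verify_request(secret_key: bytes, method: str, path: str,
                   body: str, signature: str) -> bool:
    """Server-side check: recompute and compare in constant time."""
    return hmac.compare_digest(
        sign_request(secret_key, method, path, body), signature)
```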
  • Data resources may be sent over Transport Layer Security (TLS) or Secure Sockets Layer (SSL).
  • Client 102 and server 106 may be deployed as and/or executed on any type and form of computing device, e.g., a computer, network device or appliance capable of communicating on any type and form of network and performing the operations described herein.
  • FIGS. 1 C and 1 D depict block diagrams of a computing device 100 useful for practicing an embodiment of client 102 or server 106 .
  • each computing device 100 includes central processing unit 121 , and main memory unit 122 .
  • computing device 100 may include storage device 128 , installation device 116 , network interface 118 , and I/O controller 123 , display devices 124 a - 124 n, keyboard 126 and pointing device 127 , e.g., a mouse.
  • Storage device 128 may include, without limitation, operating system 129 , software 131 , and software of security awareness system 120 . As shown in FIG.
  • each computing device 100 may also include additional optional elements, e.g., a memory port 103 , bridge 170 , one or more input/output devices 130 a - 130 n (generally referred to using reference numeral 130 ), and cache memory 140 in communication with central processing unit 121 .
  • Central processing unit 121 is any logic circuitry that responds to and processes instructions fetched from main memory unit 122 .
  • central processing unit 121 is provided by a microprocessor unit, e.g.: those manufactured by Intel Corporation of Mountain View, Calif.; those manufactured by Motorola Corporation of Schaumburg, Illinois; the ARM processor and TEGRA system on a chip (SoC) manufactured by Nvidia of Santa Clara, Calif.; the POWER7 processor, those manufactured by International Business Machines of White Plains, N.Y.; or those manufactured by Advanced Micro Devices of Sunnyvale, Calif.
  • Computing device 100 may be based on any of these processors, or any other processor capable of operating as described herein.
  • Central processing unit 121 may utilize instruction level parallelism, thread level parallelism, different levels of cache, and multi-core processors.
  • a multi-core processor may include two or more processing units on a single computing component. Examples of multi-core processors include the AMD PHENOM II X2, INTEL CORE i5 and INTEL CORE i7.
  • Main memory unit 122 may include one or more memory chips capable of storing data and allowing any storage location to be directly accessed by microprocessor 121 .
  • Main memory unit 122 may be volatile and faster than storage 128 memory.
  • Main memory units 122 may be Dynamic Random-Access Memory (DRAM) or any variants, including static Random-Access Memory (SRAM), Burst SRAM or SynchBurst SRAM (BSRAM), Fast Page Mode DRAM (FPM DRAM), Enhanced DRAM (EDRAM), Extended Data Output RAM (EDO RAM), Extended Data Output DRAM (EDO DRAM), Burst Extended Data Output DRAM (BEDO DRAM), Single Data Rate Synchronous DRAM (SDR SDRAM), Double Data Rate SDRAM (DDR SDRAM), Direct Rambus DRAM (DRDRAM), or Extreme Data Rate DRAM (XDR DRAM).
  • main memory 122 or storage 128 may be non-volatile; e.g., non-volatile read access memory (NVRAM), flash memory, non-volatile static RAM (nvSRAM), Ferroelectric RAM (FeRAM), Magnetoresistive RAM (MRAM), Phase-change memory (PRAM), conductive-bridging RAM (CBRAM), Silicon-Oxide-Nitride-Oxide-Silicon (SONOS), Resistive RAM (RRAM), Racetrack, Nano-RAM (NRAM), or Millipede memory.
  • FIG. 1 D depicts an embodiment of computing device 100 in which the processor communicates directly with main memory 122 via memory port 103 .
  • main memory 122 may be DRDRAM.
  • FIG. 1 D depicts an embodiment in which the main processor 121 communicates directly with cache memory 140 via a secondary bus, sometimes referred to as a backside bus.
  • main processor 121 communicates with cache memory 140 using system bus 150 .
  • Cache memory 140 typically has a faster response time than main memory 122 and is typically provided by SRAM, BSRAM, or EDRAM.
  • the processor 121 communicates with various I/O devices 130 via local system bus 150 .
  • Various buses may be used to connect central processing unit 121 to any of I/O devices 130 , including a PCI bus, a PCI-X bus, or a PCI-Express bus, or a NuBus.
  • FIG. 1 D depicts an embodiment of computer 100 in which main processor 121 communicates directly with I/O device 130 b or other processors 121 ′ via HYPERTRANSPORT, RAPIDIO, or INFINIBAND communications technology.
  • FIG. 1 D also depicts an embodiment in which local busses and direct communication are mixed: the processor 121 communicates with I/O device 130 a using a local interconnect bus while communicating with I/O device 130 b directly.
  • I/O devices 130 a - 130 n may be present in computing device 100 .
  • Input devices may include keyboards, mice, trackpads, trackballs, touchpads, touch mice, multi-touch touchpads and touch mice, microphones, multi-array microphones, drawing tablets, cameras, single-lens reflex cameras (SLR), digital SLR (DSLR), CMOS sensors, accelerometers, infrared optical sensors, pressure sensors, magnetometer sensors, angular rate sensors, depth sensors, proximity sensors, ambient light sensors, gyroscopic sensors, or other sensors.
  • Output devices may include video displays, graphical displays, speakers, headphones, inkjet printers, laser printers, and 3D printers.
  • Devices 130 a - 130 n may include a combination of multiple input or output devices, including, e.g., Microsoft KINECT, Nintendo Wiimote for the WII, Nintendo WII U GAMEPAD, or Apple iPhone. Some devices 130 a - 130 n allow gesture recognition inputs through combining some of the inputs and outputs. Some devices 130 a - 130 n provide for facial recognition which may be utilized as an input for different purposes including authentication and other commands. Some devices 130 a - 130 n provide for voice recognition and inputs, including, e.g., Microsoft KINECT, SIRI for iPhone by Apple, Google Now or Google Voice Search, and Alexa by Amazon.
  • Additional devices 130 a - 130 n have both input and output capabilities, including, e.g., haptic feedback devices, touchscreen displays, or multi-touch displays.
  • Touchscreen, multi-touch displays, touchpads, touch mice, or other touch sensing devices may use different technologies to sense touch, including, e.g., capacitive, surface capacitive, projected capacitive touch (PCT), in cell capacitive, resistive, infrared, waveguide, dispersive signal touch (DST), in-cell optical, surface acoustic wave (SAW), bending wave touch (BWT), or force-based sensing technologies.
  • Some multi-touch devices may allow two or more contact points with the surface, allowing advanced functionality including, e.g., pinch, spread, rotate, scroll, or other gestures.
  • Some touchscreen devices including, e.g., Microsoft PIXELSENSE or Multi-Touch Collaboration Wall, may have larger surfaces, such as on a table-top or on a wall, and may also interact with other electronic devices.
  • Some I/O devices 130 a - 130 n, display devices 124 a - 124 n or group of devices may be augmented reality devices. The I/O devices may be controlled by I/O controller 123 as shown in FIG. 1 C .
  • the I/O controller may control one or more I/O devices, such as, e.g., keyboard 126 and pointing device 127 , e.g., a mouse or optical pen. Furthermore, an I/O device may also provide storage and/or installation medium 116 for computing device 100 . In still other embodiments, computing device 100 may provide USB connections (not shown) to receive handheld USB storage devices. In further embodiments, an I/O device 130 may be a bridge between the system bus 150 and an external communication bus, e.g. a USB bus, a SCSI bus, a FireWire bus, an Ethernet bus, a Gigabit Ethernet bus, a Fiber Channel bus, or a Thunderbolt bus.
  • Display devices 124 a - 124 n may be connected to I/O controller 123 .
  • Display devices may include, e.g., liquid crystal displays (LCD), thin film transistor LCD (TFT-LCD), blue phase LCD, electronic paper (e-ink) displays, flexible displays, light emitting diode (LED) displays, digital light processing (DLP) displays, liquid crystal on silicon (LCOS) displays, organic light-emitting diode (OLED) displays, active-matrix organic light-emitting diode (AMOLED) displays, liquid crystal laser displays, time-multiplexed optical shutter (TMOS) displays, or 3D displays.
  • Display devices 124 a - 124 n may also be a head-mounted display (HMD). In some embodiments, display devices 124 a - 124 n or the corresponding I/O controllers 123 may be controlled through or have hardware support for OPENGL or DIRECTX API or other graphics libraries.
  • computing device 100 may include or connect to multiple display devices 124 a - 124 n, which each may be of the same or different type and/or form.
  • any of I/O devices 130 a - 130 n and/or the I/O controller 123 may include any type and/or form of suitable hardware, software, or combination of hardware and software to support, enable or provide for the connection and use of multiple display devices 124 a - 124 n by computing device 100 .
  • computing device 100 may include any type and/or form of video adapter, video card, driver, and/or library to interface, communicate, connect, or otherwise use display devices 124 a - 124 n.
  • a video adapter may include multiple connectors to interface to multiple display devices 124 a - 124 n.
  • computing device 100 may include multiple video adapters, with each video adapter connected to one or more of display devices 124 a - 124 n.
  • any portion of the operating system of computing device 100 may be configured for using multiple displays 124 a - 124 n.
  • one or more of the display devices 124 a - 124 n may be provided by one or more other computing devices 100 a or 100 b connected to computing device 100 , via network 104 .
  • software may be designed and constructed to use another computer's display device as second display device 124 a for computing device 100 .
  • an Apple iPad may connect to computing device 100 and use the display of the device 100 as an additional display screen that may be used as an extended desktop.
  • computing device 100 may be configured to have multiple display devices 124 a - 124 n.
  • computing device 100 may comprise storage device 128 (e.g. one or more hard disk drives or redundant arrays of independent disks) for storing an operating system or other related software, and for storing application software programs such as any program related to security awareness system 120 .
  • Examples of storage device 128 include a hard disk drive (HDD); an optical drive including a CD drive, DVD drive, or BLU-RAY drive; a solid-state drive (SSD); a USB flash drive; or any other device suitable for storing data.
  • Some storage devices may include multiple volatile and non-volatile memories, including, e.g., solid state hybrid drives that combine hard disks with solid state cache.
  • Some storage devices 128 may be non-volatile, mutable, or read-only.
  • Some storage devices 128 may be internal and connect to computing device 100 via bus 150 . Some storage devices 128 may be external and connect to computing device 100 via an I/O device 130 that provides an external bus. Some storage devices 128 may connect to computing device 100 via network interface 118 over network 104 , including, e.g., the Remote Disk for MACBOOK AIR by Apple. Some client devices 100 may not require a non-volatile storage device 128 and may be thin clients or zero clients 102 . Some storage devices 128 may also be used as an installation device 116 and may be suitable for installing software and programs. Additionally, the operating system and the software can be run from a bootable medium, for example, a bootable CD, e.g. KNOPPIX, a bootable CD for GNU/Linux that is available as a GNU/Linux distribution from knoppix.net.
  • Computing device 100 may also install software or application from an application distribution platform.
  • application distribution platforms include the App Store for iOS provided by Apple, Inc., the Mac App Store provided by Apple, Inc., GOOGLE PLAY for Android OS provided by Google Inc., Chrome Webstore for CHROME OS provided by Google Inc., and Amazon Appstore for Android OS and KINDLE FIRE provided by Amazon.com, Inc.
  • An application distribution platform may facilitate installation of software on client device 102 .
  • An application distribution platform may include a repository of applications on server 106 or cloud 108 , which clients 102 a - 102 n may access over a network 104 .
  • An application distribution platform may include application developed and provided by various developers. A user of client device 102 may select, purchase and/or download an application via the application distribution platform.
  • computing device 100 may include a network interface 118 to interface to network 104 through a variety of connections including, but not limited to, standard telephone lines, LAN or WAN links (e.g., 802.11, T1, T3, Gigabit Ethernet, InfiniBand), broadband connections (e.g., ISDN, Frame Relay, ATM, Gigabit Ethernet, Ethernet-over-SONET, ADSL, VDSL, BPON, GPON, fiber optical including FiOS), wireless connections, or some combination of any or all of the above.
  • Connections can be established using a variety of communication protocols (e.g., TCP/IP, Ethernet, ARCNET, SONET, SDH, Fiber Distributed Data Interface (FDDI), IEEE 802.11a/b/g/n/ac, CDMA, GSM, WiMAX, and direct asynchronous connections).
  • computing device 100 communicates with other computing devices 100 ′ via any type and/or form of gateway or tunneling protocol, e.g. Secure Sockets Layer (SSL) or Transport Layer Security (TLS), or the Citrix Gateway Protocol manufactured by Citrix Systems, Inc.
  • Network interface 118 may comprise a built-in network adapter, network interface card, PCMCIA network card, EXPRESSCARD network card, card bus network adapter, wireless network adapter, USB network adapter, modem or any other device suitable for interfacing computing device 100 to any type of network capable of communication and performing the operations described herein.
  • Computing device 100 of the sort depicted in FIGS. 1 B and 1 C may operate under the control of an operating system, which controls scheduling of tasks and access to system resources.
  • Computing device 100 can be running any operating system such as any of the versions of the MICROSOFT WINDOWS operating systems, the different releases of the Unix and Linux operating systems, any version of the MAC OS for Macintosh computers, any embedded operating system, any real-time operating system, any open source operating system, any proprietary operating system, any operating systems for mobile computing devices, or any other operating system capable of running on the computing device and performing the operations described herein.
  • Typical operating systems include, but are not limited to: WINDOWS 2000, WINDOWS Server 2012, WINDOWS CE, WINDOWS Phone, WINDOWS XP, WINDOWS VISTA, WINDOWS 7, WINDOWS RT, WINDOWS 8 and WINDOWS 10, all of which are manufactured by Microsoft Corporation of Redmond, Wash.; MAC OS and iOS, manufactured by Apple, Inc.; and Linux, a freely-available operating system, e.g. Linux Mint distribution (“distro”) or Ubuntu, distributed by Canonical Ltd. of London, United Kingdom; or Unix or other Unix-like derivative operating systems; and Android, designed by Google Inc., among others.
  • Some operating systems including, e.g., the CHROME OS by Google Inc., may be used on zero clients or thin clients, including, e.g., CHROMEBOOKS.
  • Computer system 100 can be any workstation, telephone, desktop computer, laptop or notebook computer, netbook, ULTRABOOK, tablet, server, handheld computer, mobile telephone, smartphone or other portable telecommunications device, media playing device, a gaming system, mobile computing device, or any other type and/or form of computing, telecommunications or media device that is capable of communication.
  • Computer system 100 has sufficient processor power and memory capacity to perform the operations described herein.
  • computing device 100 may have different processors, operating systems, and input devices consistent with the device.
  • the Samsung GALAXY smartphones, e.g., operate under the control of the Android operating system developed by Google, Inc. GALAXY smartphones receive input via a touch interface.
  • computing device 100 is a gaming system.
  • the computer system 100 may comprise a PLAYSTATION 3, PLAYSTATION PORTABLE (PSP), PLAYSTATION VITA, PLAYSTATION 4, or a PLAYSTATION 4 PRO device manufactured by the Sony Corporation of Tokyo, Japan; a NINTENDO DS, NINTENDO 3DS, NINTENDO WII, NINTENDO WII U, or NINTENDO SWITCH device manufactured by Nintendo Co., Ltd., of Kyoto, Japan; or an XBOX 360 device manufactured by Microsoft Corporation.
  • computing device 100 is a digital audio player such as the Apple IPOD, IPOD Touch, and IPOD NANO lines of devices, manufactured by Apple Computer of Cupertino, Calif.
  • Some digital audio players may have other functionality, including, e.g., a gaming system or any functionality made available by an application from a digital application distribution platform.
  • the IPOD Touch may access the Apple App Store.
  • computing device 100 is a portable media player or digital audio player supporting file formats including, but not limited to, MP3, WAV, M4A/AAC, WMA Protected AAC, AIFF, Audible audiobook, Apple Lossless audio file formats and .mov, .m4v, and .mp4 MPEG-4 (H.264/MPEG-4 AVC) video file formats.
  • computing device 100 is a tablet, e.g. the IPAD line of devices by Apple; the GALAXY TAB family of devices by Samsung; or the KINDLE FIRE, by Amazon.com, Inc. of Seattle, Wash.
  • computing device 100 is an eBook reader, e.g. the KINDLE family of devices by Amazon.com, or NOOK family of devices by Barnes & Noble, Inc. of New York City, N.Y.
  • communications device 102 includes a combination of devices, e.g. a smartphone combined with a digital audio player or portable media player.
  • a smartphone e.g. the iPhone family of smartphones manufactured by Apple, Inc.; a Samsung GALAXY family of smartphones manufactured by Samsung, Inc; or a Motorola DROID family of smartphones.
  • communications device 102 is a laptop or desktop computer equipped with a web browser and a microphone and speaker system, e.g. a telephony headset.
  • communications devices 102 are web-enabled and can receive and initiate phone calls.
  • a laptop or desktop computer is also equipped with a webcam or other video capture device that enables video chat and video call.
  • the status of one or more machines 102 , 106 in network 104 is monitored, generally as part of network management.
  • the status of a machine may include an identification of load information (e.g., the number of processes on the machine, CPU, and memory utilization), of port information (e.g., the number of available communication ports and the port addresses), or of session status (e.g., the duration and type of processes, and whether a process is active or idle).
  • the information may be identified by a plurality of metrics, and the plurality of metrics can be applied at least in part towards decisions in load distribution, network traffic management, and network failure recovery as well as any aspects of operations of the present solution described herein.
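  • As a sketch of applying such metrics to load distribution, the following example picks a target machine from monitored status data; the machine names, metric fields, and the least-loaded heuristic are illustrative assumptions, not part of the present solution.

```python
# Hypothetical sketch: using monitored load metrics (CPU and memory
# utilization) to choose a target machine for load distribution.

def pick_least_loaded(machines):
    """Return the machine with the lowest combined CPU/memory load."""
    return min(machines, key=lambda m: m["cpu"] + m["memory"])

# Illustrative monitored status for machines 102, 106 on network 104.
machines = [
    {"name": "srv-a", "cpu": 0.72, "memory": 0.65, "open_ports": 12},
    {"name": "srv-b", "cpu": 0.31, "memory": 0.40, "open_ports": 30},
    {"name": "srv-c", "cpu": 0.55, "memory": 0.20, "open_ports": 8},
]

target = pick_least_loaded(machines)
```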
  • the following describes systems and methods for improving assessment of security risks that users pose to an organization based on their personal internet account data.
  • An organization does not usually have access to information related to a user's personal domain, such as the user's personal email address. Also, the organization may not have any means to detect the security awareness behavior of users in their personal domain. As a result, the full security risk that users pose to the organization may not be known to it. Furthermore, even if the organization has access to personal information of the users, it would most likely be available only within a particular department, for example, a Human Resources (HR) department, and not available to any other department within the organization. In an example, the personal information of the users may not be available to a system administrator or an Information Technology (IT) department of the organization. The system administrator may be a professional (or a team of professionals) managing organizational cybersecurity aspects. The system administrator may oversee and manage IT systems of the organization.
  • a significant number of organization email breaches occur when users use their organization email address for registrations on websites in their personal domain.
  • users may use their organization email address to register to entertainment and media websites, for example, for gaming, travel, booking restaurants, and other personal purposes.
  • Such usage exposes the users' organization email address to risk, and it may be subjected to attacks or may become compromised.
  • the user may unintentionally expose the organization login credentials (e.g., one or more of an email address, a username, or a password) to a risk of hijacking that may provide an attacker with access to organizational data.
  • FIG. 2 depicts an implementation of an architecture of system 200 for determining a personal risk score of a user of an organization based on personal information of the user, according to some embodiments.
  • System 200 may include security awareness system 202 , user device 204 , email server 206 , one or more breach databases 208 1-N , and network 210 enabling communication between the system components for information exchange.
  • Network 210 may be an example or instance of network 104 , details of which are provided with reference to FIG. 1 A and its accompanying description.
  • security awareness system 202 may be implemented in a variety of computing systems, such as a mainframe computer, a server, a network server, a laptop computer, a desktop computer, a notebook, a workstation, and any other computing system.
  • security awareness system 202 may be implemented in a server, such as server 106 shown in FIG. 1 A .
  • security awareness system 202 may be implemented by a device, such as computing device 100 shown in FIGS. 1 C and 1 D .
  • security awareness system 202 may be implemented across a server cluster, such that tasks performed by security awareness system 202 may be distributed across the plurality of servers. These tasks may be allocated among the server cluster by an application, a service, a daemon, a routine, or other executable logic for task allocation.
  • security awareness system 202 may facilitate cybersecurity awareness training, for example, via simulated phishing campaigns, computer-based trainings, remedial trainings, and risk score generation and tracking.
  • a simulated phishing campaign is a technique of testing a user to see whether the user is likely to recognize a true malicious phishing attack and act appropriately upon receiving the malicious phishing attack.
  • the user may be an employee of the organization, a customer, a vendor, or anyone associated with the organization.
  • the user may be an end-customer/consumer or a patron using the goods and/or services of the organization.
  • security awareness system 202 may execute the simulated phishing campaign by sending out one or more simulated phishing messages periodically or occasionally to the users and observe responses of the users to such simulated phishing messages.
  • a simulated phishing message may mimic a real phishing message and appear genuine to entice a user to respond/interact with the simulated phishing message.
  • the simulated phishing message may include links, attachments, macros, or any other simulated phishing threat that resembles a real phishing threat.
  • the user may be provided with security awareness training.
  • If and how the user interacts with the simulated phishing message may be logged and may impact a risk score of the user, a risk score of a team of which the user is a part, a risk score of the user's organization, and/or a risk score of an industry to which the user's organization belongs.
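  • The logging and risk-score impact described above can be sketched as follows. This is an illustrative example only: the event names, weights, and 0-100 clamping are assumptions for the sketch, not values specified by the present solution.

```python
# Hypothetical sketch: adjusting a user's risk score from logged
# interactions with a simulated phishing message. Event weights and the
# 0-100 score range are illustrative assumptions.

EVENT_WEIGHTS = {
    "clicked_link": 10.0,
    "opened_attachment": 15.0,
    "replied": 8.0,
    "reported": -5.0,   # reporting the simulated phish lowers risk
    "ignored": 0.0,
}

def update_risk_score(score, events):
    """Apply logged phishing-test events to a 0-100 risk score."""
    for event in events:
        score += EVENT_WEIGHTS.get(event, 0.0)
    # Clamp so the score stays within the assumed 0-100 range.
    return max(0.0, min(100.0, score))
```

In this sketch, a user at score 50.0 who clicks a simulated phishing link but then reports it would move to 55.0, reflecting a net increase in assessed risk.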
  • security awareness system 202 may be owned or managed or otherwise associated with an organization or any entity authorized thereof.
  • security awareness system 202 may be managed by a system administrator.
  • the system administrator may oversee and manage security awareness system 202 to ensure cybersecurity goals of the organization are met.
  • the system administrator may oversee Information Technology (IT) systems of the organization for managing simulated phishing campaigns and any other element within security awareness system 202 .
  • security awareness system 202 may be a Computer Based Security Awareness Training (CBSAT) system that performs security services such as performing simulated phishing campaigns on a user or a set of users of an organization as a part of security awareness training.
  • user device 204 may be any device used by the user.
  • User device 204 as disclosed may be any computing device, such as a desktop computer, a laptop, a tablet computer, a mobile device, a Personal Digital Assistant (PDA) or any other computing device.
  • user device 204 may be a device, such as client device 102 shown in FIG. 1 A and FIG. 1 B .
  • User device 204 may be implemented by a device, such as computing device 100 shown in FIG. 1 C and FIG. 1 D .
  • email server 206 may be any server capable of handling and delivering emails over network 210 using one or more standard email protocols, such as Post Office Protocol 3 (POP3), Internet Message Access Protocol (IMAP), Simple Mail Transfer Protocol (SMTP), and Multipurpose Internet Mail Extension (MIME) Protocol.
  • Email server 206 may be a standalone server or a part of an organization server.
  • Email server 206 may be implemented using, for example, Microsoft® Exchange Server, or HCL Domino®.
  • email server 206 may be server 106 shown in FIG. 1 A .
  • Email server 206 may be implemented by a device, such as computing device 100 shown in FIGS. 1 C and 1 D .
  • email server 206 may be implemented as a part of a server cluster.
  • email server 206 may be implemented across a plurality of servers, such that tasks performed by email server 206 may be distributed across the plurality of servers. These tasks may be allocated among the server cluster by an application, a service, a daemon, a routine, or other executable logic for task allocation.
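  • As an illustration of the message handling described above, the following sketch composes an email with Python's standard library that a server such as email server 206 could deliver over SMTP. The addresses and server name are placeholders, and the actual SMTP delivery is shown commented out so the sketch stays self-contained.

```python
# Minimal sketch of composing a message for delivery by an email server.
# All addresses and hostnames are illustrative placeholders.

from email.message import EmailMessage
# import smtplib  # would be used for actual delivery over SMTP

msg = EmailMessage()
msg["From"] = "training@example-org.com"
msg["To"] = "user@example-org.com"
msg["Subject"] = "Security awareness notification"
msg.set_content("This is a simulated phishing training notification.")

# Hypothetical delivery step (server name is a placeholder):
# with smtplib.SMTP("mail.example-org.com", 587) as smtp:
#     smtp.starttls()
#     smtp.send_message(msg)
```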
  • one or more breach databases 208 1-N may be dynamic databases that include public databases and/or private databases.
  • One or more breach databases 208 1-N may include information related to user login credentials of websites which have been breached. Examples of user login credentials may include a username, an email address and/or a password.
  • a username is a unique combination of characters, such as letters of the alphabet and/or numbers and/or non-alphanumeric symbols, that identify a specific user. The user may gain access to a website using the user login credentials.
  • security awareness system 202 may determine whether user login credentials including a username and/or an email address is/are associated with a data breach if the username and/or the email address is found in one or more breach databases 208 1-N .
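  • A minimal sketch of such a breach-database check follows, assuming each of one or more breach databases can be modeled as a set of breached identifiers; real breach databases 208 1-N would be queried through their own interfaces, and the identifiers shown are illustrative.

```python
# Hypothetical sketch: determine whether a username or email address is
# associated with a data breach by looking it up in one or more breach
# databases, each modeled here as a set of breached identifiers.

def is_breached(identifier, breach_databases):
    """Return True if the identifier appears in any breach database."""
    ident = identifier.strip().lower()  # normalize before lookup
    return any(ident in db for db in breach_databases)

# Illustrative stand-ins for breach databases 208 1-N.
breach_databases = [
    {"alice@example.com", "bob@example.com"},
    {"charlie", "dana@example.org"},
]
```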
  • information related to the user login credentials of the users stored in one or more breach databases 208 1-N may be periodically or dynamically updated as required.
  • security awareness system 202 may include processor 212 and memory 214 .
  • processor 212 and memory 214 of security awareness system 202 may be CPU 121 and main memory 122 , respectively, as shown in FIGS. 1 C and 1 D .
  • security awareness system 202 may include verification unit 216 , exposure check unit 218 , security audit unit 220 , risk score calculator 222 , remediation unit 224 , and detection unit 226 .
  • verification unit 216 , exposure check unit 218 , security audit unit 220 , risk score calculator 222 , remediation unit 224 , and detection unit 226 may be coupled to processor 212 and memory 214 .
  • verification unit 216 , exposure check unit 218 , security audit unit 220 , risk score calculator 222 , remediation unit 224 , and detection unit 226 amongst other units may include routines, programs, objects, components, data structures, etc., which may perform particular tasks or implement particular abstract data types.
  • Verification unit 216 , exposure check unit 218 , security audit unit 220 , risk score calculator 222 , remediation unit 224 , and detection unit 226 may also be implemented as signal processor(s), state machine(s), logic circuitries, and/or any other device or component that manipulates signals based on operational instructions.
  • verification unit 216 may be implemented in hardware, instructions executed by a processing unit, or by a combination thereof.
  • the processing unit may comprise a computer, a processor, a state machine, a logic array, or any other suitable devices capable of processing instructions.
  • the processing unit may be a general-purpose processor that executes instructions to cause the general-purpose processor to perform the required tasks or, the processing unit may be dedicated to performing the required functions.
  • verification unit 216 , exposure check unit 218 , security audit unit 220 , risk score calculator 222 , remediation unit 224 , and detection unit 226 may be machine-readable instructions which, when executed by a processor/processing unit, perform any of the desired functionalities.
  • the machine-readable instructions may be stored on an electronic memory device, hard disk, optical disk, or other machine-readable storage medium or non-transitory medium.
  • the machine-readable instructions may also be downloaded to the storage medium via a network connection.
  • the machine-readable instructions may be stored in memory 214 .
  • security awareness system 202 may include password storage 228 , user personal information storage 230 , risk score storage 232 , and website information storage 234 .
  • password storage 228 may include information about organization passwords of the users.
  • User personal information storage 230 may store information related to personal accounts of the users. In an example, information related to personal accounts of the users may include personal email addresses, usernames, passwords, and/or other information from the users' personal domain. According to an implementation, the users may voluntarily provide the information related to their personal accounts.
  • user personal information storage 230 may also store information about past or previous organizations of the user, such as previous organization email addresses.
  • risk score storage 232 may include security awareness profiles of the users and risk scores of the users (in some examples, the security awareness profile of a user may include a risk score of the user).
  • a security awareness profile of a user may include information about the security awareness of the user and other information which is relevant for assessing the security awareness of the user.
  • a risk score of a user may include a representation of the susceptibility of the user to a malicious attack.
  • the risk score for a user may quantify a cybersecurity risk that the user poses to an organization. The risk score may also quantify the level of risk for a group of users, the organization, an industry to which the organization belongs, a geography, and any other categorization.
  • the risk score of the user may be modified based on the user's responses to simulated phishing messages, assessed user behavior, breached user information, completion of training by the user, a current position of the user in the organization, a size of a network of the user, an amount of time the user has held the current position in the organization, and/or any other attribute that can be associated with the user.
  • a higher risk score of the user indicates that a higher security risk is associated with the user and a lower risk score indicates a lower security risk and better security awareness.
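  • The per-user, group, and organization scoring described above can be sketched as follows. The use of a simple mean as the aggregation function, and the team names and scores, are assumptions for illustration; the present solution does not prescribe a particular aggregation.

```python
# Illustrative sketch: rolling per-user risk scores up into group and
# organization risk scores. The mean is an assumed aggregation function.

from statistics import mean

def group_risk_score(user_scores):
    """Aggregate a group's user risk scores (here, by averaging)."""
    return mean(user_scores)

# Hypothetical per-team user risk scores.
org = {
    "engineering": [20.0, 35.0, 50.0],
    "sales": [60.0, 70.0],
}

team_scores = {team: group_risk_score(s) for team, s in org.items()}
org_score = group_risk_score(
    [s for scores in org.values() for s in scores]
)
```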
  • website information storage 234 may store information related to personal domain websites.
  • website information storage 234 may store information about login pages for known or popular personal domain websites, email addresses associated with messages sent by those websites, or examples of registration email validation messages and promotional messages used by personal domain websites.
  • a registration email validation message is a message that is sent by a website (for example, a personal domain website) to an email address that a user input when registering with the website.
  • the registration email validation message may include a link for the user to click to validate that he or she is the owner of the email address and may include keywords such as “verify your email address,” “confirm your email,” and “activate your account.”
  • a promotional message may be an email message such as a weekly newsletter, a sale promotion email, and other promotional messages that a business distributes to promote their products, services, offers, campaigns, etc.
  • the promotional messages from various personal domain websites may have specific characteristics in common.
  • the promotional messages may include an unsubscribe link (for example, “Click here to unsubscribe from these emails”), discount details (for example, “up to 40% off”) and other such characteristics.
  • promotional messages may include keywords such as “discount,” “unsubscribe,” “sale,” “offer,” and “hurry.”
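  • The keyword characteristics described above suggest a simple message classifier; the sketch below is illustrative only, and the keyword lists (drawn from the examples in the preceding paragraphs) are not exhaustive.

```python
# Hypothetical sketch: classify a message body as a registration email
# validation message or a promotional message using keyword matching.

VALIDATION_KEYWORDS = [
    "verify your email address",
    "confirm your email",
    "activate your account",
]
PROMO_KEYWORDS = ["discount", "unsubscribe", "sale", "offer", "hurry"]

def classify_message(body):
    """Return a coarse message type based on keyword characteristics."""
    text = body.lower()
    if any(kw in text for kw in VALIDATION_KEYWORDS):
        return "registration_validation"
    if any(kw in text for kw in PROMO_KEYWORDS):
        return "promotional"
    return "other"
```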
  • Information about organization passwords of the users stored in password storage 228 , personal information related to the users stored in user personal information storage 230 , security awareness profiles of the users and risk scores of the users stored in risk score storage 232 , and information about personal websites stored in website information storage 234 may be periodically or dynamically updated as required.
  • user device 204 may be any device used by a user.
  • the user may be an employee of an organization or any entity.
  • user device 204 may include processor 236 and memory 238 .
  • processor 236 and memory 238 of user device 204 may be CPU 121 and main memory 122 , respectively, as shown in FIGS. 1 C and 1 D .
  • User device 204 may also include user interface 240 such as a keyboard, a mouse, a touch screen, a haptic sensor, a voice-based input unit, or any other appropriate user interface. It shall be appreciated that such components of user device 204 may correspond to similar components of computing device 100 in FIGS. 1 C and 1 D .
  • User device 204 may also include display 242 , such as a screen, a monitor connected to the device in any manner, or any other appropriate display.
  • user device 204 may display received content (for example, emails) for the user using display 242 and is able to accept user interaction via user interface 240 responsive to the displayed content.
  • user device 204 may include email client 244 .
  • email client 244 may be an application installed on user device 204 .
  • email client 244 may be an application that can be accessed over network 210 through a browser without requiring to be installed on user device 204 .
  • email client 244 may be any application capable of composing, sending, receiving, and reading email messages.
  • email client 244 may be an instance of an application, such as the Microsoft Outlook™ application, HCL Notes application, IBM® Lotus Notes® application, Apple® Mail application, Gmail® application, or any other known or custom email application.
  • a user of user device 204 may be mandated to download and install email client 244 by the organization.
  • email client 244 may be provided by the organization as default.
  • a user of user device 204 may select, purchase and/or download email client 244 , through for example, an application distribution platform.
  • application as used herein may refer to one or more applications, services, routines, or other executable logic or instructions.
  • a user of the organization may be aware that his or her actions outside the organization may affect the security of the organization. In some instances, the user may be willing to improve his or her security awareness behavior. In an implementation, the user, or another person, for example, the user's manager in the organization, may place a request to security awareness system 202 for registration of the user's personal information.
  • a user, a user's manager, a system administrator, or a security awareness system may initiate a request from security awareness system 202 to user device 204, causing user device 204 to display a request to the user, wherein the request asks the user either to register personal information with the security awareness system or to send personal information to the security awareness system.
  • the user may initiate the request by opting in for personal information registration.
  • security awareness system 202 may receive a request for registration of the personal information of the user, for example from a user's manager or a system administrator or other company official.
  • security awareness system 202 may initiate the request for registration of the personal information of the user on behalf of the user.
  • security awareness system 202 may prompt or provide an option to the user to register their personal information.
  • a user may provide personal information to security awareness system 202 voluntarily or in response to a prompt.
  • Security awareness system 202 may receive the personal information of the user, for example, one or more of a personal email address, a username, and a personal password or a password associated with a personal domain.
  • “registration” of personal information of the user at the security awareness system 202 is considered performed merely by the security awareness system 202 receiving personal information of the user.
  • security awareness system 202 may initiate verification of the personal information through verification unit 216 .
  • verification unit 216 may verify that the personal email address is within the personal domain of the user.
  • verification unit 216 may send a confirmation email to the personal email address provided by the user or for the user to verify whether the personal email address is controlled by the user.
  • a confirmation email may include a one-time code for the user to input into security awareness system 202 confirming access to the personal email address.
  • the one-time code may be valid for only one login session or transaction or may be valid only for a short period of time.
  • the one-time code may be valid for a period of time, such as 2 minutes.
  • the confirmation email may include a time-sensitive link that has to be followed or clicked within a specified time period (for example, 15 minutes) to complete the verification process.
  • the user may input the one-time code into security awareness system 202 or follow the provided link within the specified time period to validate the ownership of, or access to the personal email address.
  • the input of the correct one-time code or following of the supplied link within the specified time period enables verification unit 216 to verify that the personal email address provided for or by the user and/or the personal information provided for or by the user is owned or controlled by the user. Accordingly, verification unit 216 prevents a user from, for example, registering with another user's email address.
  • Other methods to confirm or verify that personal information registered with or sent to security awareness system 202 belongs to the user that are not discussed here are contemplated herein.
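The one-time-code flow described above can be sketched as follows. The function names, the six-digit code format, and the use of a 15-minute window are illustrative assumptions, not details from the disclosure:

```python
import secrets
import time

CODE_TTL_SECONDS = 15 * 60  # example 15-minute validity window

_pending = {}  # personal email address -> (code, issued_at)

def issue_code(personal_email):
    """Generate a one-time code for a confirmation email and record when it was issued."""
    code = f"{secrets.randbelow(10**6):06d}"  # six-digit numeric code
    _pending[personal_email] = (code, time.time())
    return code

def verify_code(personal_email, submitted, now=None):
    """Accept the code only if it matches, is unexpired, and has not been used."""
    entry = _pending.get(personal_email)
    if entry is None:
        return False
    code, issued_at = entry
    now = time.time() if now is None else now
    if now - issued_at > CODE_TTL_SECONDS:
        return False  # code or link was not used within the specified time period
    if submitted != code:
        return False
    del _pending[personal_email]  # valid for only one transaction; cannot be replayed
    return True
```

A correct code succeeds once and then fails on replay, matching the "valid for only one login session or transaction" behavior described above.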
  • verification unit 216 may be configured to register and store personal information of the user in response to verification that the personal email address provided for or by the user is within the personal domain of the user. In another embodiment, verification unit 216 may register and store personal information provided for or by the user without verifying that the personal information belongs to the user. Verification unit 216 may be configured to store personal information of the user in user personal information storage 230 . In an implementation, for security and privacy reasons and/or to comply with respective privacy laws of corresponding countries, verification unit 216 may store personal information of the user in an encrypted, hashed, or obfuscated form. In some examples, verification unit 216 may store personal information of the user in a plain text or non-encrypted form. In some implementations, verification unit 216 may establish a link between a security awareness profile of the user and personal information of the user.
  • verification unit 216 may be configured to notify the system administrator that the user has provided or sent personal information to security awareness system 202 or has registered personal information with security awareness system 202 .
  • verification unit 216 may provide levels of visibility of personal information of a user to the system administrator.
  • personal information of one or more users may be fully visible to the system administrator.
  • personal information of the one or more users may be partially obscured. For example, an email address “user08@gmail.com” may be displayed to the system administrator as u*****@*****.com or another combination of obfuscated and actual characters. In some examples, the information may be unavailable to the system administrator.
  • the system administrator may be given an indication confirming that the user has registered his or her personal information with security awareness system 202 . Further, in some examples, the personal information may be unavailable to the system administrator, and the system administrator is not notified when the personal information is provided by the user. In some embodiments, access to passwords entered into security awareness system 202 may be restricted to non-humans, such as Artificial Intelligence (AI), operating logic, and other processing systems.
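The partial obscuring shown above ("user08@gmail.com" displayed as u*****@*****.com) can be sketched as a small masking routine; the function name and masking rule are illustrative assumptions:

```python
def obfuscate_email(address):
    """Partially obscure an email address for display to a system administrator,
    e.g. 'user08@gmail.com' -> 'u*****@*****.com'."""
    local, _, domain = address.partition("@")
    host, _, tld = domain.rpartition(".")
    masked_local = local[:1] + "*" * max(len(local) - 1, 1)  # keep first character
    masked_host = "*" * max(len(host), 1)                    # hide the mail host
    return f"{masked_local}@{masked_host}.{tld}"
```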
  • security awareness system 202 may perform an exposure check and/or a security audit on the personal information of the user.
  • exposure check unit 218 is configured to perform the exposure check of the personal information of the user by searching for user's personal information or credentials in one or more breach databases 208 1-N .
  • exposure check unit 218 may check if one or more usernames and/or email addresses registered by the user is found in the one or more breach databases 208 1-N thereby indicating that the one or more usernames and/or email addresses have been exposed in a security breach.
  • exposure check unit 218 may use at least one of an email address or a username in the personal information to search for breached user information in one or more breach databases 208 1-N .
  • exposure check unit 218 may separate the email address account name (user) from its domain name (@company.com) and perform an exposure check on the account name. For example, if the email address provided by the user is “user08@gmail.com,” then exposure check unit 218 may separate “user08” from “user08@gmail.com” and perform an exposure check using the account name “user08”.
  • exposure check unit 218 may check whether those passwords are associated with any known data breach. In an example, exposure check unit 218 may determine that a password is associated with a known data breach if the password is detected in one or more breach databases 208 1-N . In some examples, exposure check unit 218 may query one or more breach databases 208 1-N to determine if the one or more passwords provided by the user have been compromised in a data breach. In some examples, exposure check unit 218 may provide the passwords in a query to one or more breach databases 208 1-N .
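The separation-and-lookup steps above can be sketched as follows; the in-memory set stands in for breach databases 208 1-N, and the function names are assumptions:

```python
def account_name(email):
    """Separate the account name from its domain, e.g. 'user08' from 'user08@gmail.com'."""
    return email.partition("@")[0]

def exposure_check(email, breached_accounts):
    """Report whether the account name appears in known breach records."""
    return account_name(email) in breached_accounts
```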
  • security audit unit 220 may be configured to perform a security audit of the personal information of the user.
  • security audit unit 220 may perform the security audit by assessing a strength of the one or more stored personal passwords from the personal information and evaluating whether the passwords used for personal information comply with policies that the organization has for password strength.
  • security audit unit 220 may assess the personal passwords to determine if there is password reusage or password sharing.
  • password reuse refers to the same user using the same password to log in to more than one account, while password sharing refers to a scenario where a password of a user of an organization is identified as identical or similar to a password of another user of the organization.
  • security audit unit 220 may assess the strength of the one or more personal passwords based on standards, such as National Institute of Standards and Technology (NIST) standards.
  • security audit unit 220 may compare the one or more registered personal passwords and identify reused passwords or derived passwords.
  • “R@31F”, “password1”, “Welcome!”, “password2” and “PaSsWoRd” may be poor personal passwords provided by the user.
  • a reused password may be a password that has been used previously or a password having similarity to a certain degree to a password that has been used previously.
  • security audit unit 220 may identify “password1”, “password2”, and “PaSsWoRd” as reused when compared to the password “password”.
  • Security audit unit 220 may use one or more tools to create permutations of one or more passwords to use as search terms.
  • An example of one such tool is the “Bad Password Generator” tool, available via the website of bad.pw and created by Harold Zang, referred to as SpZ (at spz.io).
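A crude sketch of the reuse/derivation comparison above: lowercasing and stripping trailing digits or symbols reduces "password1", "password2" and "PaSsWoRd" to the same canonical form as "password". The normalization rule is an illustrative assumption and is far simpler than real permutation tools such as the one named above:

```python
import re

def normalize(password):
    """Canonical form: lowercase, then drop trailing digits/punctuation."""
    return re.sub(r"[\d!@#$%^&*]+$", "", password.lower())

def find_reused(candidates, known_password):
    """Return candidate passwords judged to be reuses or derivations of known_password."""
    target = normalize(known_password)
    return [p for p in candidates if normalize(p) == target]
```

Run against the example list above, this flags exactly the three passwords the text identifies as reused.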
  • security audit unit 220 may perform a search for the registered personal passwords within password storage 228 to identify incidents of password sharing.
  • security audit unit 220 may compare the personal passwords with other passwords within password storage 228 .
  • security audit unit 220 may search within the organization's Active Directory, or other corporate databases, for a match to passwords that the user is using within the organization.
  • one or more breach databases 208 1-N and password storage 228 may be continually or periodically monitored for similarity or match to the user's personal information.
  • the results of the exposure check and/or the results of the security audit may be provided to the system administrator.
  • security audit unit 220 may provide the user with a report on threats, breaches, and poor password hygiene associated with a personal domain as an incentive for registering his or her personal information with the organization, thus enabling the user to take appropriate actions to protect their personal information.
  • security audit unit 220 may generate the report based on information determined by exposure check unit 218 and/or security audit unit 220 and in a further example, security audit unit 220 may generate the report based on searching breach data sites, such as “https://haveibeenpwned.com/” and “https://spycloud.com/” and dark web sources.
  • security audit unit 220 may apply some or all of the security audit process to create a report for the user.
  • the report may enable the user to gain a greater understanding of the risks that activities in the personal domain create for his or her personal information. Further, the user may be motivated to share further personal information with the organization so that the user can be informed about the risks associated with activities in the personal domain.
  • risk score calculator 222 may be configured to calculate one or more risk scores for the user.
  • individual risk scores include a personal risk score and an organization risk score.
  • Other examples of the individual risk scores that are not discussed here are contemplated herein.
  • an organization risk score for a user is a component of the risk score which is attributed to data held within the organization as a part of the ongoing employment of the user.
  • a personal risk score for the user is a component of the risk score which is attributed to the user's personal information and habits/behaviors within the personal domain.
  • the personal risk score may also reflect a willingness of the user to provide the personal information to the organization that the user is not obliged to provide.
  • risk score calculator 222 may calculate each risk score individually or in another example, risk score calculator 222 may calculate a single, overall risk score for the user taking into account contributions from each area of risk. In some embodiments, risk score calculator 222 may be configured to calculate or adjust a personal risk score of the user according to an analysis of the user's personal information. In an example, when the user registers his or her personal information with security awareness system 202 , risk score calculator 222 may set the personal risk score of the user at a level that indicates an action of opting-in. According to an embodiment, the amount of personal information that the user provides may affect the personal risk score of the user.
  • if the user provides more than one personal email address, the personal risk score for the user may be set lower than if the user had provided a single email address. Further, in an example, if the user provides password information associated with the email address then the personal risk score may be set lower than if the user had provided only the email address. Many such combinations are contemplated herein to set or modify the personal risk score.
  • security awareness system 202 may provide a notification to the system administrator that the personal risk score of the user is managed by security awareness system 202 but is not configured to be visible to the system administrator.
  • risk score calculator 222 may calculate a personal risk score for the user based on various factors such as whether the user has low or negligible personal information exposure, whether the user has moderate personal information exposure, and whether the user has high personal information exposure. In an example, these factors may be placed on an importance/weight scale, for example, from 1 to 10. In an implementation, risk score calculator 222 may assign an example weight range “0-3” to the factor “the user has negligible personal information exposure”, an example weight range “4-7” to the factor “the user has moderate personal information exposure”, and an example weight range “8-10” to the factor “the user has high personal information exposure”.
  • the calculated value of the personal risk score may be based on a threshold of a number of instances that the user's personal information was found in a breach, the strength of the passwords provided by the user, whether the breach was found in one or more breach databases 208 1-N , whether the database containing the breach was a database solely enabled by security awareness system 202 or a public database, or whether the information was found through the exposure check or the security audit.
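The importance/weight scale above can be sketched as a simple classification. The weight ranges come from the example in the text; the instance-count thresholds separating negligible, moderate and high exposure are assumptions:

```python
# Example weight ranges from the text, per exposure factor.
EXPOSURE_WEIGHT_RANGES = {
    "negligible": (0, 3),
    "moderate": (4, 7),
    "high": (8, 10),
}

def classify_exposure(breach_instances, moderate_at=1, high_at=5):
    """Map a count of breach instances to an exposure factor (thresholds assumed)."""
    if breach_instances < moderate_at:
        return "negligible"
    return "moderate" if breach_instances < high_at else "high"
```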
  • risk score calculator 222 may adjust the personal risk score of the user based at least on a result of the exposure check and/or the security audit.
  • the personal risk score for the user may be updated based on a weighted risk score for each instance of a breach associated with the user's personal information.
  • each instance of a breached username of the user may only be counted once towards the user's personal risk score.
  • each instance of a breached password of the user may be counted 1.5 times towards the user's personal risk score.
  • each instance of a username of the user with a password of the user from the same breach may be counted 2 times towards the user's personal risk score.
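The weighted counting above (username 1x, password 1.5x, username-with-password pair 2x) can be sketched as follows; the instance-kind labels are illustrative assumptions:

```python
# Weights from the example above: username 1x, password 1.5x,
# username plus password from the same breach 2x.
BREACH_WEIGHTS = {"username": 1.0, "password": 1.5, "pair": 2.0}

def weighted_breach_score(instances):
    """Sum the weighted contribution of each breach instance toward the
    user's personal risk score. `instances` holds kinds such as
    'username', 'password', or 'pair'."""
    return sum(BREACH_WEIGHTS[kind] for kind in instances)
```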
  • the personal risk score is incorporated into a risk score of the user that is calculated to determine the propensity that the user may respond to a malicious attack.
  • Risk score calculator 222 may also calculate an organization risk score for the organization of the user. A description of such a system may be found in U.S. Pat. No. 10,673,876. In an example, risk score calculator 222 may calculate an organization risk score for the organization of the user based on whether the user has interacted with malicious attacks in the past, whether the user received a high number of malicious attacks, and/or whether a job title of the user gives him or her expanded access to an organization network, any of which may pose a risk to the organization of the user. Examples of calculation of other types of risk scores that are not discussed here are contemplated herein and may be carried out by risk score calculator 222.
  • risk score calculator 222 may combine a personal risk score of a user with an organization risk score of the user's organization to generate a risk score.
  • risk score calculator 222 may combine a personal risk score of a user with an organization risk score of the user's organization using an algorithm or algorithms.
  • an example of such an algorithm may be a weighting algorithm which may be adjusted depending on the severity of the breach associated with the user, the kind of breach associated with the user, or any other metric measured by security awareness system 202 .
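One minimal form of such a weighting algorithm is a convex combination of the two component scores. The 0.4/0.6 split below is purely an illustrative assumption; the weight could itself be adjusted based on breach severity or kind as described above:

```python
def combined_risk_score(personal, organization, personal_weight=0.4):
    """Combine a personal risk score with an organization risk score
    using a simple weighted average (weight value is an assumption)."""
    return personal_weight * personal + (1 - personal_weight) * organization
```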
  • a personal risk score of the user and the organization risk score of the user's organization may be stored in risk score storage 232 .
  • security awareness system 202 may send a notification to a system administrator about a personal risk score of the user and the organization risk score of the user's organization being combined.
  • a personal risk score of a user, organization risk score of the user's organization, or any other risk score contemplated may be used separately by security awareness system 202 .
  • risk score calculator 222 is further configured to determine a risk score of the user based at least on the personal risk score of the user.
  • one or more risk scores in addition to a personal risk score of a user and organization risk score of a user's organization may be contemplated.
  • these one or more additional risk scores may be combined with each other, with the personal risk score of the user and/or with organization risk score of the user's organization, for example according to an algorithm performed by risk score calculator 222 .
  • remediation unit 224 may be configured to perform remedial training directed to a user based on at least a personal risk score of the user. For example, remediation unit 224 may be configured to perform remedial training if a user's personal risk score exceeds a set level or if the user's personal risk score increases. In an example, remediation unit 224 may perform remedial training by tailoring training content to educate the user. In examples, remediation unit 224 may provide training to the user via a landing page hosted by security awareness system 202 . In an example, the landing page may be a web page that enables provisioning of training materials to the user. For example, the landing page may provide to the user training related to choosing strong passwords and avoiding password reuse and password sharing.
  • remediation unit 224 may prompt a user to change one or more passwords.
  • remediation unit 224 may prompt a user to acknowledge a username, password and/or email address associated with the user that has been included in a breach.
  • remediation unit 224 may require the user to render secure one or more of his or her organizational and/or personal accounts by changing one or more passwords.
  • remediation unit 224 may request the user to confirm that the user has secured one or more accounts.
  • remediation unit 224 may provide a recommendation to the user to improve one or more of their personal passwords. Further, in an implementation, remediation unit 224 may recommend and facilitate remedial training to a user via an interaction with the user.
  • remediation unit 224 may communicate and/or interact with the user using a pop-up message.
  • a pop-up message may be understood to refer to the appearance of graphical or textual content displayed to a user.
  • a message prompting a user to change one or more passwords of the user may be presented on a display as part of, or within, a “window” or a user interface element or a dialogue box.
  • Other known examples and implementations of pop-up messages are contemplated herein as part of this disclosure.
  • remediation unit 224 may send one or more communications to a user to determine whether the user has completed training, changed one or more passwords (for example, one or more personal passwords of the user, which in examples cannot be viewed by security awareness system 202 ), or whether the user has completed any other form of remediation.
  • the user may update personal information of the user with security awareness system 202 .
  • feedback to the user on one or more of security issues, security recommendations, and security remediations may be provided in a manner that protects the security and privacy of the user and ensures that it is not possible to infer information about other users in the organization based on information provided to the user.
  • if a personal password of a user of an organization is identified as identical or similar to a personal password of another user of the organization, the same or similar personal passwords may be flagged to the users as compromised passwords. In examples, the users may be required to change the same or similar personal passwords.
  • indications, recommendations, or requirements made by remediation unit 224 to a user regarding reuse or security of the user's personal information are determined in a manner that ensures that the privacy of users of the security awareness system 202 is maintained.
  • risk score calculator 222 may adjust a risk score for a user based on one or more actions that the user takes after remediation. In examples, if a user is prompted to change one or more of his or her organization passwords or personal passwords, risk score calculator 222 may adjust the risk score of the user based on the user's actions in response to the prompt. In an example, risk score calculator 222 may adjust a risk score of the user according to the timeframe in which the user performed an action in response to a prompt. In an implementation, risk score calculator 222 may determine whether a user changed one or more organizational passwords based on an interaction of the user with the organization's password system, for example, an Active Directory.
  • a user may register for a personal domain website using his or her organization login credentials, such as the user's organization email address and/or the user's password.
  • detection unit 226 may detect whether the user has registered for the personal domain website using his or her organization login credentials as previously described.
  • detection unit 226 may continuously or periodically monitor a mailbox of the user for certain types of email messages, such as registration email validation messages and promotional messages that are typically sent from personal domain websites.
  • security awareness system 202 and detection unit 226 may use, build, and/or maintain a database of email addresses, registration email validation messages, and/or promotional messages that are known to be used by personal domain websites to communicate with their registered users.
  • a database may be website information storage 234 .
  • detection unit 226 may monitor a mailbox of a user for messages from email addresses within website information storage 234 . If an email in the mailbox of the user is from an email address found within website information storage 234 , detection unit 226 may determine that the user has registered for a personal domain website using the user's organization email address.
  • detection unit 226 may maintain a list of email addresses used by personal domain websites, such as “amazon.com” and “twitter.com”. For example, an email address associated with the personal domain website “amazon.com” may be “store-news@amazon.com” and an email address associated with the personal domain website “twitter.com” may be “info@twitter.com.” In examples, if an email from the email address “info@twitter.com” is found in a mailbox of a user, detection unit 226 may determine that the user has registered for the personal domain website “twitter.com” using the user's organization email address.
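The sender-address matching above can be sketched as follows. The two entries are the example addresses from the text and stand in for website information storage 234:

```python
# Example entries modeled on website information storage 234:
# known sender address -> personal domain website it belongs to.
KNOWN_SENDERS = {
    "store-news@amazon.com": "amazon.com",
    "info@twitter.com": "twitter.com",
}

def detected_registrations(mailbox_senders):
    """Return the personal domain websites the user appears to have
    registered for with the organization email address."""
    return {KNOWN_SENDERS[s] for s in mailbox_senders if s in KNOWN_SENDERS}
```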
  • detection unit 226 may use a database of examples of registration email validation messages and promotional messages used by personal domain websites to communicate with registered users and may further build a query or queries with the aim of detecting the same or similar messages within the mailbox of the user.
  • detection unit 226 may determine key content segments which are typical of examples of registration email validation messages and promotional messages used by personal domain websites to communicate with registered users and may combine key content segments together in one or more queries.
  • Detection unit 226 may assign a search score for one or more emails in a user's mailbox based on one or more queries and determine from the search score the likelihood of an email being either a registration email validation message or a promotional message. In an example, if three key content segments are found in an email, then a search score for the message may be 3.
  • in another example, if nine key content segments are found in an email, then a search score for the email may be 9.
  • detection unit 226 may generate a search score for each of the one or more emails found in the user's mailbox.
  • a threshold score of the search score for an email for the email to be classified as a registration email validation message may be different from a threshold score of the search score for an email for the email to be classified as a promotional message.
  • a threshold score for the search score of an email for the email to be classified as a registration email validation message may be 4 and the threshold score for the search score of an email for the email to be classified as a promotional message may be 8.
  • if an email found in a mailbox of the user has a search score that exceeds the threshold for registration email validation messages, then detection unit 226 may determine that the email is a registration email validation message. Further, if an email found in a mailbox of the user has a search score of 9, then detection unit 226 may determine that the email is a promotional message.
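Search scoring against the two example thresholds (4 for registration validation, 8 for promotional) can be sketched as follows. The case-insensitive substring matching of key content segments is an illustrative assumption:

```python
def search_score(body, key_segments):
    """Count how many key content segments appear in an email body."""
    lowered = body.lower()
    return sum(1 for segment in key_segments if segment.lower() in lowered)

def classify_email(score, registration_threshold=4, promotional_threshold=8):
    """Classify an email by its search score against the example thresholds."""
    if score >= promotional_threshold:
        return "promotional message"
    if score >= registration_threshold:
        return "registration email validation message"
    return "other"
```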
  • security awareness system 202 and detection unit 226 may use, build and/or maintain a list of login pages for known or popular personal domain websites, such as food delivery service websites (for example, such as Door Dash® and Uber Eats®), travel websites (for example, Hotels.com® and AirbnbTM), shopping websites (for example, amazon.com and walmart.com), and other such websites.
  • detection unit 226 may store the list of login pages for known or popular personal domain websites in website information storage 234 .
  • detection unit 226 may attempt to log in to each of the personal domain websites' login pages using the organization login credentials of the user. If a login attempt with the user's organization login credentials is successful for a personal domain website, detection unit 226 may determine that the user has registered for the personal domain website using his or her organization login credentials.
  • detection unit 226 may access a reset password link of one or more accounts of personal domain websites which in examples presents an interface to enter the email address associated with the forgotten password.
  • Detection unit 226 may provide the user's organization email address as the email address for the account and if an email is sent to the user's organization email address with a password reset link, then detection unit 226 may determine that the user has registered for the personal domain website using the user's organization login credentials.
  • detection unit 226 may use this approach for randomized or periodic checks of one or more users, rather than checking all the users in the organization at the same time.
  • detection unit 226 may use a plurality of IP addresses to access a reset password link of one or more accounts of personal domain websites to avoid appearing as a DoS attack.
  • security awareness system 202 may delete the detected emails.
  • security awareness system 202 may prompt or require a user to change his or her organization password or personal passwords.
  • security awareness system 202 may deliver remedial training to the user.
  • detection unit 226 may interact with email server 206 and trigger email server 206 to disable the user's organization password and require the user to create new organization login credentials, for example a new organization password, before the user is able to access various servers and services of the organization.
  • security awareness system 202 may implement monitoring processes in a manner that respects the privacy laws of a country in which the organization of the user and/or the user is located.
  • security awareness system 202 may obfuscate a user's personal information from a system administrator, and the user may have the option to manage the amount of personal information that is to be made visible to the system administrator.
  • security awareness system 202 may remove the user's personal information for example as a best practice and/or to comply with various privacy regulations such as General Data Protection Regulation (GDPR).
  • a user may choose to de-register his or her personal information from security awareness system 202 at any time.
  • all personal information and other relevant records pertaining to the user may be removed or deleted from security awareness system 202 .
  • a personal risk score of the user may be removed from risk score storage 232 ; however, the effect that the personal risk score of the user contributed to the risk score of the user may persist despite the user de-registering their personal information.
  • some or all of the component of the risk score of the user contributed by the personal risk score of the user is reversed if the user de-registers his or her personal information from security awareness system 202 .
  • FIG. 3 depicts flowchart 300 for detecting that the user has registered for a personal domain website using an organization email address, according to some embodiments.
  • Step 302 includes generating a query using key content segments determined based on at least one of known registration email validation messages and known promotional messages from personal domain websites.
  • registration email validation messages and promotional messages from various personal domain websites may include certain key content segments in common which may be combined and reused to generate a query.
  • key content segments used to build queries in the past may be altered to increase the probability of detecting one or more registration email validation messages and/or promotional message.
  • detection unit 226 may generate a query of key content segments that appear in the known registration email validation messages and promotional messages from personal domain websites.
  • detection unit 226 may alter key content segments used to build queries in the past to increase the probability of detecting one or more registration email validation messages and/or promotional messages based on characteristics of known or sample registration email validation messages and/or promotional messages.
  • detection unit 226 may apply an AI model to generate the query.
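The query generation of Step 302 can be sketched as combining key content segments into a simple conjunctive query string; the query syntax is an assumption:

```python
def build_query(key_segments):
    """Combine key content segments into a simple AND query string."""
    return " AND ".join(f'"{segment}"' for segment in key_segments)
```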
  • Step 304 includes monitoring, by detection unit 226 using a generated query, a mailbox of a user to detect one or more emails which include key content segments.
  • detection unit 226 may be configured to monitor one or more or all folders of the mailbox of the user, including for example an inbox folder, a junk email folder, a deleted items folder, and one or more spam email folders.
  • detection unit 226 may be configured to train an AI model to detect one or more emails which include key content segments, for example using sample or known registration email validation messages and/or promotional messages; the AI model may then be used to recognize registration email validation messages and/or promotional messages in a user's mailbox.
  • Step 306 includes generating, for example by detection unit 226 , a search score for one or more emails in the user's mailbox.
  • the search score is based on key content segments found in the one or more emails.
  • detection unit 226 may generate a search score for all emails found in the user's mailbox.
  • detection unit 226 may generate a search score for one or more emails found in the user's mailbox that include a minimum number of key content segments.
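Steps 304 and 306 might be sketched together as follows: each email in each monitored folder receives a search score equal to the number of key content segments it contains, and emails below a minimum segment count are filtered out. The folder names, segments, and filter value are illustrative assumptions:

```python
# Hypothetical key content segments and mailbox contents for illustration.
KEY_SEGMENTS = ["verify your email", "confirm your registration", "unsubscribe"]

MAILBOX = {
    "inbox":   ["Please verify your email and confirm your registration today."],
    "junk":    ["Huge sale this weekend! Click unsubscribe to opt out."],
    "deleted": ["Lunch at noon?"],
}

def search_scores(mailbox, segments, min_segments=1):
    """Return {(folder, index): score} for emails with >= min_segments hits."""
    scores = {}
    for folder, emails in mailbox.items():
        for i, body in enumerate(emails):
            text = body.lower()
            score = sum(1 for seg in segments if seg in text)
            if score >= min_segments:  # emails below the minimum are skipped
                scores[(folder, i)] = score
    return scores

scores = search_scores(MAILBOX, KEY_SEGMENTS)
```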
  • Step 308 includes determining an email from the mailbox of the user to be one of a registration email validation message or a promotional message based on a search score of the email.
  • detection unit 226 determines one or more emails from the mailbox of the user to be one of a registration email validation message or a promotional message based on the search scores of the one or more emails exceeding a threshold score.
  • detection unit 226 may determine an email from the one or more folders of the mailbox of the user to be one of a registration email validation message or a promotional message based on the search score of the emails exceeding a threshold score.
  • detection unit 226 may determine that an email from the one or more folders of the mailbox of the user is a registration email validation message based on the search score of the email being less than, equal to, or greater than a first threshold score, and detection unit 226 may determine that an email from the one or more folders of the mailbox of the user is a promotional message based on the search score of the email being less than, equal to, or greater than a second threshold.
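The two-threshold variant of step 308 could be sketched as below; the threshold values and category names are arbitrary assumptions for illustration, not values from the disclosure:

```python
# Hypothetical thresholds; in practice these would be tuned against samples.
VALIDATION_THRESHOLD = 3  # scores at or above this suggest a validation email
PROMO_THRESHOLD = 1       # scores at or above this suggest a promotional email

def classify(search_score):
    """Map a search score to a message category using two thresholds."""
    if search_score >= VALIDATION_THRESHOLD:
        return "registration_validation"
    if search_score >= PROMO_THRESHOLD:
        return "promotional"
    return "other"
```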
  • Step 310 includes performing one or more actions based on determining that the email is one of a registration email validation message or a promotional message.
  • the one or more actions may be performed by remediation unit 224 .
  • the one or more actions performed by remediation unit 224 based on determining that the email is one of a registration email validation message or a promotional message include one or more of deleting the email from the user's mailbox, prompting the user to change an organization password, and providing training to the user on personal domain use of organization login credentials.
  • when detection unit 226 determines that the email is one of a registration email validation message or a promotional message, remediation unit 224 may perform one or more actions separately or in combination.
  • the more than one actions performed by remediation unit 224 are performed at different times, for example separated by a minimum time period.
  • security awareness system 202 may remove the registration email validation message or the promotional message from the user's mailbox such that the user may not validate the personal domain registration through interacting with the registration email validation message, or such that the user may not interact with the promotional message.
  • security awareness system 202 may prompt or require the user to change their organization password.
  • detection unit 226 may interact with email server 206 and trigger email server 206 to disable the user's organization password and require the user to create a new organization password.
  • remediation unit 224 may provide training to the user.
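The action handling of step 310, including the variant in which multiple actions are separated by a minimum time period, could be sketched as follows; the action names and the delay value are illustrative assumptions:

```python
from datetime import datetime, timedelta

# Hypothetical remediation actions in the order they would be performed.
ACTIONS = ["delete_email", "prompt_password_change", "assign_training"]
MIN_GAP = timedelta(hours=1)  # assumed minimum time period between actions

def schedule_actions(actions, start, gap=MIN_GAP):
    """Space multiple remediation actions apart by at least `gap`."""
    return {action: start + i * gap for i, action in enumerate(actions)}

plan = schedule_actions(ACTIONS, datetime(2024, 1, 1, 9, 0))
```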
  • FIG. 4 depicts flowchart 400 for using personal information for determining a personal risk score of the user of the organization, according to some embodiments.
  • Step 402 includes receiving registration of personal information of a user of an organization.
  • security awareness system 202 may send a request to the user asking the user to provide his or her personal information.
  • the user may be asked to voluntarily provide his or her personal information and the user may choose to provide his or her personal information or may choose not to provide his or her personal information.
  • the user may be required to provide some personal information in order for the user to gain access to one or more services or systems of the organization.
  • the user may opt to provide the personal information to security awareness system 202 .
  • a user's personal information may include one or more personal email addresses of the user.
  • a user's personal information may include one or more personal usernames of the user.
  • a user's personal information may include one or more personal passwords of the user.
  • Step 404 includes performing at least one of an exposure check or a security audit of the personal information of the user.
  • exposure check unit 218 may be configured to perform an exposure check on some or all of the personal information of the user that was provided by the user.
  • exposure check unit 218 may perform an exposure check on some or all of the personal information of the user by searching for one or more of a user's personal email address, a user's personal username, or a user's personal password in one or more breach databases 208 1-N .
  • exposure check unit 218 may be configured to store the results of one or more exposure checks on the personal information provided by the user in memory, for example memory 214 or risk score storage 232 or personal information storage 230 .
  • security audit unit 220 may be configured to perform a security audit on some or all of the personal information of the user that was provided by the user. In examples, security audit unit 220 performs a security audit of one or more personal passwords of the user by assessing a strength of the one or more personal passwords of the user. In examples, security audit unit 220 performs a security audit of one or more personal passwords of the user by comparing the one or more personal passwords of the user to password requirements of the organization. In examples, security audit unit 220 may be configured to store the results of one or more security audits on personal information provided by the user in memory, for example memory 214 or risk score storage 232 or personal information storage 230 .
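Step 404 might be sketched as follows, with the exposure check performed against a local stand-in for breach databases 208 1-N and the security audit checking personal passwords against hypothetical organization password requirements; the breach set and policy rules are illustrative assumptions:

```python
import re

# Stand-in for breached credentials drawn from breach databases 208 1-N.
BREACHED_EMAILS = {"alice@mail.example", "bob@mail.example"}

# Hypothetical organization password requirements.
ORG_POLICY = {"min_length": 12, "require_digit": True, "require_upper": True}

def exposure_check(email):
    """True if the personal email appears in the (sample) breach data."""
    return email.lower() in BREACHED_EMAILS

def audit_password(password, policy=ORG_POLICY):
    """Return a list of policy violations; an empty list means compliant."""
    problems = []
    if len(password) < policy["min_length"]:
        problems.append("too_short")
    if policy["require_digit"] and not re.search(r"\d", password):
        problems.append("no_digit")
    if policy["require_upper"] and not re.search(r"[A-Z]", password):
        problems.append("no_uppercase")
    return problems
```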
  • Step 406 includes adjusting a personal risk score of the user based at least on a result of at least one of an exposure check or a security audit on personal information provided by the user.
  • risk score calculator 222 may be configured to generate a personal risk score for a user based at least on the result of one of an exposure check or a security audit on personal information provided by the user.
  • risk score calculator 222 may be configured to adjust a previously determined personal risk score of a user based at least on the result of one of the exposure check or the security audit on personal information provided by the user.
  • risk score calculator 222 may adjust the personal risk score of the user based at least on the user voluntarily registering personal information with security awareness system 202 .
  • risk score calculator 222 may determine a risk score for a user based at least on the personal risk score of the user.
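The adjustment in step 406, and the folding of the personal risk score into an overall risk score, could be sketched as below; the point values, the per-finding penalty, the credit for voluntary registration, and the blending weight are all hypothetical:

```python
def adjust_personal_risk(base_score, exposed, audit_problems, registered):
    """Adjust a 0-100 personal risk score from check results (assumed weights)."""
    score = base_score
    if exposed:
        score += 20                   # breached personal email raises risk
    score += 5 * len(audit_problems)  # each password-policy violation adds risk
    if registered:
        score -= 10                   # voluntary registration lowers risk
    return max(0, min(100, score))

def overall_risk(personal_risk, behavioral_risk, personal_weight=0.3):
    """Blend the personal risk score into an overall user risk score."""
    return personal_weight * personal_risk + (1 - personal_weight) * behavioral_risk

personal = adjust_personal_risk(50, exposed=True, audit_problems=["no_digit"],
                                registered=True)
overall = overall_risk(personal, behavioral_risk=40)
```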
  • FIG. 5 depicts flowchart 500 for performing a remedial training or a simulated phishing campaign directed to the user based on a personal risk score of the user, according to some embodiments.
  • Step 502 includes receiving personal information of a user of an organization, for example as described previously in step 402 of FIG. 4 .
  • security awareness system 202 may send a request to the user to provide his or her personal information. Upon receiving the request or otherwise, the user may opt to provide personal information to security awareness system 202 .
  • the personal information provided by the user of the organization includes one or more personal email addresses.
  • Step 504 includes verifying a user's personal email address is used in a personal domain of the user.
  • verification unit 216 may verify the email address identified by the personal information and used in the personal domain of the user by attempting to log in to the personal domain with one or more of the personal email addresses provided by the user.
  • Step 506 includes storing the personal information of the user in response to verification unit 216 verifying that one or more personal email addresses of the user have been used in the personal domain of the user.
  • verification unit 216 may be configured to store the personal information of the user in response to the email address used in the personal domain of the user being verified.
  • verification unit 216 may store the email address used in the personal domain of the user in user personal information storage 230 .
  • the email address may be stored in an encrypted or obfuscated form.
  • verification unit 216 may associate the email address used in the personal domain of the user with a security awareness profile stored in risk score storage 232 .
  • access to the one or more personal email addresses of the user stored in user personal information storage 230 may be limited to some services or systems of the user's organization.
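One way to store a verified personal email address in encrypted or obfuscated form, as described above, is a keyed hash, so that the plaintext address never rests in user personal information storage 230; the salt handling shown is a deliberate simplification and not the disclosed mechanism:

```python
import hashlib
import hmac

def obfuscate_email(email, salt):
    """Deterministically hash an email address with a system-wide secret salt."""
    return hmac.new(salt, email.lower().encode(), hashlib.sha256).hexdigest()

# In practice the salt would be a random secret per deployment, held securely.
salt = b"example-system-salt"
stored = obfuscate_email("User@Mail.example", salt)
```

A deterministic keyed hash still allows later lookups (re-hash and compare) while limiting what a reader of storage 230 can recover.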
  • Step 508 includes performing at least one of an exposure check or a security audit of the personal information of the user, for example as was described in step 404 of FIG. 4 .
  • exposure check unit 218 may be configured to perform the exposure check by searching using at least one personal email address or personal username of the user provided by the user in the user's personal information in one or more breach databases 208 1-N .
  • security audit unit 220 may be configured to perform a security audit by assessing a strength of one or more personal passwords from the user's personal information and compliance of the one or more personal passwords of the user to the password requirements of the organization.
  • Step 510 includes adjusting a personal risk score of the user based at least on a result of one of the exposure check or the security audit, for example as described in step 406 of FIG. 4 .
  • risk score calculator 222 may be configured to adjust the personal risk score of the user based at least on the result of one of the exposure check or the security audit.
  • risk score calculator 222 may adjust the personal risk score of the user based at least on the user's registration of the personal information with security awareness system 202 .
  • risk score calculator 222 may determine an overall risk score for the user based at least on the personal risk score of the user.
  • Step 512 includes performing one of a remedial training or a simulated phishing campaign directed to the user based on the personal risk score of the user.
  • remediation unit 224 may be configured to perform one or more remedial trainings and security awareness system 202 may be configured to perform one or more simulated phishing campaigns directed to the user based on a set level, an increase, or a decrease in the user's personal risk score.
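Step 512's decision could be sketched as below, selecting an action when the personal risk score crosses a set level or moves up or down; the high-risk level and action names are illustrative assumptions:

```python
def choose_action(previous_score, current_score, high_level=70):
    """Pick a follow-up action from a personal risk score change (assumed rules)."""
    if current_score >= high_level:          # set level reached: train the user
        return "remedial_training"
    if current_score > previous_score:       # score increased: test the user
        return "simulated_phishing_campaign"
    return "none"                            # steady or decreasing score
```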

Abstract

Systems and methods are described for improving assessment of security risk based on a user's personal information. Registration of personal information of a user of an organization is received at a security awareness system. After receiving the registration of the personal information, at least one of an exposure check or a security audit of the personal information of the user is performed by the security awareness system. A personal risk score of the user is then generated or adjusted based at least on a result of one of the exposure check or the security audit.

Description

    RELATED APPLICATION
  • This patent application is a Continuation of and claims the benefit of and priority to U.S. Non-Provisional patent application Ser. No. 17/546,676 titled "SYSTEMS AND METHODS FOR IMPROVING ASSESSMENT OF SECURITY RISK BASED ON PERSONAL INTERNET ACCOUNT DATA," and filed Dec. 9, 2021, which claims the benefit of and priority to U.S. Provisional Patent Application No. 63/123,812 titled "SYSTEMS AND METHODS FOR IMPROVING ASSESSMENT OF SECURITY RISK BASED ON PERSONAL INTERNET ACCOUNT DATA," and filed Dec. 10, 2020, which also claims the benefit of and priority to U.S. Provisional Patent Application No. 63/142,071 titled "SYSTEMS AND METHODS FOR IMPROVING ASSESSMENT OF SECURITY RISK BASED ON PERSONAL INTERNET ACCOUNT DATA," and filed Jan. 27, 2021, the contents of all of which are hereby incorporated herein by reference in their entireties for all purposes.
  • TECHNICAL FIELD
  • The present disclosure relates to systems and methods for improving assessment of security risk that users pose to an organization based on their personal internet account data.
  • BACKGROUND
  • Cybersecurity incidents such as phishing attacks may cost organizations in terms of the loss of confidential and/or important information and expense in mitigating losses due to breaches of confidential information. Such incidents can also cause customers to lose trust in the organizations. The incidents of cybersecurity attacks and the costs of mitigating the damage caused are increasing every year. Organizations invest in cybersecurity tools such as antivirus, anti-ransomware, anti-phishing, and other platforms. Such cybersecurity tools may detect and intercept known cybersecurity attacks. However, social engineering attacks or new threats may not be readily detectable by such tools, and the organizations may have to rely on their employees to recognize such threats. A social engineering attack is an attack that exploits human behavior to gain access to an organization's systems through attack vectors such as phishing emails. A phishing email may include content to be presented to a user, where the content is chosen to convince the user that the phishing email is genuine and that the user should interact with it. The more contextually relevant and personal the content is to the user, the higher the likelihood that the user will interact with it.
  • Organizations have recognized phishing attacks and social engineering attacks as among the most prominent cybersecurity threats that can cause serious data breaches, including loss of confidential information such as intellectual property, financial information, organizational information, and other important information. Attackers who launch phishing attacks and social engineering attacks may attempt to evade an organization's security apparatuses and tools and target its employees. To prevent or to reduce the success rate of phishing attacks on employees, the organizations may conduct security awareness training programs for their employees, along with other security measures. Through security awareness training programs, the organizations actively educate their employees on how to spot and report a suspected phishing attack. These organizations may operate the security awareness training programs through their in-house cybersecurity teams or may utilize third-party entities who are experts in cybersecurity matters to conduct such training. The employees may follow best practices of cybersecurity hygiene and comply with security regulations while working in offices. At times, for example, while working remotely, the employees may not follow the best practices of cybersecurity hygiene and may not comply with security regulations, partly because the employees do not feel watched.
  • An organization's security may also be affected by its employees' behavior outside the organization. For example, the employees' behavior when in their personal domain may directly or indirectly jeopardize the organization's security. Examples of such behaviors include using the organization's email address for personal purposes and password reuse between work and personal accounts.
  • SUMMARY
  • The present disclosure generally relates to systems and methods for improving assessment of security risk that users pose to an organization based on their personal internet account data or other personal information.
  • Systems and methods are provided for using personal information of a user for determining a personal risk score of the user of an organization. In an example embodiment, a method of using personal information of a user for determining a personal risk score of the user of an organization is described, which includes receiving, by a security awareness system configured on one or more servers, personal information of a user or registration of personal information of a user of an organization, performing, by the security awareness system, at least one of an exposure check or a security audit of the personal information of the user, and adjusting, by the security awareness system, a personal risk score of the user based at least on a result of one of the exposure check or the security audit.
  • In some implementations, the method includes, verifying, by the security awareness system, an email address identified by the personal information as used in a personal domain of the user.
  • In some implementations, the method includes, storing, by the security awareness system, the email address used in the personal domain of the user in association with a profile of the user for the security awareness system.
  • In some implementations, the method includes registering the personal information with the security awareness system in response to the email address used in the personal domain of the user being verified.
  • In some implementations, the method includes storing, by the security awareness system, the personal information in an obfuscated form.
  • In some implementations, the method includes performing an exposure check by searching using at least one of an email address or a username in the personal information for breached user information in one or more breach databases.
  • In some implementations, the method includes performing a security audit by assessing a strength of one or more registered personal passwords from the personal information and compliance to password requirements of the organization.
  • In some implementations, the method includes adjusting the personal risk score of the user based on at least the user's registration of the personal information with the security awareness system or based on the security awareness system receiving the user's personal information.
  • In some implementations, the method includes determining by the security awareness system a risk score based at least on the personal risk score of the user.
  • In some implementations, the method includes performing by the security awareness system, based on at least the personal risk score of the user, one of a remedial training or a simulated phishing campaign directed to the user.
  • In another example implementation, the security awareness system is configured to determine when a user has registered for a web site in a personal domain using his or her organization login credentials based on monitoring a mailbox of the user. In an implementation, the security awareness system is configured to determine whether the user has registered for the website using current organization login credentials or previous organization login credentials. The security awareness system provides training to the user about a safe use of the organization login credentials.
  • Other aspects and advantages of the disclosure will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, which illustrate by way of example, the principles of the disclosure.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The foregoing and other objects, aspects, features, and advantages of the disclosure will become more apparent and better understood by referring to the following description taken in conjunction with the accompanying drawings, in which:
  • FIG. 1A is a block diagram depicting an embodiment of a network environment comprising client devices in communication with server devices, according to some embodiments;
  • FIG. 1B is a block diagram depicting a cloud computing environment comprising client devices in communication with cloud service providers, according to some embodiments;
  • FIGS. 1C and 1D are block diagrams depicting embodiments of computing devices useful in connection with the methods and systems described herein, according to some embodiments;
  • FIG. 2 depicts an implementation of some of the architecture of a system for determining a personal risk score of a user of an organization based on personal information of the user, according to some embodiments;
  • FIG. 3 depicts a flowchart for detecting that the user has registered for a personal domain website using an organization email address, according to some embodiments;
  • FIG. 4 depicts a flowchart for using personal information for determining the personal risk score of the user of the organization, according to some embodiments; and
  • FIG. 5 depicts a flowchart for performing a remedial training or a simulated phishing campaign directed to the user based on the personal risk score of the user, according to some embodiments.
  • DETAILED DESCRIPTION
  • For the purposes of reading the description of the various embodiments below, the following descriptions of the sections of the specifications and their respective contents may be helpful:
  • Section A describes a network environment and computing environment which may be useful for practicing embodiments described herein.
  • Section B describes embodiments of systems and methods for improving assessment of security risk that users pose to an organization based on their personal internet account data.
  • A. Computing and Network Environment
  • Prior to discussing specific embodiments of the present solution, it may be helpful to describe aspects of the operating environment as well as associated system components (e.g. hardware elements) in connection with the methods and systems described herein. Referring to FIG. 1A, an embodiment of a network environment is depicted. In a brief overview, the network environment includes one or more clients 102 a-102 n (also generally referred to as local machines(s) 102, client(s) 102, client node(s) 102, client machine(s) 102, client computer(s) 102, client device(s) 102, endpoint(s) 102, or endpoint node(s) 102) in communication with one or more servers 106 a-106 n (also generally referred to as server(s) 106, node(s) 106, machine(s) 106, or remote machine(s) 106) via one or more networks 104. In some embodiments, client 102 has the capacity to function as both a client node seeking access to resources provided by a server and as a server providing access to hosted resources for other clients 102 a-102 n.
  • Although FIG. 1A shows a network 104 between clients 102 and the servers 106, clients 102 and servers 106 may be on the same network 104. In some embodiments, there are multiple networks 104 between clients 102 and servers 106. In one of these embodiments, network 104′ (not shown) may be a private network and a network 104 may be a public network. In another of these embodiments, network 104 may be a private network and a network 104′ may be a public network. In still another of these embodiments, networks 104 and 104′ may both be private networks.
  • Network 104 may be connected via wired or wireless links. Wired links may include Digital Subscriber Line (DSL), coaxial cable lines, or optical fiber lines. Wireless links may include Bluetooth®, Bluetooth Low Energy (BLE), ANT/ANT+, ZigBee, Z-Wave, Thread, Wi-Fi®, Worldwide Interoperability for Microwave Access (WiMAX®), mobile WiMAX®, WiMAX®-Advanced, NFC, SigFox, LoRa, Random Phase Multiple Access (RPMA), Weightless-N/P/W, an infrared channel, or a satellite band. The wireless links may also include any cellular network standards to communicate among mobile devices, including standards that qualify as 1G, 2G, 3G, 4G, or 5G. The network standards may qualify as one or more generations of mobile telecommunication standards by fulfilling a specification or standards such as the specifications maintained by the International Telecommunication Union. The 3G standards, for example, may correspond to the International Mobile Telecommunications-2000 (IMT-2000) specification, and the 4G standards may correspond to the International Mobile Telecommunication Advanced (IMT-Advanced) specification. Examples of cellular network standards include AMPS, GSM, GPRS, UMTS, CDMA2000, CDMA-1xRTT, CDMA-EVDO, LTE, LTE-Advanced, LTE-M1, and Narrowband IoT (NB-IoT). Wireless standards may use various channel access methods, e.g. FDMA, TDMA, CDMA, or SDMA. In some embodiments, different types of data may be transmitted via different links and standards. In other embodiments, the same types of data may be transmitted via different links and standards.
  • Network 104 may be any type and/or form of network. The geographical scope of the network may vary widely and network 104 can be a body area network (BAN), a personal area network (PAN), a local-area network (LAN), e.g. Intranet, a metropolitan area network (MAN), a wide area network (WAN), or the Internet. The topology of network 104 may be of any form and may include, e.g., any of the following: point-to-point, bus, star, ring, mesh, or tree. Network 104 may be an overlay network which is virtual and sits on top of one or more layers of other networks 104′. Network 104 may be of any such network topology as known to those ordinarily skilled in the art capable of supporting the operations described herein. Network 104 may utilize different techniques and layers or stacks of protocols, including, e.g., the Ethernet protocol, the internet protocol suite (TCP/IP), the ATM (Asynchronous Transfer Mode) technique, the SONET (Synchronous Optical Networking) protocol, or the SDH (Synchronous Digital Hierarchy) protocol. The TCP/IP internet protocol suite may include application layer, transport layer, internet layer (including, e.g., IPv4 and IPv6), or the link layer. Network 104 may be a type of broadcast network, a telecommunications network, a data communication network, or a computer network.
  • In some embodiments, the system may include multiple, logically grouped servers 106. In one of these embodiments, the logical group of servers may be referred to as a server farm, a server cluster, or a machine farm. In another of these embodiments, servers 106 may be geographically dispersed. In other embodiments, a machine farm may be administered as a single entity. In still other embodiments, the machine farm includes a plurality of machine farms. Servers 106 within each machine farm can be heterogeneous—one or more of servers 106 or machines 106 can operate according to one type of operating system platform (e.g., Windows, manufactured by Microsoft Corp. of Redmond, Wash.), while one or more of the other servers 106 can operate according to another type of operating system platform (e.g., Unix, Linux, or Mac OSX).
  • In one embodiment, servers 106 in the machine farm may be stored in high-density rack systems, along with associated storage systems, and located in an enterprise data center. In the embodiment, consolidating servers 106 in this way may improve system manageability, data security, the physical security of the system, and system performance by locating servers 106 and high-performance storage systems on localized high-performance networks. Centralizing servers 106 and storage systems and coupling them with advanced system management tools allows more efficient use of server resources.
  • Servers 106 of each machine farm do not need to be physically proximate to another server 106 in the same machine farm. Thus, the group of servers 106 logically grouped as a machine farm may be interconnected using a wide-area network (WAN) connection or a metropolitan-area network (MAN) connection. For example, a machine farm may include servers 106 physically located in different continents or different regions of a continent, country, state, city, campus, or room. Data transmission speeds between servers 106 in the machine farm can be increased if servers 106 are connected using a local-area network (LAN) connection or some form of direct connection. Additionally, a heterogeneous machine farm may include one or more servers 106 operating according to a type of operating system, while one or more other servers execute one or more types of hypervisors rather than operating systems. In these embodiments, hypervisors may be used to emulate virtual hardware, partition physical hardware, virtualize physical hardware, and execute virtual machines that provide access to computing environments, allowing multiple operating systems to run concurrently on a host computer. Native hypervisors may run directly on the host computer. Hypervisors may include VMware ESX/ESXi, manufactured by VMWare, Inc., of Palo Alto, Calif.; the Xen hypervisor, an open source product whose development is overseen by Citrix Systems, Inc. of Fort Lauderdale, Fla.; the HYPER-V hypervisors provided by Microsoft, or others. Hosted hypervisors may run within an operating system on a second software level. Examples of hosted hypervisors may include VMWare Workstation and VirtualBox, manufactured by Oracle Corporation of Redwood City, Calif. Additional layers of abstraction may include Container Virtualization and Management infrastructure. Container Virtualization isolates execution of a service to the container while relaying instructions to the machine through one operating system layer per host machine. Container infrastructure may include Docker, an open source product whose development is overseen by Docker, Inc. of San Francisco, Calif.
  • Management of the machine farm may be de-centralized. For example, one or more servers 106 may comprise components, subsystems, and modules to support one or more management services for the machine farm. In one of these embodiments, one or more servers 106 provide functionality for management of dynamic data, including techniques for handling failover, data replication, and increasing the robustness of the machine farm. Each server 106 may communicate with a persistent store and, in some embodiments, with a dynamic store.
  • Server 106 may be a file server, application server, web server, proxy server, appliance, network appliance, gateway, gateway server, virtualization server, deployment server, SSL VPN server, or security system. In one embodiment, a plurality of servers 106 may be in the path between any two communicating servers 106.
  • Referring to FIG. 1B, a cloud computing environment is depicted. A cloud computing environment may provide client 102 with one or more resources provided by a network environment. The cloud computing environment may include one or more clients 102 a-102 n, in communication with cloud 108 over one or more networks 104. Clients 102 may include, e.g., thick clients, thin clients, and zero clients. A thick client may provide at least some functionality even when disconnected from cloud 108 or servers 106. A thin client or zero client may depend on the connection to cloud 108 or server 106 to provide functionality. A zero client may depend on cloud 108 or other networks 104 or servers 106 to retrieve operating system data for the client device 102. Cloud 108 may include back end platforms, e.g., servers 106, storage, server farms or data centers.
  • Cloud 108 may be public, private, or hybrid. Public clouds may include public servers 106 that are maintained by third parties to clients 102 or the owners of the clients. Servers 106 may be located off-site in remote geographical locations as disclosed above or otherwise. Public clouds may be connected to servers 106 over a public network. Private clouds may include private servers 106 that are physically maintained by clients 102 or owners of clients. Private clouds may be connected to servers 106 over a private network 104. Hybrid clouds 109 may include both the private and public networks 104 and servers 106.
  • Cloud 108 may also include a cloud-based delivery, e.g. Software as a Service (SaaS) 110, Platform as a Service (PaaS) 112, and Infrastructure as a Service (IaaS) 114. IaaS may refer to a user renting the use of infrastructure resources that are needed during a specified time period. IaaS providers may offer storage, networking, servers, or virtualization resources from large pools, allowing the users to quickly scale up by accessing more resources as needed. Examples of IaaS include Amazon Web Services (AWS) provided by Amazon, Inc. of Seattle, Wash., Rackspace Cloud provided by Rackspace Inc. of San Antonio, Tex., Google Compute Engine provided by Google Inc. of Mountain View, Calif., or RightScale provided by RightScale, Inc. of Santa Barbara, Calif. PaaS providers may offer functionality provided by IaaS, including, e.g., storage, networking, servers, virtualization, or containerization, as well as additional resources, e.g., the operating system, middleware, or runtime resources. Examples of PaaS include Windows Azure provided by Microsoft Corporation of Redmond, Wash., Google App Engine provided by Google Inc., and Heroku provided by Heroku, Inc. of San Francisco, Calif. SaaS providers may offer the resources that PaaS provides, including storage, networking, servers, virtualization, operating system, middleware, or runtime resources. In some embodiments, SaaS providers may offer additional resources including, e.g., data and application resources. Examples of SaaS include Google Apps provided by Google Inc., Salesforce provided by Salesforce.com Inc. of San Francisco, Calif., or Office365 provided by Microsoft Corporation. Examples of SaaS may also include storage providers, e.g. Dropbox provided by Dropbox Inc. of San Francisco, Calif., Microsoft OneDrive provided by Microsoft Corporation, Google Drive provided by Google Inc., or Apple iCloud provided by Apple Inc. of Cupertino, Calif.
  • Clients 102 may access IaaS resources with one or more IaaS standards, including, e.g., Amazon Elastic Compute Cloud (EC2), Open Cloud Computing Interface (OCCI), Cloud Infrastructure Management Interface (CIMI), or OpenStack standards. Some IaaS standards may allow clients access to resources over a Hypertext Transfer Protocol (HTTP) and may use Representational State Transfer (REST) protocol or Simple Object Access Protocol (SOAP). Clients 102 may access PaaS resources with different PaaS interfaces. Some PaaS interfaces use HTTP packages, standard Java APIs, JavaMail API, Java Data Objects (JDO), Java Persistence API (JPA), Python APIs, web integration APIs for different programming languages including, e.g., Rack for Ruby, WSGI for Python, or PSGI for Perl, or other APIs that may be built on REST, HTTP, XML, or other protocols. Clients 102 may access SaaS resources using web-based user interfaces, provided by a web browser (e.g. Google Chrome, Microsoft Internet Explorer, or Mozilla Firefox provided by Mozilla Foundation of Mountain View, Calif.). Clients 102 may also access SaaS resources through smartphone or tablet applications, including e.g., Salesforce Sales Cloud, or Google Drive App. Clients 102 may also access SaaS resources through the client operating system, including e.g. Windows file system for Dropbox.
  • In some embodiments, access to IaaS, PaaS, or SaaS resources may be authenticated. For example, a server or authentication server may authenticate a user via security certificates, HTTPS, or API keys. API keys may use various encryption standards such as, e.g., Advanced Encryption Standard (AES). Data resources may be sent over Transport Layer Security (TLS) or Secure Sockets Layer (SSL).
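  • As a concrete illustration of the authenticated access described above, the following minimal Python sketch builds an HTTPS request that carries an API key in a request header. The endpoint URL, key value, and header name are hypothetical assumptions (header conventions vary by provider), and the request is constructed but not sent:

```python
import urllib.request

# Hypothetical endpoint and API key -- illustrative only, not a real service.
API_ENDPOINT = "https://api.example-cloud.test/v1/instances"
API_KEY = "example-key-123"

def build_authenticated_request(url: str, api_key: str) -> urllib.request.Request:
    """Build an HTTPS GET request carrying an API key header.

    Transport security is provided by the https:// scheme (TLS); the
    header used to carry the key ("X-Api-Key" here) is one common choice.
    """
    return urllib.request.Request(
        url,
        headers={"X-Api-Key": api_key, "Accept": "application/json"},
        method="GET",
    )

request = build_authenticated_request(API_ENDPOINT, API_KEY)
print(request.get_method(), request.full_url)
```

Sending the request (e.g., via `urllib.request.urlopen`) would then negotiate TLS before any resource data is exchanged.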
  • Client 102 and server 106 may be deployed as and/or executed on any type and form of computing device, e.g., a computer, network device or appliance capable of communicating on any type and form of network and performing the operations described herein.
  • FIGS. 1C and 1D depict block diagrams of a computing device 100 useful for practicing an embodiment of client 102 or server 106. As shown in FIGS. 1C and 1D, each computing device 100 includes central processing unit 121 and main memory unit 122. As shown in FIG. 1C, computing device 100 may include storage device 128, installation device 116, network interface 118, I/O controller 123, display devices 124 a-124 n, keyboard 126, and pointing device 127, e.g., a mouse. Storage device 128 may include, without limitation, operating system 129, software 131, and software of security awareness system 120. As shown in FIG. 1D, each computing device 100 may also include additional optional elements, e.g., a memory port 103, bridge 170, one or more input/output devices 130 a-130 n (generally referred to using reference numeral 130), and cache memory 140 in communication with central processing unit 121.
  • Central processing unit 121 is any logic circuitry that responds to and processes instructions fetched from main memory unit 122. In many embodiments, central processing unit 121 is provided by a microprocessor unit, e.g.: those manufactured by Intel Corporation of Mountain View, Calif.; those manufactured by Motorola Corporation of Schaumburg, Ill.; the ARM processor and TEGRA system on a chip (SoC) manufactured by Nvidia of Santa Clara, Calif.; the POWER7 processor manufactured by International Business Machines of White Plains, N.Y.; or those manufactured by Advanced Micro Devices of Sunnyvale, Calif. Computing device 100 may be based on any of these processors, or any other processor capable of operating as described herein. Central processing unit 121 may utilize instruction level parallelism, thread level parallelism, different levels of cache, and multi-core processors. A multi-core processor may include two or more processing units on a single computing component. Examples of multi-core processors include the AMD PHENOM II X2, INTEL CORE i5, and INTEL CORE i7.
  • Main memory unit 122 may include one or more memory chips capable of storing data and allowing any storage location to be directly accessed by microprocessor 121. Main memory unit 122 may be volatile and faster than storage 128 memory. Main memory units 122 may be Dynamic Random-Access Memory (DRAM) or any variants, including Static Random-Access Memory (SRAM), Burst SRAM or SynchBurst SRAM (BSRAM), Fast Page Mode DRAM (FPM DRAM), Enhanced DRAM (EDRAM), Extended Data Output RAM (EDO RAM), Extended Data Output DRAM (EDO DRAM), Burst Extended Data Output DRAM (BEDO DRAM), Single Data Rate Synchronous DRAM (SDR SDRAM), Double Data Rate SDRAM (DDR SDRAM), Direct Rambus DRAM (DRDRAM), or Extreme Data Rate DRAM (XDR DRAM). In some embodiments, main memory 122 or storage 128 may be non-volatile; e.g., non-volatile random-access memory (NVRAM), flash memory, non-volatile static RAM (nvSRAM), Ferroelectric RAM (FeRAM), Magnetoresistive RAM (MRAM), Phase-change memory (PRAM), conductive-bridging RAM (CBRAM), Silicon-Oxide-Nitride-Oxide-Silicon (SONOS), Resistive RAM (RRAM), Racetrack, Nano-RAM (NRAM), or Millipede memory. Main memory 122 may be based on any of the above described memory chips, or any other available memory chips capable of operating as described herein. In the embodiment shown in FIG. 1C, the processor 121 communicates with main memory 122 via system bus 150 (described in more detail below). FIG. 1D depicts an embodiment of computing device 100 in which the processor communicates directly with main memory 122 via memory port 103. For example, in FIG. 1D main memory 122 may be DRDRAM.
  • FIG. 1D depicts an embodiment in which the main processor 121 communicates directly with cache memory 140 via a secondary bus, sometimes referred to as a backside bus. In other embodiments, main processor 121 communicates with cache memory 140 using system bus 150. Cache memory 140 typically has a faster response time than main memory 122 and is typically provided by SRAM, BSRAM, or EDRAM. In the embodiment shown in FIG. 1D, the processor 121 communicates with various I/O devices 130 via local system bus 150. Various buses may be used to connect central processing unit 121 to any of I/O devices 130, including a PCI bus, a PCI-X bus, a PCI-Express bus, or a NuBus. For embodiments in which the I/O device is video display 124, the processor 121 may use an Advanced Graphic Port (AGP) to communicate with display 124 or the I/O controller 123 for display 124. FIG. 1D depicts an embodiment of computer 100 in which main processor 121 communicates directly with I/O device 130 b or other processors 121′ via HYPERTRANSPORT, RAPIDIO, or INFINIBAND communications technology. FIG. 1D also depicts an embodiment in which local busses and direct communication are mixed: the processor 121 communicates with I/O device 130 a using a local interconnect bus while communicating with I/O device 130 b directly.
  • A wide variety of I/O devices 130 a-130 n may be present in computing device 100. Input devices may include keyboards, mice, trackpads, trackballs, touchpads, touch mice, multi-touch touchpads and touch mice, microphones, multi-array microphones, drawing tablets, cameras, single-lens reflex cameras (SLR), digital SLR (DSLR), CMOS sensors, accelerometers, infrared optical sensors, pressure sensors, magnetometer sensors, angular rate sensors, depth sensors, proximity sensors, ambient light sensors, gyroscopic sensors, or other sensors. Output devices may include video displays, graphical displays, speakers, headphones, inkjet printers, laser printers, and 3D printers.
  • Devices 130 a-130 n may include a combination of multiple input or output devices, including, e.g., Microsoft KINECT, Nintendo Wiimote for the WII, Nintendo WII U GAMEPAD, or Apple iPhone. Some devices 130 a-130 n allow gesture recognition inputs through combining some of the inputs and outputs. Some devices 130 a-130 n provide for facial recognition which may be utilized as an input for different purposes including authentication and other commands. Some devices 130 a-130 n provide for voice recognition and inputs, including, e.g., Microsoft KINECT, SIRI for iPhone by Apple, Google Now or Google Voice Search, and Alexa by Amazon.
  • Additional devices 130 a-130 n have both input and output capabilities, including, e.g., haptic feedback devices, touchscreen displays, or multi-touch displays. Touchscreens, multi-touch displays, touchpads, touch mice, or other touch sensing devices may use different technologies to sense touch, including, e.g., capacitive, surface capacitive, projected capacitive touch (PCT), in cell capacitive, resistive, infrared, waveguide, dispersive signal touch (DST), in-cell optical, surface acoustic wave (SAW), bending wave touch (BWT), or force-based sensing technologies. Some multi-touch devices may allow two or more contact points with the surface, allowing advanced functionality including, e.g., pinch, spread, rotate, scroll, or other gestures. Some touchscreen devices, including, e.g., Microsoft PIXELSENSE or Multi-Touch Collaboration Wall, may have larger surfaces, such as on a table-top or on a wall, and may also interact with other electronic devices. Some I/O devices 130 a-130 n, display devices 124 a-124 n or group of devices may be augmented reality devices. The I/O devices may be controlled by I/O controller 123 as shown in FIG. 1C. The I/O controller may control one or more I/O devices, such as, e.g., keyboard 126 and pointing device 127, e.g., a mouse or optical pen. Furthermore, an I/O device may also provide storage and/or installation medium 116 for computing device 100. In still other embodiments, computing device 100 may provide USB connections (not shown) to receive handheld USB storage devices. In further embodiments, an I/O device 130 may be a bridge between the system bus 150 and an external communication bus, e.g. a USB bus, a SCSI bus, a FireWire bus, an Ethernet bus, a Gigabit Ethernet bus, a Fiber Channel bus, or a Thunderbolt bus.
  • In some embodiments, display devices 124 a-124 n may be connected to I/O controller 123. Display devices may include, e.g., liquid crystal displays (LCD), thin film transistor LCD (TFT-LCD), blue phase LCD, electronic paper (e-ink) displays, flexible displays, light emitting diode displays (LED), digital light processing (DLP) displays, liquid crystal on silicon (LCOS) displays, organic light-emitting diode (OLED) displays, active-matrix organic light-emitting diode (AMOLED) displays, liquid crystal laser displays, time-multiplexed optical shutter (TMOS) displays, or 3D displays. Examples of 3D displays may use, e.g. stereoscopy, polarization filters, active shutters, or auto stereoscopy. Display devices 124 a-124 n may also be a head-mounted display (HMD). In some embodiments, display devices 124 a-124 n or the corresponding I/O controllers 123 may be controlled through or have hardware support for OPENGL or DIRECTX API or other graphics libraries.
  • In some embodiments, computing device 100 may include or connect to multiple display devices 124 a-124 n, which each may be of the same or different type and/or form. As such, any of I/O devices 130 a-130 n and/or the I/O controller 123 may include any type and/or form of suitable hardware, software, or combination of hardware and software to support, enable or provide for the connection and use of multiple display devices 124 a-124 n by computing device 100. For example, computing device 100 may include any type and/or form of video adapter, video card, driver, and/or library to interface, communicate, connect, or otherwise use display devices 124 a-124 n. In one embodiment, a video adapter may include multiple connectors to interface to multiple display devices 124 a-124 n. In other embodiments, computing device 100 may include multiple video adapters, with each video adapter connected to one or more of display devices 124 a-124 n. In some embodiments, any portion of the operating system of computing device 100 may be configured for using multiple displays 124 a-124 n. In other embodiments, one or more of the display devices 124 a-124 n may be provided by one or more other computing devices 100 a or 100 b connected to computing device 100, via network 104. In some embodiments, software may be designed and constructed to use another computer's display device as second display device 124 a for computing device 100. For example, in one embodiment, an Apple iPad may connect to computing device 100 and use the display of the device 100 as an additional display screen that may be used as an extended desktop. One ordinarily skilled in the art will recognize and appreciate the various ways and embodiments that computing device 100 may be configured to have multiple display devices 124 a-124 n.
  • Referring again to FIG. 1C, computing device 100 may comprise storage device 128 (e.g. one or more hard disk drives or redundant arrays of independent disks) for storing an operating system or other related software, and for storing application software programs such as any program related to security awareness system 120. Examples of storage device 128 include, e.g., hard disk drive (HDD); optical drive including CD drive, DVD drive, or BLU-RAY drive; solid-state drive (SSD); USB flash drive; or any other device suitable for storing data. Some storage devices may include multiple volatile and non-volatile memories, including, e.g., solid state hybrid drives that combine hard disks with solid state cache. Some storage devices 128 may be non-volatile, mutable, or read-only. Some storage devices 128 may be internal and connect to computing device 100 via bus 150. Some storage devices 128 may be external and connect to computing device 100 via an I/O device 130 that provides an external bus. Some storage devices 128 may connect to computing device 100 via network interface 118 over network 104, including, e.g., the Remote Disk for MACBOOK AIR by Apple. Some client devices 100 may not require a non-volatile storage device 128 and may be thin clients or zero clients 102. Some storage devices 128 may also be used as an installation device 116 and may be suitable for installing software and programs. Additionally, the operating system and the software can be run from a bootable medium, for example, a bootable CD, e.g. KNOPPIX, a bootable CD for GNU/Linux that is available as a GNU/Linux distribution from knoppix.net.
  • Computing device 100 (e.g., client device 102) may also install software or applications from an application distribution platform. Examples of application distribution platforms include the App Store for iOS provided by Apple, Inc., the Mac App Store provided by Apple, Inc., GOOGLE PLAY for Android OS provided by Google Inc., Chrome Webstore for CHROME OS provided by Google Inc., and Amazon Appstore for Android OS and KINDLE FIRE provided by Amazon.com, Inc. An application distribution platform may facilitate installation of software on client device 102. An application distribution platform may include a repository of applications on server 106 or cloud 108, which clients 102 a-102 n may access over a network 104. An application distribution platform may include applications developed and provided by various developers. A user of client device 102 may select, purchase and/or download an application via the application distribution platform.
  • Furthermore, computing device 100 may include a network interface 118 to interface to network 104 through a variety of connections including, but not limited to, standard telephone lines, LAN or WAN links (e.g., 802.11, T1, T3, Gigabit Ethernet, InfiniBand), broadband connections (e.g., ISDN, Frame Relay, ATM, Gigabit Ethernet, Ethernet-over-SONET, ADSL, VDSL, BPON, GPON, fiber optical including FiOS), wireless connections, or some combination of any or all of the above. Connections can be established using a variety of communication protocols (e.g., TCP/IP, Ethernet, ARCNET, SONET, SDH, Fiber Distributed Data Interface (FDDI), IEEE 802.11a/b/g/n/ac, CDMA, GSM, WiMAX, and direct asynchronous connections). In one embodiment, computing device 100 communicates with other computing devices 100′ via any type and/or form of gateway or tunneling protocol, e.g. Secure Socket Layer (SSL) or Transport Layer Security (TLS), or the Citrix Gateway Protocol manufactured by Citrix Systems, Inc. Network interface 118 may comprise a built-in network adapter, network interface card, PCMCIA network card, EXPRESSCARD network card, card bus network adapter, wireless network adapter, USB network adapter, modem or any other device suitable for interfacing computing device 100 to any type of network capable of communication and performing the operations described herein.
  • Computing device 100 of the sort depicted in FIGS. 1C and 1D may operate under the control of an operating system, which controls scheduling of tasks and access to system resources. Computing device 100 can be running any operating system such as any of the versions of the MICROSOFT WINDOWS operating systems, the different releases of the Unix and Linux operating systems, any version of the MAC OS for Macintosh computers, any embedded operating system, any real-time operating system, any open source operating system, any proprietary operating system, any operating systems for mobile computing devices, or any other operating system capable of running on the computing device and performing the operations described herein. Typical operating systems include, but are not limited to: WINDOWS 2000, WINDOWS Server 2012, WINDOWS CE, WINDOWS Phone, WINDOWS XP, WINDOWS VISTA, WINDOWS 7, WINDOWS RT, WINDOWS 8 and WINDOWS 10, all of which are manufactured by Microsoft Corporation of Redmond, Wash.; MAC OS and iOS, manufactured by Apple, Inc.; and Linux, a freely-available operating system, e.g. Linux Mint distribution (“distro”) or Ubuntu, distributed by Canonical Ltd. of London, United Kingdom; or Unix or other Unix-like derivative operating systems; and Android, designed by Google Inc., among others. Some operating systems, including, e.g., the CHROME OS by Google Inc., may be used on zero clients or thin clients, including, e.g., CHROMEBOOKS.
  • Computer system 100 can be any workstation, telephone, desktop computer, laptop or notebook computer, netbook, ULTRABOOK, tablet, server, handheld computer, mobile telephone, smartphone or other portable telecommunications device, media playing device, a gaming system, mobile computing device, or any other type and/or form of computing, telecommunications or media device that is capable of communication. Computer system 100 has sufficient processor power and memory capacity to perform the operations described herein. In some embodiments, computing device 100 may have different processors, operating systems, and input devices consistent with the device. The Samsung GALAXY smartphones, e.g., operate under the control of the Android operating system developed by Google, Inc. GALAXY smartphones receive input via a touch interface.
  • In some embodiments, computing device 100 is a gaming system. For example, the computer system 100 may comprise a PLAYSTATION 3, PLAYSTATION PORTABLE (PSP), PLAYSTATION VITA, PLAYSTATION 4, or a PLAYSTATION 4 PRO device manufactured by the Sony Corporation of Tokyo, Japan, or a NINTENDO DS, NINTENDO 3DS, NINTENDO WII, NINTENDO WII U, or a NINTENDO SWITCH device manufactured by Nintendo Co., Ltd., of Kyoto, Japan, or an XBOX 360 device manufactured by Microsoft Corporation.
  • In some embodiments, computing device 100 is a digital audio player such as the Apple IPOD, IPOD Touch, and IPOD NANO lines of devices, manufactured by Apple Computer of Cupertino, Calif. Some digital audio players may have other functionality, including, e.g., a gaming system or any functionality made available by an application from a digital application distribution platform. For example, the IPOD Touch may access the Apple App Store. In some embodiments, computing device 100 is a portable media player or digital audio player supporting file formats including, but not limited to, MP3, WAV, M4A/AAC, WMA Protected AAC, AIFF, Audible audiobook, Apple Lossless audio file formats and .mov, .m4v, and .mp4 MPEG-4 (H.264/MPEG-4 AVC) video file formats.
  • In some embodiments, computing device 100 is a tablet, e.g. the IPAD line of devices by Apple; GALAXY TAB family of devices by Samsung; or KINDLE FIRE by Amazon.com, Inc. of Seattle, Wash. In other embodiments, computing device 100 is an eBook reader, e.g. the KINDLE family of devices by Amazon.com, or NOOK family of devices by Barnes & Noble, Inc. of New York City, N.Y.
  • In some embodiments, communications device 102 includes a combination of devices, e.g. a smartphone combined with a digital audio player or portable media player. For example, one of these embodiments is a smartphone, e.g. the iPhone family of smartphones manufactured by Apple, Inc.; a Samsung GALAXY family of smartphones manufactured by Samsung, Inc.; or a Motorola DROID family of smartphones. In yet another embodiment, communications device 102 is a laptop or desktop computer equipped with a web browser and a microphone and speaker system, e.g. a telephony headset. In these embodiments, communications devices 102 are web-enabled and can receive and initiate phone calls. In some embodiments, a laptop or desktop computer is also equipped with a webcam or other video capture device that enables video chat and video calls.
  • In some embodiments, the status of one or more machines 102, 106 in network 104 is monitored, generally as part of network management. In one of these embodiments, the status of a machine may include an identification of load information (e.g., the number of processes on the machine, CPU, and memory utilization), of port information (e.g., the number of available communication ports and the port addresses), or of session status (e.g., the duration and type of processes, and whether a process is active or idle). In another of these embodiments, the information may be identified by a plurality of metrics, and the plurality of metrics can be applied at least in part towards decisions in load distribution, network traffic management, and network failure recovery as well as any aspects of operations of the present solution described herein. Aspects of the operating environments and components described above will become apparent in the context of the systems and methods disclosed herein.
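  • The monitored status metrics described above can be applied to a load-distribution decision. The following Python sketch combines per-machine metrics into a single load score and selects the least-loaded server; the metric names and weights are illustrative assumptions, not part of the present solution:

```python
# Minimal sketch: combine per-machine status metrics into a load score and
# pick the least-loaded server. Metric names and weights are assumptions.

def load_score(status: dict) -> float:
    """Combine load metrics into a single score; lower means less loaded."""
    return (0.5 * status["cpu_utilization"]          # fraction 0..1
            + 0.3 * status["memory_utilization"]      # fraction 0..1
            + 0.2 * min(status["process_count"] / 500.0, 1.0))

def pick_server(statuses: dict) -> str:
    """Choose the machine with the lowest combined load score."""
    return min(statuses, key=lambda name: load_score(statuses[name]))

statuses = {
    "server-a": {"cpu_utilization": 0.80, "memory_utilization": 0.60, "process_count": 300},
    "server-b": {"cpu_utilization": 0.20, "memory_utilization": 0.40, "process_count": 120},
}
print(pick_server(statuses))  # server-b has the lower combined load score
```

The same per-machine metrics could equally drive network traffic management or failure recovery, e.g., by excluding machines whose score exceeds a threshold.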
  • B. Systems and Methods for Improving Assessment of Security Risk Based on Personal Internet Account Data
  • The following describes systems and methods for improving assessment of security risks that users pose to an organization based on their personal internet account data.
  • An organization does not usually have access to information related to a user's personal domain, such as a user's personal email address. Also, the organization may not have any measure to detect the security awareness behavior of users in their personal domain. As a result, the full security risk that users pose to the organization may not be known to the organization. Furthermore, even if the organization has access to personal information of the users, it would most likely be available only within a particular department, for example, a Human Resources (HR) department, and not available to any other department within the organization. In an example, the personal information of the users may not be available to a system administrator or an Information Technology (IT) department of the organization. The system administrator may be a professional (or a team of professionals) managing organizational cybersecurity aspects. The system administrator may oversee and manage IT systems of the organization.
  • Also, a significant number of organization email breaches occur when users use their organization email address for registrations on websites in their personal domain. In an example, users may use their organization email address to register with entertainment and media websites, for example, for gaming, travel, booking restaurants, and other personal purposes. Such usage exposes the users' organization email address to risk, and it may be subjected to attacks or may become compromised. Accordingly, when a user registers for a website in a personal domain using organization login credentials (e.g., one or more of an email address, a username, or a password), the user may unintentionally expose the organization login credentials to a risk of hijacking that may provide an attacker with access to organizational data.
  • FIG. 2 depicts an architecture of an implementation of system 200 for determining a personal risk score of a user of an organization based on personal information of the user, according to some embodiments.
  • System 200 may include security awareness system 202, user device 204, email server 206, one or more breach databases 208 1-N, and network 210 enabling communication between the system components for information exchange. Network 210 may be an example or instance of network 104, details of which are provided with reference to FIG. 1A and its accompanying description.
  • According to some embodiments, security awareness system 202 may be implemented in a variety of computing systems, such as a mainframe computer, a server, a network server, a laptop computer, a desktop computer, a notebook, a workstation, and any other computing system. In an implementation, security awareness system 202 may be implemented in a server, such as server 106 shown in FIG. 1A. In some implementations, security awareness system 202 may be implemented by a device, such as computing device 100 shown in FIGS. 1C and 1D. In some embodiments, security awareness system 202 may be implemented across a server cluster, whereby tasks performed by security awareness system 202 may be performed by a plurality of servers. These tasks may be allocated among the server cluster by an application, a service, a daemon, a routine, or other executable logic for task allocation.
  • In one or more embodiments, security awareness system 202 may facilitate cybersecurity awareness training, for example, via simulated phishing campaigns, computer-based trainings, remedial trainings, and risk score generation and tracking. A simulated phishing campaign is a technique of testing a user to see whether the user is likely to recognize a true malicious phishing attack and act appropriately upon receiving the malicious phishing attack. In some embodiments, the user may be an employee of the organization, a customer, a vendor, or anyone associated with the organization. In some embodiments, the user may be an end-customer/consumer or a patron using the goods and/or services of the organization. In an implementation, security awareness system 202 may execute the simulated phishing campaign by sending out one or more simulated phishing messages periodically or occasionally to the users and observe responses of the users to such simulated phishing messages. A simulated phishing message may mimic a real phishing message and appear genuine to entice a user to respond/interact with the simulated phishing message. The simulated phishing message may include links, attachments, macros, or any other simulated phishing threat that resembles a real phishing threat. In response to a user interaction with the simulated phishing message, for example, if the user clicks on a link (i.e., a simulated phishing link), the user may be provided with security awareness training. If and how the user interacts with the simulated phishing message may be logged and may impact a risk score of the user, a risk score of a team of which the user is a part, a risk score of the user's organization, and/or a risk score of an industry to which the user's organization belongs.
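  • The logging of interactions and their effect on user and team risk scores might be sketched as follows in Python. The event names, weights, and averaging scheme are hypothetical assumptions for illustration; the patent does not specify a particular scoring formula:

```python
# Illustrative sketch: log user interactions with simulated phishing
# messages and roll them up into user and team risk scores.
# Event weights and aggregation are assumptions, not the claimed method.
from collections import defaultdict

EVENT_WEIGHTS = {
    "clicked_link": 5.0,       # user clicked a simulated phishing link
    "opened_attachment": 8.0,  # user opened a simulated attachment
    "reported": -3.0,          # user reported the message (reduces risk)
    "no_action": 0.0,
}

class CampaignLog:
    def __init__(self):
        self.user_scores = defaultdict(float)
        self.teams = {}  # user -> team name

    def add_user(self, user: str, team: str):
        self.teams[user] = team

    def record(self, user: str, event: str):
        """Log an interaction and update the user's risk score (floored at 0)."""
        self.user_scores[user] = max(0.0, self.user_scores[user] + EVENT_WEIGHTS[event])

    def team_score(self, team: str) -> float:
        """Average risk score across the team's members."""
        members = [u for u, t in self.teams.items() if t == team]
        return sum(self.user_scores[u] for u in members) / len(members)

log = CampaignLog()
log.add_user("alice", "finance")
log.add_user("bob", "finance")
log.record("alice", "clicked_link")
log.record("bob", "reported")
print(log.user_scores["alice"], log.team_score("finance"))
```

Organization-level and industry-level scores could be aggregated from team scores in the same manner.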
  • In some implementations, security awareness system 202 may be owned or managed or otherwise associated with an organization or any entity authorized thereof. In an implementation, security awareness system 202 may be managed by a system administrator. The system administrator may oversee and manage security awareness system 202 to ensure cybersecurity goals of the organization are met. For example, the system administrator may oversee Information Technology (IT) systems of the organization for managing simulated phishing campaigns and any other element within security awareness system 202. In an example, security awareness system 202 may be a Computer Based Security Awareness Training (CBSAT) system that performs security services such as performing simulated phishing campaigns on a user or a set of users of an organization as a part of security awareness training.
  • Referring again to FIG. 2, in some embodiments, user device 204 may be any device used by the user. User device 204, as disclosed, may be any computing device, such as a desktop computer, a laptop, a tablet computer, a mobile device, a Personal Digital Assistant (PDA) or any other computing device. In an implementation, user device 204 may be a device, such as client device 102 shown in FIG. 1A and FIG. 1B. User device 204 may be implemented by a device, such as computing device 100 shown in FIG. 1C and FIG. 1D.
  • Further, email server 206 may be any server capable of handling and delivering emails over network 210 using one or more standard email protocols, such as Post Office Protocol 3 (POP3), Internet Message Access Protocol (IMAP), Simple Mail Transfer Protocol (SMTP), and Multipurpose Internet Mail Extension (MIME) Protocol. Email server 206 may be a standalone server or a part of an organization server. Email server 206 may be implemented using, for example, Microsoft® Exchange Server, or HCL Domino®. In an implementation, email server 206 may be server 106 shown in FIG. 1A. Email server 206 may be implemented by a device, such as computing device 100 shown in FIGS. 1C and 1D. In some embodiments, email server 206 may be implemented as a part of a server cluster. In some embodiments, email server 206 may be implemented across a plurality of servers, whereby tasks performed by email server 206 may be performed by the plurality of servers. These tasks may be allocated among the server cluster by an application, a service, a daemon, a routine, or other executable logic for task allocation.
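  • The MIME-formatted messages such a server delivers over SMTP can be composed with Python's standard library, as the following sketch shows. The addresses and subject are hypothetical, and actual delivery (e.g., via `smtplib.SMTP`) is omitted:

```python
# Sketch: compose a MIME email of the kind an email server handles over
# SMTP. Addresses are hypothetical; delivery via smtplib is omitted.
from email.message import EmailMessage

def compose_message(sender: str, recipient: str, subject: str, body: str) -> EmailMessage:
    """Build a text/plain MIME message ready to hand to an SMTP server."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = subject
    msg.set_content(body)  # sets a text/plain MIME body
    return msg

msg = compose_message("it-team@example.com", "user@example.com",
                      "Quarterly security reminder",
                      "Please complete your training.")
print(msg["Subject"], msg.get_content_type())
```

Handing `msg` to `smtplib.SMTP(...).send_message(msg)` would then deliver it over SMTP.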
• According to some embodiments, one or more breach databases 208 1-N may be dynamic databases that include public databases and/or private databases. One or more breach databases 208 1-N may include information related to user login credentials of websites which have been breached. Examples of user login credentials may include a username, an email address and/or a password. A username is a unique combination of characters, such as letters of the alphabet and/or numbers and/or non-alphanumeric symbols, that identifies a specific user. The user may gain access to a website using the user login credentials. In an example implementation, security awareness system 202 may determine whether user login credentials including a username and/or an email address is/are associated with a data breach if the username and/or the email address is found in one or more breach databases 208 1-N. In some embodiments, information related to the user login credentials of the users stored in one or more breach databases 208 1-N may be periodically or dynamically updated as required.
• According to some embodiments, security awareness system 202 may include processor 212 and memory 214. For example, processor 212 and memory 214 of security awareness system 202 may be CPU 121 and main memory 122, respectively, as shown in FIGS. 1C and 1D. According to an embodiment, security awareness system 202 may include verification unit 216, exposure check unit 218, security audit unit 220, risk score calculator 222, remediation unit 224, and detection unit 226. In an implementation, verification unit 216, exposure check unit 218, security audit unit 220, risk score calculator 222, remediation unit 224, and detection unit 226 may be coupled to processor 212 and memory 214. In some embodiments, verification unit 216, exposure check unit 218, security audit unit 220, risk score calculator 222, remediation unit 224, and detection unit 226, amongst other units, may include routines, programs, objects, components, data structures, etc., which may perform particular tasks or implement particular abstract data types. Verification unit 216, exposure check unit 218, security audit unit 220, risk score calculator 222, remediation unit 224, and detection unit 226 may also be implemented as signal processor(s), state machine(s), logic circuitries, and/or any other device or component that manipulates signals based on operational instructions.
  • In some embodiments, verification unit 216, exposure check unit 218, security audit unit 220, risk score calculator 222, remediation unit 224, and detection unit 226 may be implemented in hardware, instructions executed by a processing unit, or by a combination thereof. The processing unit may comprise a computer, a processor, a state machine, a logic array, or any other suitable devices capable of processing instructions. The processing unit may be a general-purpose processor that executes instructions to cause the general-purpose processor to perform the required tasks or, the processing unit may be dedicated to performing the required functions. In some embodiments, verification unit 216, exposure check unit 218, security audit unit 220, risk score calculator 222, remediation unit 224, and detection unit 226 may be machine-readable instructions which, when executed by a processor/processing unit, perform any of the desired functionalities. The machine-readable instructions may be stored on an electronic memory device, hard disk, optical disk, or other machine-readable storage medium or non-transitory medium. In an implementation, the machine-readable instructions may also be downloaded to the storage medium via a network connection. In an example, the machine-readable instructions may be stored in memory 214.
• In some embodiments, security awareness system 202 may include password storage 228, user personal information storage 230, risk score storage 232, and website information storage 234. In an implementation, password storage 228 may include information about organization passwords of the users. User personal information storage 230 may store information related to personal accounts of the users. In an example, information related to personal accounts of the users may include personal email addresses, usernames, passwords, and/or other information from the users' personal domain. According to an implementation, the users may voluntarily provide the information related to their personal accounts. In an example, user personal information storage 230 may also store information about past or previous organizations of the user, such as previous organization email addresses. Further, risk score storage 232 may include security awareness profiles of the users and risk scores of the users (in some examples, the security awareness profile of the user may include a risk score of the user). In an example, a security awareness profile of a user may include information about the security awareness of the user and other information which is relevant for assessing the security awareness of the user. A risk score of a user may include a representation of the susceptibility of the user to a malicious attack. Also, the risk score for a user may quantify a cybersecurity risk that the user poses to an organization. The risk score may also quantify the level of risk for a group of users, the organization, an industry to which the organization belongs, a geography, and any other categorization. 
In an example, the risk score of the user may be modified based on the user's responses to simulated phishing messages, assessed user behavior, breached user information, completion of training by the user, a current position of the user in the organization, a size of a network of the user, an amount of time the user has held the current position in the organization, and/or any other attribute that can be associated with the user. In an implementation, a higher risk score of the user indicates that a higher security risk is associated with the user and a lower risk score indicates a lower security risk and better security awareness.
  • According to an implementation, website information storage 234 may store information related to personal domain websites. In an example, website information storage 234 may store information about login pages for known or popular personal domain websites, email addresses associated with messages sent by those websites, or examples of registration email validation messages and promotional messages used by personal domain websites. In an example, a registration email validation message is a message that is sent by a website (for example, a personal domain website) to an email address that a user input when registering with the website. The registration email validation message may include a link for the user to click to validate that he or she is the owner of the email address and may include keywords such as “verify your email address,” “confirm your email,” and “activate your account.” Further, in an example, a promotional message may be an email message such as a weekly newsletter, a sale promotion email, and other promotional messages that a business distributes to promote their products, services, offers, campaigns, etc. In an example, the promotional messages from various personal domain websites may have specific characteristics in common. In an example, the promotional messages may include an unsubscribe link (for example, “Click here to unsubscribe from these emails”), discount details (for example, “up to 40% off”) and other such characteristics. Further, promotional messages may include keywords such as “discount,” “unsubscribe,” “sale,” “offer,” and “hurry.”
  • Information about organization passwords of the users stored in password storage 228, personal information related to the users stored in user personal information storage 230, security awareness profiles of the users and risk scores of the users stored in risk score storage 232, and information about personal websites stored in website information storage 234 may be periodically or dynamically updated as required.
  • Referring again to FIG. 2 , in some embodiments, user device 204 may be any device used by a user. The user may be an employee of an organization or any entity. According to some embodiments, user device 204 may include processor 236 and memory 238. In an example, processor 236 and memory 238 of user device 204 may be CPU 121 and main memory 122, respectively, as shown in FIGS. 1C and 1D. User device 204 may also include user interface 240 such as a keyboard, a mouse, a touch screen, a haptic sensor, voice-based input unit, or any other appropriate user interface. It shall be appreciated that such components of user device 204 may correspond to similar components of computing device 100 in FIGS. 1C and 1D, such as keyboard 126, pointing device 127, I/O devices 130 a-n and display devices 124 a-n. User device 204 may also include display 242, such as a screen, a monitor connected to the device in any manner, or any other appropriate display. In an implementation, user device 204 may display received content (for example, emails) for the user using display 242 and is able to accept user interaction via user interface 240 responsive to the displayed content.
• In some embodiments, user device 204 may include email client 244. In one example implementation, email client 244 may be an application installed on user device 204. In another example implementation, email client 244 may be an application that can be accessed over network 210 through a browser without needing to be installed on user device 204. In an implementation, email client 244 may be any application capable of composing, sending, receiving, and reading email messages. For example, email client 244 may be an instance of an application, such as Microsoft Outlook™ application, HCL Notes application, IBM® Lotus Notes® application, Apple® Mail application, Gmail® application, or any other known or custom email application. In an example, a user of user device 204 may be mandated to download and install email client 244 by the organization. In another example, email client 244 may be provided by the organization as default. In some examples, a user of user device 204 may select, purchase and/or download email client 244, through for example, an application distribution platform. The term “application” as used herein may refer to one or more applications, services, routines, or other executable logic or instructions.
• According to one or more embodiments, a user of the organization may be aware that his or her actions outside the organization may affect the security of the organization. In some instances, the user may be willing to improve his or her security awareness behavior. In an implementation, the user, or another person, for example, the user's manager in the organization, may place a request to security awareness system 202 for registration of the user's personal information. In examples, a user, a user's manager, a system administrator or a security awareness system may initiate a request from security awareness system 202 to the user's device 204 causing the user's device 204 to display a request to the user, wherein the request to the user requests that the user register personal information with the security awareness system or the request to the user requests that the user send personal information to the security awareness system. In one example, the user may initiate the request by opting in for personal information registration. In an implementation, security awareness system 202 may receive a request for registration of the personal information of the user, for example from a user's manager or a system administrator or other company official. In some implementations, security awareness system 202 may initiate the request for registration of the personal information of the user on behalf of the user. In some implementations, security awareness system 202 may prompt or provide an option to the user to register their personal information. According to an example, a user may provide personal information to security awareness system 202 voluntarily or in response to a prompt. Security awareness system 202 may receive the personal information of the user, for example, one or more of a personal email address, a username, and a personal password or a password associated with a personal domain. 
In examples, “registration” of personal information of the user at the security awareness system 202 is considered performed merely by the security awareness system 202 receiving personal information of the user.
  • In order to verify that the personal information provided by the user or for the user belongs to the user, security awareness system 202 may initiate verification of the personal information through verification unit 216. In a scenario where the user's personal information includes an email address, verification unit 216 may verify that the personal email address is within the personal domain of the user. According to an implementation, verification unit 216 may send a confirmation email to the personal email address provided by the user or for the user to verify whether the personal email address is controlled by the user. For example, a confirmation email may include a one-time code for the user to input into security awareness system 202 confirming access to the personal email address. In examples, the one-time code may be valid for only one login session or transaction or may be valid only for a short period of time. In an example, the one-time code may be valid for a period of time, such as 2 minutes. In some examples, the confirmation email may include a time-sensitive link that has to be followed or clicked within a specified time period (for example, 15 minutes) to complete the verification process. In some examples, the user may input the one-time code into security awareness system 202 or follow the provided link within the specified time period to validate the ownership of, or access to the personal email address. The input of the correct one-time code or following of the supplied link within the specified time period enables verification unit 216 to verify that the personal email address provided for or by the user and/or the personal information provided for or by the user is owned or controlled by the user. Accordingly, verification unit 216 prevents a user from, for example, registering with another user's email address. 
Other methods to confirm or verify that personal information registered with or sent to security awareness system 202 belongs to the user that are not discussed here are contemplated herein.
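The one-time-code verification flow described above can be sketched as follows. This is a minimal illustration, not the implementation claimed here: the function names, the in-memory pending-code table, and the single-use semantics are assumptions, while the 120-second validity window follows the 2-minute example above.

```python
import hmac
import secrets
import time

CODE_TTL_SECONDS = 120  # e.g., the 2-minute validity window mentioned above

_pending = {}  # email -> (code, time issued); illustrative in-memory store

def issue_code(email: str) -> str:
    """Generate and remember a 6-digit one-time code for the address."""
    code = f"{secrets.randbelow(10**6):06d}"
    _pending[email] = (code, time.monotonic())
    return code

def verify_code(email: str, supplied: str) -> bool:
    """True only if the code matches and is still within its window."""
    entry = _pending.pop(email, None)  # single use: code is discarded here
    if entry is None:
        return False
    code, issued_at = entry
    if time.monotonic() - issued_at > CODE_TTL_SECONDS:
        return False
    return hmac.compare_digest(code, supplied)
```

Because the pending entry is removed on first use, a second attempt with the same code fails even within the validity window, which matches the "valid for only one login session or transaction" behavior described above.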
  • According to some embodiments, verification unit 216 may be configured to register and store personal information of the user in response to verification that the personal email address provided for or by the user is within the personal domain of the user. In another embodiment, verification unit 216 may register and store personal information provided for or by the user without verifying that the personal information belongs to the user. Verification unit 216 may be configured to store personal information of the user in user personal information storage 230. In an implementation, for security and privacy reasons and/or to comply with respective privacy laws of corresponding countries, verification unit 216 may store personal information of the user in an encrypted, hashed, or obfuscated form. In some examples, verification unit 216 may store personal information of the user in a plain text or non-encrypted form. In some implementations, verification unit 216 may establish a link between a security awareness profile of the user and personal information of the user.
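Storing personal information in hashed form, as described above, might be sketched along these lines. The salted SHA-256 scheme and function names are illustrative assumptions; a production system would typically prefer a dedicated password-hashing function with a configurable work factor.

```python
import hashlib
import os
from typing import Optional

def hash_value(value: str, salt: Optional[bytes] = None) -> tuple:
    """Return a (salt, digest) pair for a personal value; plain text is not stored."""
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.sha256(salt + value.encode("utf-8")).digest()
    return salt, digest

def matches(value: str, salt: bytes, digest: bytes) -> bool:
    """Check a candidate value against a stored (salt, digest) pair."""
    return hashlib.sha256(salt + value.encode("utf-8")).digest() == digest
```

A per-record random salt means two users registering the same value produce different digests, which limits what can be inferred from the stored data alone.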
  • After personal information of the user is registered and/or received and/or stored, verification unit 216 may be configured to notify the system administrator that the user has provided or sent personal information to security awareness system 202 or has registered personal information with security awareness system 202. In some implementations, verification unit 216 may provide levels of visibility of personal information of a user to the system administrator. In an example, personal information of one or more users may be fully visible to the system administrator. In some examples, personal information of the one or more users may be partially obscured. For example, an email address “user08@gmail.com” may be displayed to the system administrator as u*****@*****.com or another combination of obfuscated and actual characters. In some examples, the information may be unavailable to the system administrator. In some scenarios, the system administrator may be given an indication confirming that the user has registered his or her personal information with security awareness system 202. Further, in some examples, the personal information may be unavailable to the system administrator, and the system administrator is not notified when the personal information is provided by the user. In some embodiments, access to passwords entered into security awareness system 202 may be restricted to non-humans, such as Artificial Intelligence (AI), operating logic, and other processing systems.
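The partial obscuring described above (for example, “user08@gmail.com” shown as u*****@*****.com) might be implemented as follows; the exact masking rule, keeping the first character of the local part and the top-level domain, is an assumption drawn from that example.

```python
def obscure_email(address: str) -> str:
    """Partially obscure an email address for administrator display."""
    local, _, domain = address.partition("@")
    _, _, tld = domain.rpartition(".")  # keep only the top-level domain
    return f"{local[:1]}*****@*****.{tld}"

# obscure_email("user08@gmail.com") -> "u*****@*****.com"
```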
• According to some embodiments, security awareness system 202 may perform an exposure check and/or a security audit on the personal information of the user. In an implementation, exposure check unit 218 is configured to perform the exposure check of the personal information of the user by searching for the user's personal information or credentials in one or more breach databases 208 1-N. For example, exposure check unit 218 may check if one or more usernames and/or email addresses registered by the user are found in the one or more breach databases 208 1-N, thereby indicating that the one or more usernames and/or email addresses have been exposed in a security breach. In an implementation, exposure check unit 218 may use at least one of an email address or a username in the personal information to search for breached user information in one or more breach databases 208 1-N. In an example, exposure check unit 218 may separate the email address account name (user) from its domain name (@company.com) and perform an exposure check on the account name. For example, if the email address provided by the user is “user08@gmail.com,” then exposure check unit 218 may separate “user08” from “user08@gmail.com” and perform an exposure check using the account name “user08”.
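The account-name separation and lookup described above can be sketched as follows. The in-memory set stands in for the one or more breach databases 208 1-N, and all names here are illustrative assumptions.

```python
BREACHED_IDENTIFIERS = {"user08", "someone@example.org"}  # hypothetical breach data

def account_name(email: str) -> str:
    """Separate the account name, e.g. 'user08' from 'user08@gmail.com'."""
    return email.split("@", 1)[0]

def exposure_check(email: str) -> bool:
    """True if the full address or its account name appears in breach data."""
    return (email in BREACHED_IDENTIFIERS
            or account_name(email) in BREACHED_IDENTIFIERS)
```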
  • In implementations where the user has provided one or more passwords during registration, exposure check unit 218 may check whether those passwords are associated with any known data breach. In an example, exposure check unit 218 may determine that a password is associated with a known data breach if the password is detected in one or more breach databases 208 1-N. In some examples, exposure check unit 218 may query one or more breach database 208 1-N to determine if the one or more passwords provided by the user have been compromised in a data breach. In some examples, exposure check unit 218 may provide the passwords in a query to one or more breach database 208 1-N.
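One hedged way to check a password against breach data without transmitting the raw password is a hash-prefix lookup (a k-anonymity-style range query): only a short prefix of the hash selects candidate records, and the comparison completes locally. The local set below stands in for a breach database; this mechanism is an illustrative assumption, not one prescribed by the text.

```python
import hashlib

# Hypothetical breach data keyed by uppercase SHA-1 hex digests.
BREACHED_HASHES = {
    hashlib.sha1(b"password1").hexdigest().upper(),  # example breached entry
}

def password_breached(password: str) -> bool:
    """Range-query style check: only `prefix` would leave the system."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = digest[:5], digest[5:]
    # The breach source returns suffixes matching the prefix; the final
    # comparison happens locally so the full hash is never disclosed.
    candidates = {h[5:] for h in BREACHED_HASHES if h.startswith(prefix)}
    return suffix in candidates
```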
• In some embodiments, security audit unit 220 may be configured to perform a security audit of the personal information of the user. In an implementation, security audit unit 220 may perform the security audit by assessing a strength of the one or more stored personal passwords from the personal information and evaluating whether the passwords used for personal information comply with policies that the organization has for password strength. In some examples, security audit unit 220 may assess the personal passwords to determine if there is password reuse or password sharing. In an example, password reuse refers to the same user using the same password to log in to more than one account, and password sharing refers to a scenario where a password of a user of an organization is identified as identical or similar to a password of another user of the organization. In an embodiment, security audit unit 220 may assess the strength of the one or more personal passwords based on standards, such as the National Institute of Standards and Technology (NIST) standards provided below.
    • a) A minimum of eight characters and a maximum length of at least 64 characters.
    • b) The ability to use all special characters but no special requirement to use them.
    • c) Restrict sequential and repetitive characters (e.g., 12345 or aaaaaa).
    • d) Restrict context specific passwords (e.g., the name of the site, etc.).
    • e) Restrict commonly used passwords (e.g., p@ssw0rd, etc.) and dictionary words.
    • f) Restrict passwords that match those obtained from previous breaches.
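An audit applying a subset of the NIST-style rules above (length, repetitive and sequential runs, common passwords, context-specific words) might look like the following sketch; the blocklist contents, run length, and helper names are assumptions.

```python
import re

COMMON_PASSWORDS = {"p@ssw0rd", "password", "123456"}  # assumed blocklist

def has_sequential_run(password: str, run: int = 4) -> bool:
    """Detect ascending character runs such as '1234' or 'abcd'."""
    count = 1
    for a, b in zip(password, password[1:]):
        count = count + 1 if ord(b) - ord(a) == 1 else 1
        if count >= run:
            return True
    return False

def audit_password(password: str, context_words=()) -> list:
    """Return a list of policy violations (empty list means compliant)."""
    findings = []
    if not 8 <= len(password) <= 64:
        findings.append("length must be 8-64 characters")
    if re.search(r"(.)\1{3,}", password):  # same character 4+ times in a row
        findings.append("repetitive character run")
    if has_sequential_run(password):
        findings.append("sequential characters")
    lowered = password.lower()
    if lowered in COMMON_PASSWORDS:
        findings.append("commonly used password")
    if any(word.lower() in lowered for word in context_words):
        findings.append("context-specific password")
    return findings
```

For example, `audit_password("p@ssw0rd")` flags only the common-password rule, while `audit_password("abcdefgh")` is flagged for sequential characters.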
  • In an implementation, security audit unit 220 may compare the one or more registered personal passwords and identify reused passwords or derived passwords. In an example, “R@31F”, “password1”, “Welcome!”, “password2” and “PaSsWoRd” may be poor personal passwords provided by the user. A reused password may be a password that has been used previously or a password having similarity to a certain degree to a password that has been used previously. In an implementation, security audit unit 220 may identify “password1”, “password2”, and “PaSsWoRd” as reused when compared to the password “password”. Security audit unit 220 may use one or more tools to create permutations of one or more passwords to use as search terms. An example of one such tool is the “Bad Password Generator” tool, available via the website of bad.pw and created by Harold Zang, referred to as SpZ (at spz.io).
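The reuse identification in the example above (flagging "password1", "password2", and "PaSsWoRd" as variants of "password") can be approximated with a simple normalization heuristic. The lowercase-and-strip-trailing-digits rule below is an assumption for illustration, not the algorithm of the tool named in the text.

```python
import re

def normalize(password: str) -> str:
    """Lowercase and strip trailing digits, e.g. 'PaSsWoRd1' -> 'password'."""
    return re.sub(r"\d+$", "", password.lower())

def find_reused(passwords: list, known: str) -> list:
    """Return registered passwords that normalize to the known password."""
    base = normalize(known)
    return [p for p in passwords if normalize(p) == base]

# find_reused(["R@31F", "password1", "Welcome!", "password2", "PaSsWoRd"],
#             "password") -> ["password1", "password2", "PaSsWoRd"]
```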
  • In some embodiments, security audit unit 220 may perform a search for the registered personal passwords within password storage 228 to identify incidents of password sharing. In an example, security audit unit 220 may compare the personal passwords with other passwords within password storage 228. In an implementation, security audit unit 220 may search within the organization's Active Directory, or other corporate databases, for a match to passwords that the user is using within the organization. In an implementation, one or more breach databases 208 1-N and password storage 228 may be continually or periodically monitored for similarity or match to the user's personal information. In an embodiment, the results of the exposure check and/or the results of the security audit may be provided to the system administrator.
  • According to an embodiment, security audit unit 220 may provide the user with a report on threats, breaches, and poor password hygiene associated with a personal domain as an incentive for registering his or her personal information with the organization, thus enabling the user to take appropriate actions to protect their personal information. In an example, security audit unit 220 may generate the report based on information determined by exposure check unit 218 and/or security audit unit 220 and in a further example, security audit unit 220 may generate the report based on searching breach data sites, such as “https://haveibeenpwned.com/” and “https://spycloud.com/” and dark web sources. In an implementation, security audit unit 220 may apply some or all of the security audit process to create a report for the user. In an example, the report may enable the user to gain a greater understanding of the risks that activities in the personal domain create for his or her personal information. Further, the user may be motivated to share further personal information with the organization so that the user can be informed about the risks associated with activities in the personal domain.
• In an implementation, risk score calculator 222 may be configured to calculate one or more risk scores for the user. According to an embodiment, a non-exhaustive list of examples of individual risk scores includes a personal risk score and an organization risk score. Other examples of the individual risk scores that are not discussed here are contemplated herein. In an example, an organization risk score for a user is a component of the risk score which is attributed to data held within the organization as a part of the ongoing employment of the user, and a personal risk score for the user is a component of the risk score which is attributed to the user's personal information and habits/behaviors within the personal domain. In an example, the personal risk score may also reflect a willingness of the user to provide the personal information to the organization that the user is not obliged to provide. In an example, more or fewer individual risk scores may be enabled by security awareness system 202. In examples where more than one risk score is contemplated, risk score calculator 222 may calculate each risk score individually or, in another example, risk score calculator 222 may calculate a single, overall risk score for the user taking into account contributions from each area of risk. In some embodiments, risk score calculator 222 may be configured to calculate or adjust a personal risk score of the user according to an analysis of the user's personal information. In an example, when the user registers his or her personal information with security awareness system 202, risk score calculator 222 may set the personal risk score of the user at a level that indicates an action of opting-in. According to an embodiment, the amount of personal information that the user provides may affect the personal risk score of the user. 
For example, if the user provides two email addresses, then the personal risk score for the user may be set lower than if the user had provided a single email address. Further, in an example, if the user provides password information associated with the email address then the personal risk score may be set lower than if the user had provided only the email address. Many such combinations are contemplated herein to set or modify the personal risk score.
  • In an implementation, there may be multiple levels of access to the user's personal risk score. In an example, both the personal risk score of the user and the organization risk score of the user's organization may be visible to the system administrator. In some examples, the organization risk score of the user's organization may be visible to the system administrator as a risk score separate from the personal risk score of the user. In some examples, the personal risk score of the user may not be visible to the system administrator. In some embodiments, security awareness system 202 may provide a notification to the system administrator that the personal risk score of the user is managed by security awareness system 202 but is not configured to be visible to the system administrator.
  • In an implementation, risk score calculator 222 may calculate a personal risk score for the user based on various factors such as whether the user has low or negligible personal information exposure, whether the user has moderate personal information exposure, and whether the user has high personal information exposure. In an example, these factors may be placed on an importance/weight scale, for example, from 1 to 10. In an implementation, risk score calculator 222 may assign an example weight range “0-3” to the factor “the user has negligible personal information exposure”, an example weight range “4-7” to the factor “the user has moderate personal information exposure”, and an example weight range “8-10” to the factor “the user has high personal information exposure”.
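The factor weighting described above can be sketched as a mapping from breach exposure to the example weight ranges; the instance-count thresholds separating the three exposure levels are assumptions chosen for illustration.

```python
# Example weight ranges from the text, on the 1-10 importance scale.
WEIGHT_RANGES = {
    "negligible": (0, 3),
    "moderate": (4, 7),
    "high": (8, 10),
}

def exposure_level(breach_instances: int) -> str:
    """Classify exposure; the cut-off of 3 instances is an assumed threshold."""
    if breach_instances == 0:
        return "negligible"
    if breach_instances <= 3:
        return "moderate"
    return "high"

def weight_range(breach_instances: int) -> tuple:
    """Return the (low, high) weight range for the user's exposure level."""
    return WEIGHT_RANGES[exposure_level(breach_instances)]
```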
  • According to an embodiment, the calculated value of the personal risk score (for example, 0 to 10) may be based on a threshold of a number of instances that the user's personal information was found in a breach, the strength of the passwords provided by the user, whether the breach was found in one or more breach databases 208 1-N, whether the database containing the breach was a database solely enabled by security awareness system 202 or a public database, or whether the information was found through the exposure check or the security audit. In an implementation, risk score calculator 222 may adjust the personal risk score of the user based at least on a result of the exposure check and/or the security audit.
• Further, the personal risk score for the user may be updated based on a weighted risk score for each instance of a breach associated with the user's personal information. In some examples, each instance of a breached username of the user may only be counted once towards the user's personal risk score, each instance of a breached password of the user may be counted 1.5 times towards the user's personal risk score, and each instance of a username of the user with a password of the user from the same breach may be counted 2 times towards the user's personal risk score. In an example, the personal risk score is incorporated into a risk score of the user that is calculated to determine the propensity that the user may respond to a malicious attack.
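The weighted counting described above (1 for a breached username, 1.5 for a breached password, 2 for a username-with-password pair from the same breach) reduces to a simple weighted sum; modeling breach instances as labeled entries is an illustrative assumption.

```python
# Per-instance weights taken from the example in the text.
WEIGHTS = {"username": 1.0, "password": 1.5, "pair": 2.0}

def weighted_breach_score(instances: list) -> float:
    """Sum the weight of each breach instance, e.g. ["username", "pair"] -> 3.0."""
    return sum(WEIGHTS[kind] for kind in instances)
```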
• Risk score calculator 222 may also calculate an organization risk score for the organization of the user. A description of such a system may be found in U.S. Pat. No. 10,673,876. In an example, risk score calculator 222 may calculate an organization risk score for the organization of the user based on whether the user has interacted with malicious attacks in the past, whether the user received a high number of malicious attacks, and/or whether a job title of the user gives him or her expanded access to an organization network, any of which may pose a risk to the organization of the user. Examples of calculation of other types of risk scores that are not discussed here are contemplated herein and may be carried out by risk score calculator 222.
• According to an embodiment, risk score calculator 222 may combine a personal risk score of a user with an organization risk score of the user's organization to generate a risk score. In an implementation, risk score calculator 222 may combine a personal risk score of a user with an organization risk score of the user's organization using an algorithm or algorithms. An example of such an algorithm may be a weighting algorithm which may be adjusted depending on the severity of the breach associated with the user, the kind of breach associated with the user, or any other metric measured by security awareness system 202. In an example, a personal risk score of the user and the organization risk score of the user's organization may be stored in risk score storage 232. In an implementation, security awareness system 202 may send a notification to a system administrator about a personal risk score of the user and the organization risk score of the user's organization being combined.
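One possible weighting algorithm of the kind described above is sketched below. The specific weights, and the choice to shift weight toward the personal component as breach severity grows, are assumptions; the text leaves the algorithm open.

```python
def combined_risk_score(personal: float, organization: float,
                        breach_severity: float = 0.0) -> float:
    """Weighted blend of the two scores on a 0-10 scale.

    breach_severity in [0, 1] shifts weight toward the personal component,
    reflecting the severity-adjusted weighting described in the text.
    """
    personal_weight = 0.5 + 0.3 * breach_severity  # assumed weighting
    org_weight = 1.0 - personal_weight
    return personal * personal_weight + organization * org_weight
```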
• In some embodiments, a personal risk score of a user, an organization risk score of the user's organization, or any other risk score contemplated may be used separately by security awareness system 202. In some examples, risk score calculator 222 is further configured to determine a risk score of the user based at least on the personal risk score of the user.
  • In an example, one or more risk scores in addition to a personal risk score of a user and organization risk score of a user's organization may be contemplated. In this case, these one or more additional risk scores may be combined with each other, with the personal risk score of the user and/or with organization risk score of the user's organization, for example according to an algorithm performed by risk score calculator 222.
  • According to some embodiments, remediation unit 224 may be configured to perform remedial training directed to a user based on at least a personal risk score of the user. For example, remediation unit 224 may be configured to perform remedial training if a user's personal risk score exceeds a set level or if the user's personal risk score increases. In an example, remediation unit 224 may perform remedial training by tailoring training content to educate the user. In examples, remediation unit 224 may provide training to the user via a landing page hosted by security awareness system 202. In an example, the landing page may be a web page that enables provisioning of training materials to the user. For example, the landing page may provide to the user training related to choosing strong passwords and avoiding password reuse and password sharing.
  • In some embodiments, remediation unit 224 may prompt a user to change one or more passwords. In an implementation, remediation unit 224 may prompt a user to acknowledge a username, password and/or email address associated with the user that has been included in a breach. In an implementation, remediation unit 224 may require the user to render secure one or more of his or her organizational and/or personal accounts by changing one or more passwords. Further, remediation unit 224 may request the user to confirm that the user has secured one or more accounts. In an implementation, remediation unit 224 may provide a recommendation to the user to improve one or more of their personal passwords. Further, in an implementation, remediation unit 224 may recommend and facilitate remedial training to a user via an interaction with the user. In an example, remediation unit 224 may communicate and/or interact with the user using a pop-up message. A pop-up message may be understood to refer to the appearance of graphical or textual content displayed to a user. In examples, a message prompting a user to change one or more passwords of the user may be presented on a display as part of, or within, a “window” or a user interface element or a dialogue box. Other known examples and implementations of pop-up messages are contemplated herein as part of this disclosure.
  • In an implementation, remediation unit 224 may send one or more communications to a user to determine whether the user has completed training, changed one or more passwords (for example, one or more personal passwords of the user, which in examples cannot be viewed by security awareness system 202), or completed any other form of remediation. In an example, on receiving the one or more communications, the user may update his or her personal information with security awareness system 202. In examples, feedback to the user on one or more of security issues, security recommendations, and security remediations may be provided so as to protect the security and privacy of the user and to ensure that it is not possible to infer information about other users in the organization based on information of the user. In an example, if a personal password of a user of an organization is identified as similar to the personal password of another user of the organization, the personal password or similar personal passwords may be flagged to the users as compromised passwords. In examples, the users may be required to change the same or similar personal passwords. In an embodiment, indications, recommendations, or requirements made by remediation unit 224 to a user regarding reuse or security of the user's personal information are determined in a manner that ensures that the privacy of users of security awareness system 202 is maintained.
  • According to an implementation, risk score calculator 222 may adjust a risk score for a user based on one or more actions that the user takes after remediation. In examples, if a user is prompted to change one or more of his or her organization passwords or personal passwords, risk score calculator 222 may adjust the risk score of the user based on the user's actions in response to the prompt. In an example, risk score calculator 222 may adjust the risk score of the user according to the timeframe in which the user performed an action in response to a prompt. In an implementation, risk score calculator 222 may determine whether a user changed one or more organizational passwords based on an interaction of the user with the organization's password system, for example, an Active Directory.
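A timeframe-based adjustment as described above might look like the following sketch. The time bands and reduction amounts are illustrative assumptions, not values from this disclosure.

```python
from datetime import timedelta

def adjust_score_after_remediation(risk_score: float,
                                   response_delay: timedelta) -> float:
    """Reduce a user's risk score according to how quickly the user
    acted on a remediation prompt (e.g. changed a password). The
    bands and reduction amounts below are hypothetical."""
    if response_delay <= timedelta(hours=24):
        reduction = 10.0   # prompt response earns the largest reduction
    elif response_delay <= timedelta(days=7):
        reduction = 5.0    # response within a week
    else:
        reduction = 1.0    # late response still counts for something
    return max(0.0, risk_score - reduction)
```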
  • In an example, a user may register for a personal domain website using his or her organization login credentials, such as the user's organization email address and/or the user's password. In an implementation, detection unit 226 may detect whether the user has registered for the personal domain website using his or her organization login credentials as previously described. In an implementation, detection unit 226 may continuously or periodically monitor a mailbox of the user for certain types of email messages, such as registration email validation messages and promotional messages that are typically sent from personal domain websites.
  • According to an implementation, security awareness system 202 and detection unit 226 may use, build, and/or maintain a database of email addresses, registration email validation messages and/or promotional messages that are known to be used by examples of personal domain websites to communicate with registered users of the personal domain website. In an implementation, such a database may be website information storage 234. In an implementation, detection unit 226 may monitor a mailbox of a user for messages from email addresses within website information storage 234. If any such email in the mailbox of the user is found within website information storage 234, detection unit 226 may determine that the user has registered for a personal domain website using the user's organization email address. In an example, detection unit 226 may maintain a list of email addresses used by personal domain websites, such as "amazon.com" and "twitter.com". For example, an email address associated with the personal domain website "amazon.com" may be "store-news@amazon.com" and an email address associated with the personal domain website "twitter.com" may be "info@twitter.com." In examples, if an email from the email address "info@twitter.com" is found in a mailbox of a user, detection unit 226 may determine that the user has registered for the personal domain website "twitter.com" using the user's organization email address.
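The sender-address lookup described above can be sketched as follows. The dictionary stands in for website information storage 234; the two entries mirror the examples given in the text.

```python
# Known personal-domain sender addresses mapped to the websites they
# belong to (a stand-in for website information storage 234).
KNOWN_SENDERS = {
    "store-news@amazon.com": "amazon.com",
    "info@twitter.com": "twitter.com",
}

def detect_personal_registrations(mailbox_senders):
    """Return the set of personal domain websites the user appears to
    have registered with, based on sender addresses observed in the
    user's mailbox."""
    found = set()
    for sender in mailbox_senders:
        site = KNOWN_SENDERS.get(sender.strip().lower())
        if site:
            found.add(site)
    return found
```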
  • According to an implementation, detection unit 226 may use a database of examples of registration email validation messages and promotional messages used by personal domain websites to communicate with registered users, and may further build a query or queries with the aim of detecting the same or similar messages within a mailbox of a user. In an implementation, detection unit 226 may determine key content segments which are typical of registration email validation messages and promotional messages used by personal domain websites to communicate with registered users and may combine key content segments together in one or more queries. Detection unit 226 may assign a search score to one or more emails in a user's mailbox based on one or more queries and determine from the search score the likelihood of an email being either a registration email validation message or a promotional message. In an example, if three key content segments are found in an email, then the search score for the email may be 3. In some examples, if nine key content segments are found in an email, then the search score for the email may be 9. In an implementation, detection unit 226 may generate a search score for each of the one or more emails found in the user's mailbox. In a further example, the threshold search score for an email to be classified as a registration email validation message may be different from the threshold search score for an email to be classified as a promotional message. For example, the threshold search score for an email to be classified as a registration email validation message may be 4 and the threshold search score for an email to be classified as a promotional message may be 8. Accordingly, if an email found in a mailbox of the user has a search score of 5, detection unit 226 may determine that the email is a registration email validation message. Further, if an email found in a mailbox of the user has a search score of 9, then detection unit 226 may determine that the email is a promotional message.
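The search scoring and two-threshold classification described above can be sketched as follows. The thresholds of 4 and 8 follow the example in the text; the key content segments listed here are hypothetical placeholders for segments that would be mined from known registration and promotional messages.

```python
REGISTRATION_THRESHOLD = 4   # per the example threshold in the text
PROMOTIONAL_THRESHOLD = 8

# Hypothetical key content segments typical of registration email
# validation messages and promotional messages.
KEY_SEGMENTS = ["verify your email", "confirm your account", "unsubscribe",
                "welcome to", "special offer", "click here", "your order",
                "limited time", "view in browser"]

def search_score(email_body: str) -> int:
    """Score an email by counting key content segments it contains."""
    body = email_body.lower()
    return sum(1 for seg in KEY_SEGMENTS if seg in body)

def classify(email_body: str):
    """Classify an email as promotional, registration validation, or
    neither, using the two thresholds (higher threshold checked first)."""
    score = search_score(email_body)
    if score >= PROMOTIONAL_THRESHOLD:
        return "promotional"
    if score >= REGISTRATION_THRESHOLD:
        return "registration_validation"
    return None
```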
  • According to some embodiments, security awareness system 202 and detection unit 226 may use, build and/or maintain a list of login pages for known or popular personal domain websites, such as food delivery service websites (for example, Door Dash® and Uber Eats®), travel websites (for example, Hotels.com® and Airbnb™), shopping websites (for example, amazon.com and walmart.com), and other such websites. In an example implementation, detection unit 226 may store the list of login pages for known or popular personal domain websites in website information storage 234. Periodically, or in response to an event such as the detection of a suspicious email message in a user's mailbox, detection unit 226 may attempt to log in to each of the personal domain websites' login pages using the organization login credentials of the user. If a login attempt with the user's organization login credentials is successful for a personal domain website, detection unit 226 may determine that the user has registered for the personal domain website using his or her organization login credentials.
  • In some embodiments, detection unit 226, using a list of login pages for personal domain websites stored in website information storage 234, may access a reset password link of one or more accounts of personal domain websites which in examples presents an interface to enter the email address associated with the forgotten password. Detection unit 226 may provide the user's organization email address as the email address for the account and if an email is sent to the user's organization email address with a password reset link, then detection unit 226 may determine that the user has registered for the personal domain website using the user's organization login credentials. In an implementation, for example to avoid bulk traffic to a single website, detection unit 226 may use this approach for randomized or periodic checks of one or more users, rather than checking all the users in the organization at the same time. In examples, detection unit 226 may use a plurality of IP addresses to access a reset password link of one or more accounts of personal domain websites to avoid appearing as a DoS attack.
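The randomized-sampling idea above, checking a subset of users at a time rather than all users at once, could be sketched as follows. The function name and sample size are illustrative assumptions.

```python
import random

def users_to_check(all_users, sample_size: int = 5, seed=None):
    """Pick a random subset of users for a periodic reset-link check,
    so that a single personal domain website is not hit with bulk
    traffic for every user in the organization at once."""
    rng = random.Random(seed)  # seed only for reproducible testing
    k = min(sample_size, len(all_users))
    return rng.sample(list(all_users), k)
```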
  • In an implementation, upon detecting emails or messages such as email validation messages, promotional messages, or other messages received from the personal domain website in a mailbox of a user, security awareness system 202 may delete the detected emails. In examples, security awareness system 202 may prompt or require a user to change his or her organization password or personal passwords. In further examples, security awareness system 202 may deliver remedial training to the user. In some implementations, if security awareness system 202 determines that a user has employed organization login credentials in registering for a personal domain website, detection unit 226 may interact with email server 206 and trigger email server 206 to disable the user's organization password and require the user to create new organization login credentials, for example a new organization password, before the user is able to access various servers and services of the organization.
  • According to an embodiment, security awareness system 202 may implement monitoring processes in a manner that respects the privacy laws of a country in which the organization of the user and/or the user is located. In an example, security awareness system 202 may obfuscate a user's personal information from a system administrator, and the user may have the option to manage the amount of personal information that is to be made visible to the system administrator. In examples, if a user's personal or work email account is deactivated or archived, or the user de-registers some or all of their personal information from security awareness system 202, then security awareness system 202 may remove the user's personal information for example as a best practice and/or to comply with various privacy regulations such as General Data Protection Regulation (GDPR).
  • In one or more embodiments, a user may choose to de-register his or her personal information from security awareness system 202 at any time. In such a scenario, all personal information and other relevant records pertaining to the user may be removed or deleted from security awareness system 202. In an example, a personal risk score of the user may be removed from risk score storage 232; however, the effect that the personal risk score of the user contributed to the risk score of the user may persist despite the user de-registering his or her personal information. In some examples, some or all of the component of the risk score of the user contributed by the personal risk score of the user is reversed if the user de-registers his or her personal information from security awareness system 202.
  • FIG. 3 depicts flowchart 300 for detecting that the user has registered for a personal domain website using an organization email address, according to some embodiments.
  • Step 302 includes generating a query using key content segments determined based on at least one of known registration email validation messages and known promotional messages from personal domain websites. In an example, registration email validation messages and promotional messages from various personal domain websites may include certain key content segments in common which may be combined and reused to generate a query. In an implementation, key content segments used to build queries in the past may be altered to increase the probability of detecting one or more registration email validation messages and/or promotional messages.
  • In an implementation, detection unit 226 may generate a query of key content segments that appear in the known registration email validation messages and promotional messages from personal domain websites. In an example, detection unit 226 may alter key content segments used to build queries in the past to increase the probability of detecting one or more registration email validation messages and/or promotional messages based on characteristics of known or sample registration email validation messages and/or promotional messages. In an implementation, detection unit 226 may apply an AI model to generate the query.
  • Step 304 includes monitoring, by detection unit 226 using a generated query, a mailbox of a user to detect one or more emails which include key content segments. In an implementation, detection unit 226 may be configured to monitor one or more or all folders of the mailbox of the user, including for example an inbox folder, a junk email folder, a deleted items folder, and one or more spam email folders. In an implementation, detection unit 226 may be configured to train an AI model to detect one or more emails which include key content segments, for example using sample or known registration email validation messages and/or promotional messages, the AI model to be used for the purpose of recognizing registration email validation messages and/or promotional messages in a user's mailbox.
  • Step 306 includes generating, for example by detection unit 226, a search score for one or more emails in the user's mailbox. In examples, the search score is based on key content segments found in the one or more emails. In an implementation, detection unit 226 may generate a search score for all emails found in the user's mailbox. In examples, detection unit 226 may generate a search score for one or more emails found in the user's mailbox that include a minimum number of key content segments.
  • Step 308 includes determining an email from the mailbox of the user to be one of a registration email validation message or a promotional message based on a search score. In examples, detection unit 226 determines one or more emails from the mailbox of the user to be one of a registration email validation message or a promotional message based on the search scores of the one or more emails exceeding a threshold score. In an implementation, detection unit 226 may determine an email from the one or more folders of the mailbox of the user to be one of a registration email validation message or a promotional message based on the search score of the email exceeding a threshold score. In examples, detection unit 226 may determine that an email from the one or more folders of the mailbox of the user is a registration email validation message based on the search score of the email being less than, equal to, or greater than a first threshold score, and detection unit 226 may determine that an email from the one or more folders of the mailbox of the user is a promotional message based on the search score of the email being less than, equal to, or greater than a second threshold.
  • Step 310 includes performing one or more actions based on determining that the email is one of a registration email validation message or a promotional message. In examples, the one or more actions may be performed by remediation unit 224. In examples, the one or more actions performed by remediation unit 224 based on determining that the email is one of a registration email validation message or a promotional message include one or more of deleting the email from the user's mailbox, prompting the user to change an organization password, and providing training to the user on personal domain use of organization login credentials. In an implementation, if detection unit 226 determines that the email is one of a registration email validation message or a promotional message, remediation unit 224 may perform one or more actions separately or in combination. In examples in which remediation unit 224 performs more than one action, the actions are performed at different times, for example separated by a minimum time period.
  • In an implementation, if detection unit 226 determines that an email from the one or more folders of the mailbox of the user is a registration email validation message or a promotional message, security awareness system 202 may remove the registration email validation message or the promotional message from the user's mailbox such that the user may not validate the personal domain registration through interacting with the registration email validation message, or such that the user may not interact with the promotional message. In an implementation, responsive to detection unit 226 determining that an email message from the one or more folders of the mailbox of the user is a registration email validation message or a promotional message, security awareness system 202 may prompt or require the user to change their organization password. In some implementations, detection unit 226 may interact with email server 206 and trigger email server 206 to disable the user's organization password and require the user to create a new organization password. In implementations, remediation unit 224 may provide training to the user.
  • FIG. 4 depicts flowchart 400 for using personal information for determining a personal risk score of the user of the organization, according to some embodiments.
  • Step 402 includes receiving registration of personal information of a user of an organization. In an implementation, security awareness system 202 may send a request to the user asking the user to provide his or her personal information. In some embodiments, the user may be asked to voluntarily provide his or her personal information and the user may choose to provide his or her personal information or may choose not to provide his or her personal information. In some embodiments, the user may be required to provide some personal information in order for the user to gain access to one or more services or systems of the organization. Upon receiving a voluntary request, the user may opt to provide the personal information to security awareness system 202. In some embodiments, a user's personal information may include one or more personal email addresses of the user. In some embodiments, a user's personal information may include one or more personal usernames of the user. In some embodiments, a user's personal information may include one or more personal passwords of the user.
  • Step 404 includes performing at least one of an exposure check or a security audit of the personal information of the user. In an implementation, exposure check unit 218 may be configured to perform an exposure check on some or all of the personal information of the user that was provided by the user. In examples, exposure check unit 218 may perform an exposure check on some or all of the personal information of the user by searching for one or more of a user's personal email address, a user's personal username, or a user's personal password in one or more breach databases 208 1-N. In examples, exposure check unit 218 may be configured to store the results of one or more exposure checks on the personal information provided by the user in memory, for example memory 214 or risk score storage 232 or personal information storage 230.
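An exposure check of this kind could be sketched as follows. The in-memory hash sets stand in for breach databases 208 1-N, and the choice of SHA-1 hashing is an illustrative assumption (real deployments would query external breach services rather than hold credentials locally).

```python
import hashlib

def sha1(value: str) -> str:
    """Hash a credential so plaintext is never stored or compared."""
    return hashlib.sha1(value.encode("utf-8")).hexdigest()

# Stand-ins for breach databases 208 1-N: sets of hashes of breached
# email addresses, usernames, and passwords (hypothetical entries).
BREACH_DBS = [
    {sha1("leaked.user@mail.example"), sha1("hunter2")},
]

def exposure_check(personal_items):
    """Return the subset of the user's personal items (emails,
    usernames, passwords) found in any breach database."""
    exposed = set()
    for item in personal_items:
        h = sha1(item)
        if any(h in db for db in BREACH_DBS):
            exposed.add(item)
    return exposed
```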
  • In an implementation, security audit unit 220 may be configured to perform a security audit on some or all of the personal information of the user that was provided by the user. In examples, security audit unit 220 performs a security audit of one or more personal passwords of the user by assessing the strength of the one or more personal passwords of the user. In examples, security audit unit 220 performs a security audit of one or more personal passwords of the user by comparing the one or more personal passwords of the user to password requirements of the organization. In examples, security audit unit 220 may be configured to store the results of one or more security audits on personal information provided by the user in memory, for example memory 214 or risk score storage 232 or personal information storage 230.
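The comparison of a personal password against organization password requirements could be sketched as follows. The policy values and rule names are hypothetical; this disclosure does not specify a particular policy.

```python
import re

# Hypothetical organization password policy used for the audit.
ORG_POLICY = {"min_length": 12, "require_digit": True,
              "require_upper": True, "require_symbol": True}

def audit_password(password: str, policy=None):
    """Return the list of policy violations for a personal password,
    mirroring a security audit against organization requirements.
    An empty list means the password satisfies the policy."""
    policy = policy or ORG_POLICY
    violations = []
    if len(password) < policy["min_length"]:
        violations.append("too_short")
    if policy["require_digit"] and not re.search(r"\d", password):
        violations.append("no_digit")
    if policy["require_upper"] and not re.search(r"[A-Z]", password):
        violations.append("no_uppercase")
    if policy["require_symbol"] and not re.search(r"[^A-Za-z0-9]", password):
        violations.append("no_symbol")
    return violations
```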
  • Step 406 includes adjusting a personal risk score of the user based at least on a result of one or more of an exposure check and/or a security audit on personal information provided by the user. In an implementation, risk score calculator 222 may be configured to generate a personal risk score for a user based at least on the result of one of an exposure check or a security audit on personal information provided by the user. In an implementation, risk score calculator 222 may be configured to adjust a previously determined personal risk score of a user based at least on the result of one of the exposure check or the security audit on personal information provided by the user. According to an embodiment, risk score calculator 222 may adjust the personal risk score of the user based at least on the user voluntarily registering personal information with security awareness system 202. In an implementation, risk score calculator 222 may determine a risk score for a user based at least on the personal risk score of the user.
  • FIG. 5 depicts flowchart 500 for performing a remedial training or a simulated phishing campaign directed to the user based on a personal risk score of the user, according to some embodiments.
  • Step 502 includes receiving personal information of a user of an organization, for example as described previously in step 402 of FIG. 4 . In an implementation, security awareness system 202 may send a request to the user to provide his or her personal information. Upon receiving the request or otherwise, the user may opt to provide personal information to security awareness system 202. In some examples, the personal information provided by the user of the organization includes one or more personal email addresses.
  • Step 504 includes verifying that a user's personal email address is used in a personal domain of the user. In an implementation, verification unit 216 may verify the email address identified by the personal information and used in the personal domain of the user by attempting to log in to the personal domain with one or more of the personal email addresses provided by the user.
  • Step 506 includes storing the personal information of the user in response to verification unit 216 verifying that one or more personal email addresses of the user have been used in the personal domain of the user. In an implementation, verification unit 216 may be configured to store the personal information of the user in response to the email address used in the personal domain of the user being verified. In an implementation, verification unit 216 may store the email address used in the personal domain of the user in user personal information storage 230. In an example, the email address may be stored in an encrypted or obfuscated form. In an implementation, verification unit 216 may associate the email address used in the personal domain of the user with a security awareness profile stored in risk score storage 232. In some examples, access to the one or more personal email addresses of the user stored in user personal information storage 230 may be limited to some services or systems of the user's organization.
  • Step 508 includes performing at least one of an exposure check or a security audit of the personal information of the user, for example as was described in step 404 of FIG. 4 . In an implementation, exposure check unit 218 may be configured to perform the exposure check by searching using at least one personal email address or personal username of the user provided by the user in the user's personal information in one or more breach databases 208 1-N. Further, in an implementation, security audit unit 220 may be configured to perform a security audit by assessing a strength of one or more personal passwords from the user's personal information and compliance of the one or more personal passwords of the user to the password requirements of the organization.
  • Step 510 includes adjusting a personal risk score of the user based at least on a result of one of the exposure check or the security audit, for example as described in step 406 of FIG. 4 . In an implementation, risk score calculator 222 may be configured to adjust the personal risk score of the user based at least on the result of one of the exposure check or the security audit. According to an embodiment, risk score calculator 222 may adjust the personal risk score of the user based at least on the user's registration of the personal information with security awareness system 202. Also, in an implementation, risk score calculator 222 may determine an overall risk score for the user based at least on the personal risk score of the user.
  • Step 512 includes performing one of a remedial training or a simulated phishing campaign directed to the user based on the personal risk score of the user. In an implementation, remediation unit 224 may be configured to perform one or more remedial trainings, and security awareness system 202 may be configured to perform one or more simulated phishing campaigns directed to the user based on a set level, an increase, or a decrease in the user's personal risk score.
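The trigger logic of step 512 could be sketched as follows. The threshold value and the mapping of conditions to actions are illustrative assumptions; this disclosure only states that a set level, an increase, or a decrease in the personal risk score may trigger remediation.

```python
def remediation_actions(previous_score: float, current_score: float,
                        threshold: float = 70.0):
    """Decide follow-up actions from a user's personal risk score:
    exceeding a set level triggers remedial training, and an increase
    triggers a simulated phishing campaign (hypothetical mapping)."""
    actions = []
    if current_score >= threshold:
        actions.append("remedial_training")
    if current_score > previous_score:
        actions.append("simulated_phishing_campaign")
    return actions
```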
  • While various embodiments of the methods and systems have been described, these embodiments are illustrative and in no way limit the scope of the described methods or systems. Those having skill in the relevant art can effect changes to form and details of the described methods and systems without departing from the broadest scope of the described methods and systems. Thus, the scope of the methods and systems described herein should not be limited by any of the illustrative embodiments and should be defined in accordance with the accompanying claims and their equivalents.

Claims (20)

What is claimed is:
1. A method comprising:
performing, by one or more processors, at least one of an exposure check against one or more breach databases or a security audit of personal information registered by a user of an organization;
determining, by the one or more processors, a personal risk score of the user based at least on a result of one of the exposure check or the security audit, the personal risk score representing a level of risk that the personal information of the user presents to the organization; and
causing, by the one or more processors based at least on the personal risk score of the user, one of a computer-based remedial training or a simulated phishing campaign directed to the user.
2. The method of claim 1, further comprising receiving, by the one or more processors, an action of the user to register the personal information.
3. The method of claim 1, further comprising setting, by the one or more processors, the personal risk score of the user at the level of risk that indicates an action of the user opting-in to register the personal information.
4. The method of claim 1, further comprising setting, by the one or more processors if the personal information registered by the user provides more than one email address, the personal risk score of the user at a lower level than a personal risk score of a second user that provided one email address in the personal information registered by the second user.
5. The method of claim 1, further comprising setting, by the one or more processors if the personal information registered by the user provides password information, the personal risk score of the user at a lower level than a personal risk score of a second user that does not provide password information in the personal information registered by the second user.
6. The method of claim 1, further comprising causing, by the one or more processors, the computer-based remedial training to display a pop-up message to communicate with the user.
7. The method of claim 1, further comprising causing, by the one or more processors, the display of a prompt requiring the user to create a new password with the organization of the user.
8. The method of claim 1, wherein the personal risk score identifies a level of risk for a group of users.
9. The method of claim 1, further comprising updating, by the one or more processors, the personal risk score to one of increase or decrease the personal risk score.
10. The method of claim 1, wherein the personal risk score comprises at least a component representing a willingness of the user to register the personal information to the organization to which the user is not obliged to register the personal information.
11. A system comprising:
one or more processors, coupled to memory and configured to:
perform at least one of an exposure check against one or more breach databases or a security audit of personal information registered by a user of an organization;
determine a personal risk score of the user based at least on a result of one of the exposure check or the security audit, the personal risk score representing a level of risk that the personal information of the user presents to the organization; and
cause, based at least on the personal risk score of the user, one of a computer-based remedial training or a simulated phishing campaign directed to the user.
12. The system of claim 11, wherein the one or more processors are further configured to receive an action of the user to register the personal information.
13. The system of claim 11, wherein the one or more processors are further configured to set the personal risk score of the user at the level of risk that indicates an action of the user opting-in to register the personal information.
14. The system of claim 11, wherein the one or more processors are further configured to set, if the personal information registered by the user provides more than one email address, the personal risk score of the user at a lower level than a personal risk score of a second user that provided one email address in the personal information registered by the second user.
15. The system of claim 11, wherein the one or more processors are further configured to set, if the personal information registered by the user provides password information, the personal risk score of the user at a lower level than a personal risk score of a second user that does not provide password information in the personal information registered by the second user.
16. The system of claim 11, wherein the one or more processors are further configured to cause the computer-based remedial training to display a pop-up message to communicate with the user.
17. The system of claim 11, wherein the one or more processors are further configured to cause the display of a prompt requiring the user to create a new password with the organization of the user.
18. The system of claim 11, wherein the personal risk score identifies a level of risk for a group of users.
19. The system of claim 11, wherein the one or more processors are further configured to update the personal risk score to one of increase or decrease the personal risk score.
20. The system of claim 11, wherein the personal risk score comprises at least a component representing a willingness of the user to register the personal information to the organization to which the user is not obliged to register the personal information.
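The flow recited in claims 11–20 (an exposure check against breach databases, a personal risk score adjusted by what the user registered, and a resulting training or simulated-phishing action) can be sketched as follows. This is a hypothetical illustration only, not the patented implementation: all names, weights, and thresholds are assumptions chosen to mirror the claim language.

```python
# Illustrative sketch of the claimed flow. Weights and the threshold are
# invented for demonstration and are not taken from the specification.
from dataclasses import dataclass


@dataclass
class Registration:
    """Personal information a user has voluntarily registered (claim 12)."""
    email_addresses: list
    provided_password_info: bool


def exposure_check(emails, breach_db):
    """Count registered addresses appearing in a known-breach set (claim 11)."""
    return sum(1 for e in emails if e in breach_db)


def personal_risk_score(reg: Registration, breach_db) -> float:
    # Baseline reflects that opting in at all signals a willingness to
    # register information the user is not obliged to (claims 13, 20).
    score = 50.0
    score += 10.0 * exposure_check(reg.email_addresses, breach_db)
    if len(reg.email_addresses) > 1:
        score -= 5.0  # more than one address registered -> lower risk (claim 14)
    if reg.provided_password_info:
        score -= 5.0  # password info enables a security audit -> lower risk (claim 15)
    return score


def respond(score: float) -> str:
    """Choose a follow-up action based on the score (claim 11)."""
    return "remedial_training" if score >= 60 else "simulated_phishing_campaign"


breaches = {"alice@example.com"}
reg = Registration(["alice@example.com"], provided_password_info=False)
score = personal_risk_score(reg, breaches)
print(score, respond(score))  # 60.0 remedial_training
```

The score can later be updated upward or downward as new exposure results arrive (claim 19); here that would simply be a re-run of `personal_risk_score` against a refreshed breach set.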
US18/094,628 2020-08-24 2023-01-09 Systems and methods for effective delivery of simulated phishing campaigns Abandoned US20230164166A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US18/094,628 US20230164166A1 (en) 2020-08-24 2023-01-09 Systems and methods for effective delivery of simulated phishing campaigns

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US17/001,070 US11552982B2 (en) 2020-08-24 2020-08-24 Systems and methods for effective delivery of simulated phishing campaigns
US18/094,628 US20230164166A1 (en) 2020-08-24 2023-01-09 Systems and methods for effective delivery of simulated phishing campaigns

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US17/001,070 Continuation US11552982B2 (en) 2020-08-24 2020-08-24 Systems and methods for effective delivery of simulated phishing campaigns

Publications (1)

Publication Number Publication Date
US20230164166A1 true US20230164166A1 (en) 2023-05-25

Family

ID=74537211

Family Applications (5)

Application Number Title Priority Date Filing Date
US17/001,070 Active US11552982B2 (en) 2020-08-24 2020-08-24 Systems and methods for effective delivery of simulated phishing campaigns
US17/002,340 Active US10917429B1 (en) 2020-08-24 2020-08-25 Systems and methods for effective delivery of simulated phishing campaigns
US17/175,892 Active US11038914B1 (en) 2020-08-24 2021-02-15 Systems and methods for effective delivery of simulated phishing campaigns
US18/094,628 Abandoned US20230164166A1 (en) 2020-08-24 2023-01-09 Systems and methods for effective delivery of simulated phishing campaigns
US18/094,632 Active US11729206B2 (en) 2020-08-24 2023-01-09 Systems and methods for effective delivery of simulated phishing campaigns

Family Applications Before (3)

Application Number Title Priority Date Filing Date
US17/001,070 Active US11552982B2 (en) 2020-08-24 2020-08-24 Systems and methods for effective delivery of simulated phishing campaigns
US17/002,340 Active US10917429B1 (en) 2020-08-24 2020-08-25 Systems and methods for effective delivery of simulated phishing campaigns
US17/175,892 Active US11038914B1 (en) 2020-08-24 2021-02-15 Systems and methods for effective delivery of simulated phishing campaigns

Family Applications After (1)

Application Number Title Priority Date Filing Date
US18/094,632 Active US11729206B2 (en) 2020-08-24 2023-01-09 Systems and methods for effective delivery of simulated phishing campaigns

Country Status (1)

Country Link
US (5) US11552982B2 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10348761B2 (en) * 2017-12-01 2019-07-09 KnowBe4, Inc. Systems and methods for situational localization of AIDA
US11870807B2 (en) * 2019-11-19 2024-01-09 Jpmorgan Chase Bank, N.A. System and method for phishing email training
US11928212B2 (en) * 2020-06-15 2024-03-12 Proofpoint, Inc. Generating simulated spear phishing messages and customized cybersecurity training modules using machine learning
US11496514B2 (en) * 2020-07-31 2022-11-08 KnowBe4, Inc. Systems and methods for security awareness using ad-based simulated phishing attacks
US11552982B2 (en) * 2020-08-24 2023-01-10 KnowBe4, Inc. Systems and methods for effective delivery of simulated phishing campaigns
US11356480B2 (en) * 2020-08-26 2022-06-07 KnowBe4, Inc. Systems and methods of simulated phishing campaign contextualization
EP4128002B1 (en) * 2020-10-30 2023-07-12 Knowbe4, Inc. Systems and methods for determination of level of security to apply to a group before display of user data

Citations (12)

Publication number Priority date Publication date Assignee Title
US6067561A (en) * 1997-02-07 2000-05-23 Hughes Electronics Corporation Electronic mail notification system and method within a hybrid network that transmits notifications via a continuous, high-speed channel
US20020038347A1 (en) * 2000-09-22 2002-03-28 Sanyo Electric Co., Ltd Electronic mail distributing method and apparatus
US20080235335A1 (en) * 2007-03-20 2008-09-25 International Business Machines Corporation Method, system, and computer program product for changing the status of an existing email in a recipient email client inbox
US20090089391A1 (en) * 2007-09-28 2009-04-02 Embarq Holdings Company, Llc Flashing email header
US20140101101A1 (en) * 2012-10-09 2014-04-10 International Business Machines Corporation Relaxed anchor validation in a distributed synchronization environment
US20170104785A1 (en) * 2015-08-10 2017-04-13 Salvatore J. Stolfo Generating highly realistic decoy email and documents
US20170272388A1 (en) * 2016-03-15 2017-09-21 Assaf Yossef Bern Integrated real-time email-based virtual conversation
US20190173918A1 (en) * 2017-12-01 2019-06-06 KnowBe4, Inc. Systems and methods for aida based a/b testing
US20200097912A1 (en) * 2018-09-20 2020-03-26 Microsoft Technology Licensing, Llc Surfacing select electronic messages in computing systems
US20200366713A1 (en) * 2019-05-01 2020-11-19 Jasmine Rodriguez Systems and methods for use of address fields in a simulated phishing attack
US20210021612A1 (en) * 2015-04-10 2021-01-21 Cofense Inc Message platform for automated threat simulation, reporting, detection, and remediation
US20210152596A1 (en) * 2019-11-19 2021-05-20 Jpmorgan Chase Bank, N.A. System and method for phishing email training

Family Cites Families (67)

Publication number Priority date Publication date Assignee Title
US20100082749A1 (en) * 2008-09-26 2010-04-01 Yahoo! Inc Retrospective spam filtering
US20150229664A1 (en) 2014-02-13 2015-08-13 Trevor Tyler HAWTHORN Assessing security risks of users in a computing network
US10749887B2 (en) 2011-04-08 2020-08-18 Proofpoint, Inc. Assessing security risks of users in a computing network
US10586049B2 (en) * 2011-12-22 2020-03-10 International Business Machines Corporation Detection of second order vulnerabilities in web services
US9241009B1 (en) 2012-06-07 2016-01-19 Proofpoint, Inc. Malicious message detection and processing
US9154514B1 (en) * 2012-11-05 2015-10-06 Astra Identity, Inc. Systems and methods for electronic message analysis
US9356948B2 (en) 2013-02-08 2016-05-31 PhishMe, Inc. Collaborative phishing attack detection
US8966637B2 (en) 2013-02-08 2015-02-24 PhishMe, Inc. Performance benchmarking for simulated phishing attacks
US9398038B2 (en) 2013-02-08 2016-07-19 PhishMe, Inc. Collaborative phishing attack detection
US9253207B2 (en) 2013-02-08 2016-02-02 PhishMe, Inc. Collaborative phishing attack detection
US9053326B2 (en) 2013-02-08 2015-06-09 PhishMe, Inc. Simulated phishing attack with sequential messages
JP6070316B2 (en) 2013-03-19 2017-02-01 富士通株式会社 Legitimacy judgment method, legitimacy judgment program, and legitimacy judgment device
US9262629B2 (en) 2014-01-21 2016-02-16 PhishMe, Inc. Methods and systems for preventing malicious use of phishing simulation records
IL232528A0 (en) * 2014-05-08 2014-08-31 Rami Puzis Social network honeypot
US20190215335A1 (en) 2014-10-30 2019-07-11 Ironscales Ltd. Method and system for delaying message delivery to users categorized with low level of awareness to suspicious messages
US9906539B2 (en) 2015-04-10 2018-02-27 PhishMe, Inc. Suspicious message processing and incident response
US10298602B2 (en) 2015-04-10 2019-05-21 Cofense Inc. Suspicious message processing and incident response
WO2016164844A1 (en) 2015-04-10 2016-10-13 PhishMe, Inc. Message report processing and threat prioritization
US9635052B2 (en) 2015-05-05 2017-04-25 Christopher J. HADNAGY Phishing as-a-service (PHaas) used to increase corporate security awareness
US10540724B2 (en) * 2015-05-06 2020-01-21 Branch Banking And Trust Company Electronic receipt-linking database system
US9674213B2 (en) * 2015-10-29 2017-06-06 Duo Security, Inc. Methods and systems for implementing a phishing assessment
US9894092B2 (en) * 2016-02-26 2018-02-13 KnowBe4, Inc. Systems and methods for performing or creating simulated phishing attacks and phishing attack campaigns
US10432656B2 (en) * 2016-04-28 2019-10-01 Shevirah Inc. Method and system for assessing data security
US10069856B2 (en) * 2016-05-13 2018-09-04 King Abdulaziz City For Science And Technology System and method of comparative evaluation for phishing mitigation
US9800613B1 (en) 2016-06-28 2017-10-24 KnowBe4, Inc. Systems and methods for performing a simulated phishing attack
US10986122B2 (en) 2016-08-02 2021-04-20 Sophos Limited Identifying and remediating phishing security weaknesses
US9912687B1 (en) 2016-08-17 2018-03-06 Wombat Security Technologies, Inc. Advanced processing of electronic messages with attachments in a cybersecurity system
US10855714B2 (en) 2016-10-31 2020-12-01 KnowBe4, Inc. Systems and methods for an artificial intelligence driven agent
US11044267B2 (en) 2016-11-30 2021-06-22 Agari Data, Inc. Using a measure of influence of sender in determining a security risk associated with an electronic message
US9876753B1 (en) 2016-12-22 2018-01-23 Wombat Security Technologies, Inc. Automated message security scanner detection system
US9749360B1 (en) 2017-01-05 2017-08-29 KnowBe4, Inc. Systems and methods for performing simulated phishing attacks using social engineering indicators
US20180307844A1 (en) 2017-04-21 2018-10-25 KnowBe4, Inc. Using smart groups for simulated phishing training and phishing campaigns
US10362047B2 (en) 2017-05-08 2019-07-23 KnowBe4, Inc. Systems and methods for providing user interfaces based on actions associated with untrusted emails
US10243904B1 (en) * 2017-05-26 2019-03-26 Wombat Security Technologies, Inc. Determining authenticity of reported user action in cybersecurity risk assessment
US9781160B1 (en) * 2017-05-31 2017-10-03 KnowBe4, Inc. Systems and methods for discovering suspect bot IP addresses and using validated bot IP address to ignore actions in a simulated phishing environment
US11599838B2 (en) * 2017-06-20 2023-03-07 KnowBe4, Inc. Systems and methods for creating and commissioning a security awareness program
US20190026461A1 (en) * 2017-07-20 2019-01-24 Barracuda Networks, Inc. System and method for electronic messaging threat scanning and detection
WO2019027837A1 (en) 2017-07-31 2019-02-07 KnowBe4, Inc. Systems and methods for using attribute data for system protection and security awareness training
US11159565B2 (en) * 2017-08-31 2021-10-26 Barracuda Networks, Inc. System and method for email account takeover detection and remediation
US10348761B2 (en) * 2017-12-01 2019-07-09 KnowBe4, Inc. Systems and methods for situational localization of AIDA
US10679164B2 (en) * 2017-12-01 2020-06-09 KnowBe4, Inc. Systems and methods for using artificial intelligence driven agent to automate assessment of organizational vulnerabilities
US10839083B2 (en) * 2017-12-01 2020-11-17 KnowBe4, Inc. Systems and methods for AIDA campaign controller intelligent records
US10673895B2 (en) * 2017-12-01 2020-06-02 KnowBe4, Inc. Systems and methods for AIDA based grouping
US10348762B2 (en) 2017-12-01 2019-07-09 KnowBe4, Inc. Systems and methods for serving module
US11119632B2 (en) 2018-01-03 2021-09-14 Mimecast Services Ltd. Systems and methods for proactive analysis of artifacts associated with information resources
US10924517B2 (en) 2018-02-07 2021-02-16 Sophos Limited Processing network traffic based on assessed security weaknesses
DE102018113994A1 (en) 2018-06-12 2019-12-12 IT-Seal GmbH A method of determining a level of deception for a single phishing attack against a person
US11212312B2 (en) * 2018-08-09 2021-12-28 Microsoft Technology Licensing, Llc Systems and methods for polluting phishing campaign responses
US11257393B2 (en) * 2018-10-26 2022-02-22 Circadence Corporation Method and system for evaluating individual and group cyber threat awareness
US10979448B2 (en) * 2018-11-02 2021-04-13 KnowBe4, Inc. Systems and methods of cybersecurity attack simulation for incident response training and awareness
LU101105B1 (en) 2019-01-17 2020-07-17 It Seal Gmbh Process for the automated creation of a phishing document addressed to a specified person
US11487873B2 (en) * 2019-01-22 2022-11-01 EMC IP Holding Company LLC Risk score generation utilizing monitored behavior and predicted impact of compromise
US20200267183A1 (en) 2019-02-15 2020-08-20 Avant Research Group, LLC Systems and methods for vulnerability analysis of phishing attacks
US10453017B1 (en) 2019-03-07 2019-10-22 Lookout, Inc. Computer systems and methods to protect user credential against phishing
US11481486B2 (en) 2019-03-27 2022-10-25 Webroot Inc. Behavioral threat detection engine
US20210194924A1 (en) 2019-08-29 2021-06-24 Darktrace Limited Artificial intelligence adversary red team
US11489868B2 (en) * 2019-09-05 2022-11-01 Proofpoint, Inc. Dynamically initiating and managing automated spear phishing in enterprise computing environments
US20210090463A1 (en) * 2019-09-09 2021-03-25 Circadence Corporation Method and system for training non-technical users on highly complex cyber security topics
US11336675B2 (en) 2019-09-20 2022-05-17 Bank Of America Corporation Cyber resilience chaos stress testing
US11729200B2 (en) 2019-12-12 2023-08-15 Proofpoint, Inc. Dynamic message analysis platform for enhanced enterprise security
US10904186B1 (en) 2020-03-27 2021-01-26 Etorch, Inc. Email processing for enhanced email privacy and security
US11552982B2 (en) * 2020-08-24 2023-01-10 KnowBe4, Inc. Systems and methods for effective delivery of simulated phishing campaigns
WO2022046652A1 (en) 2020-08-24 2022-03-03 CyberCatch, Inc. Automated and continuous cybersecurity assessment with measurement and scoring
US20220094702A1 (en) 2020-09-24 2022-03-24 University Of Windsor System and Method for Social Engineering Cyber Security Training
WO2022071961A1 (en) 2020-10-01 2022-04-07 Vade Secure Inc. Automated collection of branded training data for security awareness training
US20220130274A1 (en) 2020-10-26 2022-04-28 Proofpoint, Inc. Dynamically Injecting Security Awareness Training Prompts Into Enterprise User Flows
US20220286419A1 (en) 2021-03-02 2022-09-08 Proofpoint, Inc. System and method for improving detection of bad content by analyzing reported content

Also Published As

Publication number Publication date
US11552982B2 (en) 2023-01-10
US10917429B1 (en) 2021-02-09
US20230164167A1 (en) 2023-05-25
US11038914B1 (en) 2021-06-15
US20220060495A1 (en) 2022-02-24
US11729206B2 (en) 2023-08-15

Similar Documents

Publication Publication Date Title
US20230164166A1 (en) Systems and methods for effective delivery of simulated phishing campaigns
US11729203B2 (en) System and methods of cybersecurity attack simulation for incident response training and awareness
US11902324B2 (en) System and methods for spoofed domain identification and user training
US11640457B2 (en) System and methods for minimizing organization risk from users associated with a password breach
US20230081399A1 (en) Systems and methods for enrichment of breach data for security awareness training
US11489869B2 (en) Systems and methods for subscription management of specific classification groups based on user's actions
US11552984B2 (en) Systems and methods for improving assessment of security risk based on personal internet account data
US11943253B2 (en) Systems and methods for determination of level of security to apply to a group before display of user data
US20210365866A1 (en) Systems and methods for use of employee message exchanges for a simulated phishing campaign
US20230171283A1 (en) Automated effective template generation
US20220166784A1 (en) Systems and methods identifying malicious communications in multiple message stores
US20230038258A1 (en) Systems and methods for analysis of user behavior to improve security awareness
US20240096234A1 (en) System and methods for user feedback on receiving a simulated phishing message

Legal Events

Date Code Title Description
AS Assignment

Owner name: KNOWBE4, INC., FLORIDA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KRAS, GREG;REEL/FRAME:062314/0643

Effective date: 20211122

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION