US20030005287A1 - System and method for extensible positive client identification - Google Patents

System and method for extensible positive client identification

Info

Publication number
US20030005287A1
US20030005287A1 (application US10/228,786)
Authority
US
United States
Prior art keywords
client
terminal
asr
information
security
Prior art date
Legal status
Abandoned
Application number
US10/228,786
Inventor
David Wray
David Blanchfield
Current Assignee
Authoriszor Inc
Original Assignee
Authoriszor Inc
Priority date
Filing date
Publication date
Application filed by Authoriszor Inc
Priority to US10/228,786
Publication of US20030005287A1
Status: Abandoned

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 63/00 Network architectures or network communication protocols for network security
    • H04L 63/10 Network architectures or network communication protocols for network security for controlling access to devices or network resources
    • H04L 63/105 Multiple levels of security
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31 User authentication
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/50 Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F 21/55 Detecting local intrusion or implementing counter-measures
    • G06F 21/554 Detecting local intrusion or implementing counter-measures involving event detection and direct action
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/60 Protecting data
    • G06F 21/62 Protecting access to data via a platform, e.g. using keys or access control rules
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 63/00 Network architectures or network communication protocols for network security
    • H04L 63/08 Network architectures or network communication protocols for network security for authentication of entities
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 63/00 Network architectures or network communication protocols for network security
    • H04L 63/08 Network architectures or network communication protocols for network security for authentication of entities
    • H04L 63/0876 Network architectures or network communication protocols for network security for authentication of entities based on the identity of the terminal or configuration, e.g. MAC address, hardware or software configuration or device fingerprint
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 63/00 Network architectures or network communication protocols for network security
    • H04L 63/14 Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic
    • H04L 63/1441 Countermeasures against malicious traffic
    • H04L 63/1491 Countermeasures against malicious traffic using deception as countermeasure, e.g. honeypots, honeynets, decoys or entrapment
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2221/00 Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 2221/21 Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 2221/2113 Multi-level security, e.g. mandatory access control
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2221/00 Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 2221/21 Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 2221/2127 Bluffing
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 63/00 Network architectures or network communication protocols for network security
    • H04L 63/14 Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic
    • H04L 63/1441 Countermeasures against malicious traffic

Definitions

  • the present invention relates generally to the field of providing security for a location in a network and more particularly to an extensible positive client identification system and method.
  • Confidentiality begins by identifying the requestor of confidential information. This, in turn, means not only identifying a valid requester, but also detecting when an imposter or thief is impersonating a valid requester to gain access to confidential information. In many cases it is also true that a valid requestor may only be authorized to have access to a particular level of information.
  • An employee database for example, which contains salary information may have several different levels of access. An individual employee may only be authorized to access his or her salary information, while the head of the personnel department may have access to all salary data. Non-employees may be denied access to any employee data—hence the importance of identification.
  • Availability means simply that information should not be withheld improperly when it is requested. Many factors can affect availability over a network, such as hardware malfunction, software malfunction, data corruption, or the failure or slowing down of communications links.
  • Logon names are usually very easy to discover. Many organizations select a standard format for them based on the user's real identity. Fred Smith, for example, may be given a logon name of “fsmith” or “freds”. A hacker familiar with a user's real name may find it easy to deduce this kind of logon name. Many computer systems that require logon names also have default settings that are used when the system is first configured. Many users simply keep these default account names. Thus, a hacker familiar with the NT™ operating system provided by Microsoft, Inc. of Redmond, Wash., might try the ‘Administrator’ account. Default account names and passwords greatly reduce the amount of work required for the hacker to gain illicit entry to a system. Hackers may use software attacks to obtain passwords by copying password files.
  • Digital IDs: Digital Identifiers
  • TTPCA: Trusted Third Party Certificate Authorities
  • a Digital Certificate is a series of characters containing an identifier and usually other verification information.
  • the certificate or id may be stored in a computer file—as seen in FIG. 2 (Prior Art), at disk 03 connected with a computer terminal 02 , or on some other memory device such as a smart card. When the id is read by the appropriate software it is possible to use that id for identification purposes.
  • digital certificates can be copied from a computer terminal 02 such as the one shown in FIG. 2 (Prior Art), and used to impersonate the user. They can also be stolen remotely while the user is using the Internet. For example, a hacker at terminal 13 can use the Internet 25 and communications networks 30 and 10 to find and copy a certificate stored on a disk at personal computer terminal 02 .
  • Trusted Third Party Certificate authorities can be used to create and issue digital certificates for a company. To obtain a certificate from a certificate authority usually requires proof of identity. The certifying authority then uses its own digital certificate to generate one for the requestor. The degree of stringency and cost varies from authority to authority. At the highest levels of security, it can take several months to obtain one, and require high levels of proof of identity as well as expense. Certificates for large corporations for example, can cost as much as $10,000 USD. At the other extreme, some companies will issue them for as little as $10 and require no proof of identity.
  • Certificates thus offer a higher degree of protection, but are still fairly vulnerable, either through copying or interception of transmissions. In theory, a check should be made with the appropriate Certificate Authority before the customer relies on the certification. The Certificate Authority might have already revoked the certificate. In practice this is a step that many application programs fail to take when certificates are used. Detection of the theft or interception may not take place until after some significant damage has occurred.
  • Smart cards can also be used to enhance identification. Some of these are similar to magnetic strip credit cards which can be read by insertion or swiping in a card reader. Smart cards are usually used in conjunction with some other type of user input, such as a name, password, or Personal Identification Number (PIN).
  • PIN Personal Identification Number
  • The simplest cards are low cost but may be easily duplicated. More complex smart cards have built-in data storage facilities and even data processing facilities in the form of embedded computer chips allowing additional user information to be stored, thus providing a higher level of user verification. These tend to cost more and be more difficult to duplicate. The most secure cards have very sophisticated verification techniques but come at a higher cost per individual user.
  • Another method of identification uses simple fixed system component serial numbers.
  • A computer manufacturer such as Intel, for example, can embed a unique serial number in a personal computer processor chip.
  • This number can then be read to identify that particular personal computer terminal 05 . While this tends to be much more specific at identifying a terminal, it also raises privacy questions, since the terminal 05 can be identified by anyone using appropriate methods over the Internet. This has led to the creation of a software program that switches off the serial number facility. This approach to identification thus creates some concerns about privacy and also about the ability for the feature to be switched on and off without the user's knowledge.
  • IP addresses can be used for identification with systems using the Transmission Control Protocol/Internet Protocol (TCP/IP) communication protocol of the Internet.
  • TCP/IP: Transmission Control Protocol/Internet Protocol
  • To be part of such a TCP/IP network requires that each computer have a unique IP address, using a specified format.
  • Each IP computer is a member of a domain. Domains can be part of another network as a subnet, or can even contain subnets.
  • Basic firewall systems 15 as seen in FIG. 2 (Prior Art) use these properties to allow or refuse access to a computer system.
  • Internet Service Providers such as AOL™ often use proxy servers, so the IP address that appears when proxies are used will be that of the proxy server machine.
  • AOL™'s proxy server IP address will be the only IP form of identification for its many millions of users. This is not conducive to discrete identification.
  • Proxy servers may also be used by hackers to reach a user's computer.
  • Hackers, for example, can impersonate IP addresses until they find one of a user's addresses that works for their purposes.
  • Biometric identification techniques are now becoming available, such as fingerprints, voiceprints, DNA patterns, retinal scans, face recognition, etc. While the technology exists in many cases to use this type of information, it is usually not presently available in a practical form or is too expensive for many applications. Many hackers will simply view it as a challenge to find ways to copy, intercept, or fake these forms of identification.
  • a corporate website 35 (usually composed of a computer system, operating system software, webserver software and web application software) may have valuable confidential information stored on local memory such as disk 40 .
  • Files 72 and 74 are usually organized logically in folders 70 that are, in turn, referenced by directories 65, all of which are stored physically on local memory such as disk 40 .
  • folders can also be placed inside other folders or directories, allowing files to be logically grouped together on a disk 40 , just as they might be stored in cardboard folders in file cabinets if they were physically kept on paper.
  • The authority to use a computer's files is based on identification of the user and the permissions and rights that have been given to that user, usually by a system administrator. For example, as seen in FIG. 3 (Prior Art), an operating system might have the scope of permissions and rights outlined in table T 1 . For these files, a user might be denied any access, which would be indicated at line 80 of table T 1 .
  • VPNs Virtual Private Networks
  • VPNs were designed so that a remote user's functionality and permission levels closely resembled those that local users of the computer or network would have had.
  • VPNs create secure links between two (or more) computers, which identify each other and then create encrypted pathways between them using sophisticated encoding techniques. Once the links have been created, they may be regarded as nearly hacker-proof for all practical purposes.
  • However, VPNs often rely on client identification methods that are vulnerable to attack, such as the logon name and password approach mentioned above. Thus VPNs can be subverted by false identifications into creating confidential sessions with a hacker.
  • Firewalls are another form of network security that have been developed to address data integrity. Firewalls are usually essential requirements for any computer or computer network which can be accessed remotely.
  • a firewall is typically a computer system placed between two networks and connected to both. One of the networks is usually an internal corporate network which is reasonably secure. The other network is usually a public network, such as the Internet, which may be fraught with peril—at least from a security viewpoint.
  • the software in the firewall computer usually provides protection from certain kinds of intrusions into the internal network by:
  • filtering messages (examining the content of a message to determine whether or not to accept it).
  • computer 15 might be a firewall computer which is placed between terminals 00 and 02 on private network 10 and the Internet 25 and public communication links 30 .
  • the internal, private network is considered the “clean” network, and the external, public one the potentially “dirty” one.
  • While firewalls fend off many attacks, some forms of attack can be difficult to detect, such as file deposition attacks, in which an internal computer or system is gradually filled with unwanted data which will eventually affect performance or even stop the computer or network from working. Since most current firewall technologies will detect and prevent large files being uploaded onto an internal computer or network, a knowledgeable hacker will upload a number of very small files within the size acceptable to the firewall, eventually causing the computer's disk storage space to become insufficient and the system to degrade or fail.
  • a trojan is an extreme example of a file deposition attack—the file being deposited is a program that appears useful but will in fact damage or compromise data integrity and system security when it is used.
  • “www.w3.org” is the Internet name of the computer on which the page is to be found, along with its web root directory.
  • “Addressing” is a directory found in the web root directory
  • “URL” is a directory found in the directory called “Addressing”.
  • the page being published is found in the directory “URL” and the page name is “Overview.html”.
  • This general structure applies to all URLs and enables anyone with a web browser to reach information on the Internet.
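  • As a minimal illustration of this structure, the sketch below splits the example URL above into the host, directory and page components just described. Python is used here only for illustration; the patent itself names C++, Visual Basic and PowerBASIC as implementation languages.

```python
from urllib.parse import urlparse

# Example URL from the discussion above: host, directories under the web root, and page name.
url = "http://www.w3.org/Addressing/URL/Overview.html"

parts = urlparse(url)
host = parts.netloc                                  # "www.w3.org": the Internet name of the computer
path_segments = parts.path.strip("/").split("/")
directories, page = path_segments[:-1], path_segments[-1]

print(host)          # www.w3.org
print(directories)   # ['Addressing', 'URL']  -- folders found under the web root
print(page)          # Overview.html          -- the published page
```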
  • any website must at least have a web root directory if it is to be accessed over the Internet. This means that hackers can find any website on the Internet and access web root directories. Once a web root directory is found, hackers can usually use port scanners or other techniques to locate the vulnerable areas of a website and deploy attacks against them or copy them for illicit purposes.
  • file deposition attacks can be used to slow down, to subvert an application, or to completely disable a website or system.
  • a hacker for example, can take an initial, legitimate web page and replace it with a page of the same name that asks for improper actions or allows access to confidential data. This affects the third function of security, namely availability. While a number of technologies such as redundant computer and disk systems have been developed to maintain high availability of systems and networks, sabotage by hackers or others can bring whole systems down.
  • Implicit in these approaches is a defensive posture that tries to build computer systems and networks that are impregnable fortresses. They often fail to take into account the fact that telling an intruder it has been caught and denying access in many cases provides valuable information to the intruder about which of its tools and attack plans are ineffective. The intruder who breaks in for amusement may actually regard these measures as a challenge. The criminal can use them for information.
  • A system for providing electronic security over a network through an extensible positive client identifier (EPCI), working with a positive information profiling system (PIPS), pseudo uniform resource locators (PURLs) to assist in providing data integrity, a virtual page publication system (VPPS), and an active security responder (ASR).
  • EPCI extensible positive client identifier
  • PIPS positive information profiling system
  • PURLs pseudo uniform resource locators
  • ASR active security responder
  • the extensible positive client identifier (EPCI) system creates a unique client identification key and continually self-evaluates the key based on unique system signature data.
  • the positive information profiling system implements account profiles for all content and clients so that pages of information can be generated and matched to the data requested as well as the requestor.
  • the virtual page publication system VPPS of the invention does not store pages permanently in the root directory of the site but instead creates temporary web pages dynamically containing the level of information resulting from the client identification, PIPS, and PURL evaluations.
  • the virtual page is sent, (in encrypted form if this option has been selected or if this option is required by the PIPS profile), to the requestor and exists only for the time necessary to send it.
  • the active security responder (ASR) controls the overall operation of the present invention.
  • FIG. 1 is a block diagram of the present invention.
  • FIG. 2 (Prior Art) is a block diagram of prior art web page technology.
  • FIG. 3 (Prior Art) is a block diagram of typical directories and error messages of the prior art.
  • FIG. 4A is a block diagram of a standard uniform resource locator (URL) of the prior art.
  • FIG. 4B is a block diagram of a Pseudo Uniform Resource Locator (PURL) of the present invention.
  • PURL Pseudo Uniform Resource Locator
  • FIG. 5 is a block diagram of the extensible positive client identifier of the invention in operation at a client terminal.
  • FIG. 6 is a table showing the elements of a Client Identifier Key (CIK) of the present invention.
  • FIG. 7 is a block diagram showing sample numeric values for a client identifier key (CIK).
  • FIG. 8 is a flow diagram of the extensible positive client identifier of the invention.
  • FIG. 9 is a flow diagram of the PURLs processing of the invention.
  • FIGS. 10 and 11 are flow diagrams of the Virtual Page Publication System (VPPS) processing of the invention.
  • VPPS Virtual Page Publication System
  • FIG. 12 is a flow diagram of the setup for the invention at a network location.
  • FIGS. 13 and 14 are flow diagrams of the Positive Information Profiling System (PIPS) of the present invention.
  • PIPS Positive Information Profiling System
  • FIG. 15 is a block diagram of an illustrative screen display used by the PIPS processing of the invention.
  • FIG. 16 is a block diagram showing the invention configured for use by multiple different entities.
  • FIG. 17 is a block diagram of typical contents of a storage mechanism at a client terminal site.
  • FIG. 18 is a block diagram of illustrative Client Identification Keys (CIK) generated by the present invention.
  • FIG. 19 shows tables illustrating different types of security levels used by the present invention.
  • FIG. 1 an overview of the present invention is shown.
  • User computer terminals 00 and 05 are shown connected by public communications lines 30 to the Internet 25 .
  • a website 35 which is accessible over the Internet 25 .
  • website 35 is a host computer controlled by operating system 38 , webserver 37 and the present invention's Active Security Responder (ASR) 36 .
  • Disk storage 40 is shown connected to website 35 and containing only a web root 42 and, optionally, dummy website pages.
  • ASR 36 is in communication over private network lines 10 with another computer 39 , which is running the present invention's pseudo URLs—PURLs 39 a , its positive information profiling system PIPS 39 b and its virtual page publication system VPPS 39 c.
  • ASR 36 controls all the functions of web server 37 , including:
  • The operating system is Microsoft's WINDOWS NT™ system and the web server is Microsoft's INTERNET INFORMATION SERVER (IIS™) server, using PC compatible processors or workstations.
  • a requesting client terminal 00 typically includes a personal computer or workstation controlled by an operating system 01 , a web browser 02 and the present invention's extensible positive client identifier software EPCI 03 communicating over public communications lines 30 with Internet 25 .
  • a terminal 00 or even a host computer 35 can be any device capable of communication over a network to send and receive data—from handheld wireless devices to computer mainframes.
  • the first request sent to it from a terminal is evaluated by ASR 36 as a public request and the appropriate public web page is returned to the client with an instruction for the client to send its client identifier key (CIK) with the next request for information.
  • CIK client identifier key
  • every page sent by ASR 36 will contain the instruction to send a CIK identifier. If the client has no EPCI 03 software installed at its terminal 00 , then ASR 36 's request for the client to send a CIK key is ignored by the other software at terminal 00 , and further communication between that terminal 00 and ASR 36 is public, although as mentioned, all the pages sent by ASR 36 will contain a “send your CIK” instruction.
  • ASR 36 of the present invention will generate a public response page for the first request from terminal 00 , along with instructions to terminal 00 to send its client identification key—CIK—with the next request. If the client at terminal 00 does have EPCI 03 software, it will examine the request to send a CIK to see if the client at terminal 00 has a relationship with the requesting server ASR 36 .
  • If EPCI 03 at terminal 00 determines that there is a relationship (by checking the APK sent by the server against its own copy in its CIK file), then EPCI 03 validates itself, as described in more detail below, and provides ASR 36 with its CIK, thereby allowing ASR 36 to identify the client at terminal 00 .
  • the next and subsequent web pages will be sent to terminal 00 according to the appropriate evaluation of that client's security levels and the levels of the data requested, as will also be described in more detail below. In the embodiment shown, this also means the earlier sent public information will be refreshed with the information the security levels entitle that client to receive.
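  • A minimal sketch of the client side of this exchange might look like the following. The dictionary layout and the names stored_relationships and build_cik are illustrative assumptions, not an API defined by the patent.

```python
from typing import Optional

# Hypothetical sketch of the client-side EPCI behaviour described above.
# Every response from ASR 36 carries a "send your CIK" instruction plus the
# server's APK; EPCI answers only if it already holds a relationship for that APK.

stored_relationships = {
    "APK-1234": {"acak": "ACAK-5678", "cik_file": "cik_apk1234.dat"},  # illustrative values
}

def build_cik(relationship: dict) -> str:
    # Placeholder for the CIK construction illustrated in FIGS. 6 and 7.
    return "CIK(" + relationship["acak"] + ")"

def handle_server_response(page: str, requested_apk: str) -> Optional[str]:
    """Return a CIK to attach to the next request, or None for public traffic."""
    relationship = stored_relationships.get(requested_apk)
    if relationship is None:
        # No relationship with this ASR: the "send your CIK" request is simply
        # ignored and communication stays public.
        return None
    # Relationship found: EPCI re-validates itself (see the self-evaluation
    # sketch later in this document) and replies with a freshly built CIK.
    return build_cik(relationship)

print(handle_server_response("<public page>", "APK-1234"))   # CIK(ACAK-5678)
print(handle_server_response("<public page>", "APK-9999"))   # None: unknown server
```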
  • FIG. 6 shows the typical contents of a client identifier key (CIK) generated by EPCI software 03 .
  • Each field such as the Authorizer Personalization Key (APK) for this relationship, or the Authorizer Client Activation Key (ACAK) for this relationship, is given a numeric value by EPCI software 03 .
  • FIG. 7 illustrates a partial hypothetical CIK 120 in numeric form.
  • CIK 120 contains several items that are unique to this particular hardware and software configuration of user terminal 00 . The effect of this is that the valid client terminal 00 will check itself and identify itself to ASR 36 as a valid client.
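  • Based on the fields named above and in Table T 2 of FIG. 6 (APK, ACAK, system signature, CIKSTATUS), a CIK record might be modelled roughly as follows; the exact field set and encoding are not spelled out in this summary, so the structure below is only indicative.

```python
from dataclasses import dataclass

@dataclass
class ClientIdentifierKey:
    """Rough model of the CIK fields named in Table T 2 / FIG. 7 (illustrative only)."""
    apk: int                 # Authorizer Personalization Key for this relationship
    acak: int                # Authorizer Client Activation Key for this relationship
    system_signature: int    # numeric value derived from the terminal's configuration
    cik_status: str          # CIKSTATUS: result of the EPCI self-evaluation ("PASS"/"FAIL")

# Hypothetical numeric values in the spirit of FIG. 7; 890765 is the volume
# serial number used in the FIG. 18 example.
cik_120 = ClientIdentifierKey(apk=1234, acak=5678, system_signature=890765, cik_status="PASS")
print(cik_120)
```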
  • ASR 36 will treat requests from terminal 13 according to the host server's security policy for impersonators.
  • the security policy selected uses a dummy website containing innocuous public information to satisfy any more requests from the hacker at terminal 13 . In this example, it might appear to the impersonator at terminal 13 of FIG. 1 that he or she is in communication with a website 50 , which is serving webpages stored on dummy disk 55 or on disk 40 .
  • the embodiments shown enable an entity to implement security policies which do not reveal the detection of impersonation to the impersonator.
  • the present invention augments security policies that make it appear to an interloper at almost every step that he or she has been successful, when, in fact, the opposite is true.
  • policies such as informing the intruder that he or she has been detected and denying access could be implemented as well at various stages without deviating from the spirit of the invention. For example, a user might wish to deny access in a manner visible to the interloper without indicating to the interloper that he or she has been detected.
  • In FIG. 4A (Prior Art), a standard URL 110 is shown.
  • This particular URL points to the location “overview.html” which is the address of a web page stored on disk 40 of an ordinary web server.
  • FIG. 4B shows a pseudo-URL, PURL 39 a of the present invention which appears identical to the standard URL of FIG. 4A (Prior Art), but does not point to any web page at all. Instead, it comprises a list of tasks stored in a private location to be performed in response to this request and the user's and the data's profiles.
  • the present invention includes a positive information profiling system PIPS, which enables the entity using the invention to create an account profile for all content and all clients so that data can be matched to requests for information.
  • PIPS is described in more detail below.
  • During PURLs evaluation, PURLS 39 a interacts with PIPS 39 b to determine what information can be sent to a particular requesting client. If the request from the user's PC terminal is a valid request for employee salary data, from a current employee, then the client profile for that employee might indicate that he or she has read-only access to his or her own salary data.
  • PURL 39 a may apply the corporation's profile for that employee to deny access to the CEO's salary, and instead supply the requesting employee's salary data. It would appear to the employee that the CEO and the employee have the same salary, when this is not so.
  • PURLs can be used to select innocuous public data to be returned to a detected interloper, so that the interloper may be led to believe he or she has successfully breached the site. For example, if the interloper requests the CEO's salary data and the entity owning the website is a publicly held company, the latest publicly known data about the CEO's salary might be displayed to the interloper, while the CEO requesting his or her own salary might be given the most current values.
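  • A rough sketch of this PURL resolution and profile matching is shown below; the paths, profile values and returned strings are hypothetical and merely stand in for the PIPS account profiles described above.

```python
# Illustrative sketch of PURL resolution: the requested "URL" is not a file path
# but a key into a privately stored task list, and PIPS profiles decide what the
# requesting client actually receives. All names and profile values are hypothetical.

purl_tasks = {
    "/salaries/ceo.html": ["lookup_salary"],   # looks like an ordinary page address
}

client_profiles = {
    "employee-42": {"salary_scope": "self"},
    "ceo":         {"salary_scope": "all"},
    "public":      {"salary_scope": "published_only"},
}

def resolve(purl: str, client_id: str) -> str:
    tasks = purl_tasks.get(purl, [])
    scope = client_profiles.get(client_id, client_profiles["public"])["salary_scope"]
    if "lookup_salary" in tasks:
        if scope == "all":
            return "current CEO salary"
        if scope == "self":
            # Access to the CEO figure is silently replaced by the requester's own data.
            return "requesting employee's own salary"
        return "last publicly reported CEO salary"   # innocuous data for a detected interloper
    return "public page"

print(resolve("/salaries/ceo.html", "employee-42"))
print(resolve("/salaries/ceo.html", "public"))
```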
  • virtual page publication system VPPS 39 c of the invention provides the pages or content to be served in response to the request as determined by the PURLs 39 a and PIPs 39 b evaluations of the request and the data.
  • VPPS 39 c of the invention generates the proper responses and stores them as temporary pages or data on disk 40 , which is accessible to web server 37 .
  • the virtual page is sent to the requesting client as the requested URL (and in encrypted form, if appropriate) and deleted from the system 35 and its associated storage disk 40 .
  • the source information used to generate virtual pages is stored at a location inaccessible to the web. In the embodiment shown this is computer 39 of FIG. 1, which is connected by a private network connection 10 to host computer 35 .
  • RAM: Random Access Memory
  • CPU: central processing unit
  • Alternatively, the information need not be stored on disk at all but can simply be sent directly from the computer's internal RAM (for example, from a RAM disk).
  • Those skilled in the art will appreciate that various types of media can be used for storing information other than those mentioned here. Magnetic Tapes, for example, or RAID disk systems, writeable CD-ROM disks, or flash memory and so on could be used.
  • VPPS 39 c keeps two security vigils. First, it checks to see if there are any files, other than the web root and the dummy security pages (if present) that are more than some specified amount of time old. If it finds such a file, VPPS 39 c can be directed by the security policy for that host to either delete such a file completely or move it to an “isolation ward” area specified by the user so it can be checked. This significantly lowers the risk of successful file deposition attacks on the web site.
  • Files other than the ones that are supposed to be there are either deleted or, in effect, moved into “quarantine” and deleted from disk 40 .
  • The invention also checks all files on disk 40 to see if the valid ones have been changed in that predefined interval as well. If they have been changed, they may have been corrupted, so they, too, are either deleted or moved into isolation areas and, if so specified, the last valid content substituted for them. If the modified files so detected are those belonging to the web root or dummy pages, ASR 36 of the invention can also be guided by the security policies for the website in the handling of them. If no time has been specified by the user for the predefined interval, a default time, such as 60 seconds, is used.
  • the second type of security vigil carried out by VPPS 39 c is for any new folder or directory created on the relevant storage disk(s). In the embodiments shown, these are deleted or moved to quarantine immediately, as soon as they are detected, without waiting for any interval.
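  • The two vigils might be sketched as a periodic sweep along the following lines; the paths, the expected-file list and the 60-second threshold are illustrative assumptions rather than values given by the patent.

```python
import os
import shutil
import time

WEB_ROOT = "/var/www/webroot"                 # illustrative paths and policy values
QUARANTINE = "/var/security/isolation_ward"
EXPECTED = {"index.html", "dummy1.html"}      # web root page plus optional dummy pages
MAX_AGE_SECONDS = 60                          # default interval mentioned above

def sweep_web_root() -> None:
    """One pass of the two VPPS vigils described above (sketch only)."""
    os.makedirs(QUARANTINE, exist_ok=True)
    for name in os.listdir(WEB_ROOT):
        path = os.path.join(WEB_ROOT, name)
        if os.path.isdir(path):
            # Second vigil: new folders are removed immediately, with no grace period.
            shutil.rmtree(path)
            continue
        age = time.time() - os.path.getmtime(path)
        if name not in EXPECTED and age > MAX_AGE_SECONDS:
            # First vigil: unexpected, stale files are quarantined (or deleted outright,
            # depending on the host's security policy). Modified "valid" files would be
            # handled the same way, with the last valid content substituted if required.
            shutil.move(path, os.path.join(QUARANTINE, name))

if __name__ == "__main__" and os.path.isdir(WEB_ROOT):
    sweep_web_root()
```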
  • While computer system 35 is shown in FIG. 1 as a single website host computer, the present invention can be used to manage security for several different networks and host computers at one or more locations. This is shown more clearly in FIG. 16.
  • one computer site 35 f might be a multiple website host which provides web services to companies x, y and z, over private networks 10 x , 10 y and 10 z .
  • disk 40 contains web roots x, y, and z for the respective companies.
  • Terminal 00 might be a terminal for an employee of company x
  • terminal 05 might be one for an employee of company y.
  • ASR 36 of the present invention establishes security for each company's website by building and managing the security relationships between the company information stored off the Internet and valid clients such as employees at terminal 00 making requests over the Internet.
  • Each company uses its copy of ASR 36 (ASRx, ASRy or ASRz) to define its own security policies, access levels and procedures. While the examples discussed so far are directed to the Internet, those skilled in the art will appreciate that the present invention can also be used in private networks, such as internal corporate networks, or other forms of network systems.
  • ASR 36 can also be installed on several machines at different sites for handling security for just one entity, as well.
  • ASR 36 creates no visible difference to the outside world.
  • the websites or locations using its services will generally appear the same to external clients or requestors as they would if the invention were not installed.
  • each entity using the invention does need to allocate security levels to information sources.
  • This allocation of security levels is done through the profiling system PIPS 39 b .
  • each entity x, y, or z is able to create its own secure profile information on its own systems which are not directly exposed to the Internet.
  • each entity using the invention installs ASR 36 software at the website host it is using (if the ASR 36 software is not already present) and the invention's EPCI 03 software at each client terminal 00 which is to be allowed access beyond the publicly available data.
  • A block diagram of the setup of ASR 36 is illustrated.
  • The ASR 36 software is set up for this entity.
  • each entity's copy of ASR 36 is given an authorized server activation key (ASAK).
  • An ASAK is a unique string of 48 or more characters supplied with the product license for that entity. It is used to activate the configuration of the invention for that entity.
  • a request to enter the ASAK is made when the ASR 36 product is first used.
  • a further request is made of the purchasing entity to enter 30 or more characters of the entity's own devising. This is called the client confidence key, or CCK. In the embodiments shown, this could be any kind of key which the corporate entity believes will assist in uniquely identifying it.
  • this client confidence key CCK is encrypted as it is entered and kept in encrypted form by ASR 36 . This means that ASR 36 does not “know” the unencrypted form of the CCK, and thus minimizes the risk of “backdoor” access to protected information even by the licensor of the product.
  • ASR 36 will generate its own authorized personalization key (APK) for this corporate entity by combining the ASAK and the CCK.
  • APK authorized personalization key
  • The APK provides a unique basis for encryption for communication with clients of that ASR 36 , if desired.
  • The user entity (here corporation x, y or z) is required to store a copy of its CCK and ASAK for use if there is ever a need to re-install the system. If the APK that is generated for an ASR 36 is ever changed, then none of the clients will have privileged access until they have all been issued with new ACAKs based on the changed APK.
  • An authorized client activation key (ACAK) must be generated for each client and will contain the APK and a unique identifier for that client.
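  • The key chain described above (an ASAK and a CCK combined into an APK, then an ACAK per client) could be sketched as follows. The patent says only that the keys are "combined"; the SHA-256 hashing and the string formatting below are assumptions made purely for illustration.

```python
import hashlib

# Sketch of the ASAK/CCK/APK/ACAK key chain described above (illustrative only).

asak = "X" * 48                                          # server activation key supplied with the licence
cck = "corporate-secret-phrase-of-30-or-more-chars"      # client confidence key chosen by the entity

apk = hashlib.sha256((asak + cck).encode()).hexdigest()  # authorized personalization key

def make_acak(apk_value: str, client_id: str) -> str:
    """ACAK = the APK plus a unique identifier for the client (illustrative encoding)."""
    return f"{apk_value}:{client_id}"

print(make_acak(apk, "client-0001"))
```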
  • A new client is issued client identification key (CIK) generation software and is also given its unique ACAK.
  • CIK client identification key
  • When the client first runs the CIK generation software at its client terminal 00 , it is prompted to enter its ACAK.
  • the CIK generation software at that client terminal 00 then creates a unique CIK 120 for that client terminal and exits. No further client activation is required.
  • this procedure may be repeated by the user for relationships with any number of different servers such as ASRy or ASRz of FIG. 16.
  • the same CIK generation software must be used but the ACAK for each different server ASR 36 must be different.
  • FIG. 17 some of the elements that can be used by the present invention's EPCI 03 software installed at terminal 00 to create a CIK 120 are shown.
  • a client who is using the EPCI 03 software at his or her client terminal 00 is also using a personal computer C 00 as his or her terminal 00 .
  • a disk D 00 is shown which is attached to personal computer C 00 .
  • Disk D 00 contains a directory Dir 00 , which contains information about this particular computer C 00 and its installed hardware and software components. For example, at line 001 information such as the central processor unit (CPU) serial number of computer C 00 is stored, along with identification about the latest Read-Only-Memory (ROM) Revision made to that CPU.
  • CPU central processor unit
  • ROM Read-Only-Memory
  • this example shows at line 002 that computer C 00 has 32 megabytes of memory built-in and has configured its memory management to treat that as 128 megabytes of virtual memory.
  • the volume serial number of disk D 00 is given at line 003 , along with an indicator of its type—HD for hard drive, as distinguished from removable media drives, such as floppy disk drives.
  • Dir 00 also indicates at line 004 that this computer is using Operating System version 9.6 which was installed on Nov. 9, 1999.
  • Dir 00 also shows, at line 005 , that sound capability from ABC sound has been installed, with its version number and serial number.
  • the particulars of the type of video display are given.
  • the CIK generation software of the present invention makes use of some or all of this kind of information and more to create a client identification key, CIK 120 , that is unique to this particular user's installation, as illustrated in FIG. 6's Table T 2 .
  • the first access from this client terminal 00 is usually a public access, in which terminal 00 does not send a CIK.
  • ASR 36 decides whether a relationship with terminal 00 has been established by the setup processes described above. If a relationship has been established, ASR 36 will automatically refresh the web page previously sent as a public page to obtain the correct CIK 120 from the client at terminal 00 .
  • EPCI 03 receives a “send your CIK” request from ASR 36 running on the host/server machine. In the embodiments shown, every request for a page component is answered with an appropriate version of that page component plus a request for the client to send its CIK 120 . The request also contains the APK for that particular server ASR 36 .
  • EPCI 03 checks to see if there is a relationship with that particular server ASR 36 . It does so by taking the APK sent with the request and comparing it with its own copy contained in the client's CIK file created by the CIK generation software.
  • a client will have the APK from each server ASR 36 that provided it with an ACAK. If there is no match, and therefore, no relationship, EPCI 03 does nothing, at step 210 . However, if there is a relationship, then EPCI 03 proceeds to step 215 to carry out any instructions from the host running the requesting ASR 36 .
  • the instructions can be as simple as “reply with CIK”, to commands to run several programs or tasks and then reply with CIK. That is, the step of carrying out instructions from ASR 36 sent from the host server machine can be used to install new software at terminal 00 , run other software, delete software or data, and so on. This step is not restricted solely to security checking. This feature provides entities using ASR 36 with significant options for communicating with or controlling the remote terminals.
  • a system signature is some extensible combination of the unique information stored locally at terminal 00 . That is, some combination of the information described in FIG. 17 about the particular configuration of terminal 00 is used to create the system signature.
  • the system signature for this kind of computer C 00 might include the serial number of the hard drive shown at line 003 , the installation date of the operating system shown at line 004 , the sound card serial number shown at line 005 , and the version number of the DRAWPGM, shown at line 008 .
  • Different types of computers might have different system signatures.
  • EPCI 03 provides positive identification of the client machine being used.
  • a system administrator might wish to further authenticate the person using that machine by adding a logon and password field to the system signature, for consistency with other internal procedures.
  • other elements can be used by the present invention to form a unique system signature, such as smart card data or biometric identifiers, and so on.
  • EPCI 03 checks at step 225 to see if the new system signature is the same as the old system signature stored in its CIK file. If it is, a new client identification key CIK 120 is created at step 230 , indicating that the self-evaluation done by EPCI 03 on this machine passed the test, and at step 240 the newly created CIK 120 is passed to ASR 36 . If the system signature is not the same as the old one from a valid client, a new CIK 120 is created whose CIKSTATUS field records the failure (see CIKSTATUS at Table T 2 of FIG. 6), and the new CIK 120 is passed to ASR 36 at the host. Note that this self-evaluation does not notify anyone at terminal 00 of the pass or fail status. In other words, the self-evaluation is done “silently”, as it were.
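  • The silent self-evaluation might be sketched as follows, using configuration items of the kind listed in FIG. 17 (hard-drive volume serial, operating-system install date, sound-card serial, DRAWPGM version). The hashing step and the sample values other than the 890765 volume serial are illustrative assumptions.

```python
import hashlib

# Sketch of the "silent" EPCI self-evaluation described above (illustrative only).

def system_signature(config: dict) -> str:
    material = "|".join(f"{key}={config[key]}" for key in sorted(config))
    return hashlib.sha256(material.encode()).hexdigest()

stored_signature = system_signature({
    "volume_serial": 890765, "os_installed": "1999-11-09",
    "sound_serial": "ABC-123", "drawpgm_version": "2.1",
})

def self_evaluate(current_config: dict) -> dict:
    """Build a new CIK whose CIKSTATUS records whether the signature still matches."""
    signature = system_signature(current_config)
    status = "PASS" if signature == stored_signature else "FAIL"
    # No message is shown at terminal 00 either way; the result travels only
    # inside the CIK returned to ASR 36.
    return {"CIKSTATUS": status, "system_signature": signature}

same_machine = {"volume_serial": 890765, "os_installed": "1999-11-09",
                "sound_serial": "ABC-123", "drawpgm_version": "2.1"}
copied_key_elsewhere = {"volume_serial": 111111, "os_installed": "2000-01-01",
                        "sound_serial": "XYZ-999", "drawpgm_version": "2.1"}
print(self_evaluate(same_machine)["CIKSTATUS"])           # PASS
print(self_evaluate(copied_key_elsewhere)["CIKSTATUS"])   # FAIL
```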
  • In FIG. 18, examples of passing and failing CIKs 120 are shown.
  • the volume serial number of the hard drive D 00 attached to terminal 00 is shown in system signature S 1 as 890765, which matches the one shown in FIG. 17.
  • a PASS indicator E 1 is inserted into new CIK 120 - 1 .
  • When ASR 36 receives a request for a data locator—in the embodiment shown, in uniform resource locator format—and a CIK 120 from terminal 00 , it passes that information to pseudo uniform resource locator processing PURLS 39 a .
  • the flow diagram of FIG. 9 illustrates the processing performed by PURLS 39 a .
  • PURLS 39 a for this entity receives the client request from terminal 00 , in this example.
  • PURLS 39 a extracts the client's CIK and passes that to the present invention's PIPS 39 b program.
  • PIPS 39 b performs a number of identity checks at decision blocks 450 , 455 , 460 , 480 , and 485 , checking the CIKSTATUS, system signature, ACAK, Session ID, and APK information. In the embodiments shown, reliance solely on the client CIK may not be sufficient for detecting a skilled hacker who learns how to construct a CIK. Checking other items, such as the session id, provides additional safeguards. If the information fails any of these checks, PIPS 39 b proceeds to step 465 . At 465 , since identification has failed on one or more of the checks, the security logs of the present invention are updated. At step 470 the site identity is set to public for this response by PIPS 39 b . Finally, now that an identity problem has been logged, PIPS 39 b carries out the security policy actions which the user has specified for the particular type of error detected.
  • PIPS 39 b at step 490 updates its audit logs, and then evaluates client group, security group and security level data at step 495 .
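  • The server-side checks at decision blocks 450 through 495 could be sketched roughly as follows; the field names and helper functions are placeholders rather than the patent's actual implementation.

```python
# Server-side sketch of the PIPS identity checks at decision blocks 450-485 and the
# follow-on steps 465-495 (illustrative only).

def update_security_log(cik: dict) -> None: print("security log updated")
def apply_security_policy(cik: dict) -> None: pass          # user-defined action per error type
def update_audit_log(cik: dict) -> None: pass
def evaluate_security_levels(cik: dict) -> str: return "authorised"

def pips_evaluate(cik: dict, session: dict, expected: dict) -> str:
    checks = [
        cik.get("CIKSTATUS") == "PASS",
        cik.get("system_signature") == expected["system_signature"],
        cik.get("acak") == expected["acak"],
        session.get("session_id") == expected["session_id"],
        cik.get("apk") == expected["apk"],
    ]
    if not all(checks):
        update_security_log(cik)           # step 465: record the identity problem
        apply_security_policy(cik)         # carry out the user's policy for this error
        return "public"                    # step 470: serve only the public site identity
    update_audit_log(cik)                  # step 490
    return evaluate_security_levels(cik)   # step 495: client group, security group, level

expected = {"system_signature": "sig-1", "acak": "ACAK-5678",
            "session_id": "S-1", "apk": "APK-1234"}
good_cik = {"CIKSTATUS": "PASS", "system_signature": "sig-1",
            "acak": "ACAK-5678", "apk": "APK-1234"}
print(pips_evaluate(good_cik, {"session_id": "S-1"}, expected))   # authorised
```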
  • From FIG. 19 it can be seen that the present invention allows each page and each client to have a predefined security level.
  • the security level of a page must be matched by that of its intended recipient in order for the page to be published.
  • In FIG. 19, some examples of access levels are shown.
  • Inclusive access table IN 00 illustrates a structure in which a higher level of security automatically includes all lower levels. Thus if a client has access level 2 , in table IN 00 , it will automatically have access to levels 0 and 1 as well.
  • In the second, exclusive type of access table, a client may have access to only one or two levels, but not to any others. For example, a client with access level B only has access to page level B. A client with access level AD has access only to page levels A and D. These two types could also be used in various combinations to provide additional security options.
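  • The two styles of access table might be expressed as simple checks like the following sketch, where an inclusive level grants all lower levels and an exclusive set grants only the listed levels.

```python
# Sketch of the two access-table styles of FIG. 19: inclusive levels, where a higher
# level automatically grants all lower ones, versus exclusive sets of explicit levels.

def inclusive_allows(client_level: int, page_level: int) -> bool:
    return client_level >= page_level            # level 2 also grants levels 0 and 1

def exclusive_allows(client_levels: set, page_level: str) -> bool:
    return page_level in client_levels           # access only to the explicitly listed levels

print(inclusive_allows(2, 1))                    # True
print(exclusive_allows({"A", "D"}, "B"))         # False: an "AD" client sees only A and D
```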
  • In FIG. 15, a screen display of an account profile for a client is shown.
  • a screen display for each page or section of the website can be used to create an account profile of the data secured by the present invention.
  • the requested PURL sent by the client terminal 00 is extracted from the request and, at step 270 , is sent back to PIPS 39 b for processing.
  • This portion of PIPS 39 b processing is diagrammed in the flow diagram of FIG. 14.
  • PIPS 39 b selects the task(s) (contained in the request) that meets the client group, security group and security level data.
  • a PURL in the present invention is not the address of data or pages, as ordinary URLs are, but identifies a list of tasks to be performed.
  • PIPS 39 b carries out those task(s) and constructs a list of information components which will eventually be displayed in a web page or similar result and then, at step 515 , PIPS 39 b passes this information back to PURLS 39 a.
  • PURLS 39 a receives this information and finds this list of valid components at step 275 .
  • this list is passed to VPPS 39 c of the present invention.
  • In FIGS. 10 and 11, the virtual page publication system VPPS 39 c processing is shown.
  • the list of valid components is received from PURLS 39 a .
  • VPPS 39 c prepares a temporary file containing one or more web pages and sends a copy of that temporary file to the web root (web root 42 of disk 40 in FIG. 1).
  • VPPS 39 c passes the name of that temporary file to the webserver (webserver 37 in FIG. 1), which will cause that page(s) to be delivered to the client as the requested data.
  • VPPS 39 c deletes the temporary file from the web root as soon as the data has been sent on its way.
  • VPPS 39 c waits, at step 580 , some predefined interval specified by the user.
  • the interval is usually some small multiple of the “ping time” for an average use.
  • the TCP/IP protocol allows a terminal to send data to a server and measure the time it takes to get to the server, usually a few milliseconds. If the ping time is 10 milliseconds, the interval specified by the user to VPPS 39 c might be 20 milliseconds. Usually it is an interval that is just long enough to let a valid message go through and the temporary file stored on the web root to be deleted.
  • VPPS 39 c checks at step 590 of FIG. 11 to see if any new file has been created on the disk containing web root 42 . If VPPS 39 c determines that a new file has been created and stored there, it is automatically assumed to be suspect. Depending on the security policy for the website, VPPS 39 c will either delete the file completely at step 595 or move it to a “safe” area, isolated from both the Internet network and the user's internal network. In this way, file deposition attacks can usually be detected and dealt with immediately. VPPS 39 c also checks to see if any of the legitimate files on the web root have been modified.
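  • The publish-send-delete cycle and the deposited-file check might be sketched as follows; the web-root path, the 10-millisecond ping figure and the send_to_client callback are illustrative assumptions.

```python
import os
import tempfile
import time

WEB_ROOT = "/var/www/webroot"            # illustrative path
PING_TIME_SECONDS = 0.010                # the 10 ms ping example from the text
WAIT_INTERVAL = 2 * PING_TIME_SECONDS    # a small multiple of the ping time

def publish_virtual_page(components, send_to_client) -> None:
    """Sketch of the VPPS publish-send-delete cycle described above."""
    page = "\n".join(components)                       # assemble the valid components
    fd, temp_path = tempfile.mkstemp(suffix=".html", dir=WEB_ROOT)
    with os.fdopen(fd, "w") as handle:
        handle.write(page)                             # temporary page lands in the web root
    send_to_client(os.path.basename(temp_path))        # webserver 37 delivers it as the requested URL
    os.remove(temp_path)                               # the page exists only while being sent
    time.sleep(WAIT_INTERVAL)                          # let a valid reply clear the web root
    # Anything left in (or newly deposited into) the web root after the interval is
    # suspect and is deleted or moved to an isolated area, per the site's policy.

# Example use (assumes WEB_ROOT exists and send() delivers the file over HTTP):
# publish_virtual_page(["<html>employee salary page</html>"], send)
```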
  • the present invention ensures that all users of the network or system it monitors are managed and monitored throughout the duration of their sessions with the server at the host computer and that the information provided to them is appropriate to their pre-defined status.
  • EPCI 03 authenticates the CIK at every access and web page component.
  • the system as a whole is scalable for any number of client entities or number of relationships. It also verifies its own integrity, and reports success or failure through the audit and security logs. It can be used in combination with other security measures such as VPNs, Secure Socket Layer (SSL) technology which encrypts data sent between client and server computers, and X.509 Digital Certificates or Digital Ids.
  • SSL Secure Socket Layer
  • the client identification key is self-checking and aware of its environs so it ensures that if the client identification key is copied and used on another terminal, it will fail, and report the type of failure.
  • the client identification key responds only to a server with which the client has a known and agreed upon relationship. With the embodiments shown, clients do not need to take special actions such as using logon or passwords.
  • ASR 36 requires that a client must first be enrolled on the secure system by creating a client account as described above.
  • The present invention is implemented in the C++, VISUAL BASIC™ (from Microsoft, Inc.), and POWERBASIC™ (from PowerBASIC, Inc. of Carmel, Calif.) languages, but those skilled in the art will appreciate that it could also be implemented in other languages such as Perl, C, Java and so on.
  • While the embodiments shown are implemented in software, part or all of the invention could also be embodied in firmware or circuitry, if desired.
  • While the embodiments shown are directed to use with networks and systems using the TCP/IP protocol, other network or system protocols could be used.

Abstract

A system and method that provides electronic security over a network through an extensible positive client identifier (EPCI), working with a positive information profiling system (PIPS), pseudo uniform resource locators (PURLs) to assist in providing data integrity, a virtual page publication system (VPPS), and an active security responder, (ASR). The extensible positive client identifier (EPCI) creates a unique client identification key and continually self-evaluates the key based on unique system signature data. The positive information profiling system implements account profiles for all content and clients so that pages of information can be generated and matched to the data requested as well as the requester. The virtual page publication system VPPS of the invention does not store pages permanently in the root directory of the site but instead creates temporary web pages dynamically containing the level of information resulting from the client identification, PIPS, and PURL evaluations. The virtual page is sent, (in encrypted form if this option has been selected or if this option is required by the PIPS profile), to the requestor and exists only for the time necessary to send it. The active security responder (ASR) controls the overall operation of the present invention.

Description

    BACKGROUND OF THE INVENTION

    TECHNICAL FIELD
  • The present invention relates generally to the field of providing security for a location in a network and more particularly to an extensible positive client identification system and method. [0001]
  • BACKGROUND
  • The Worldwide Web (web), web browser, and email technologies have transformed the Internet public telecommunications network into a tool for everyday use. While businesses have used a variety of computer and private network technologies for several decades, often creating valuable databases and internal files in the process, web technologies have now made it possible for businesses to use such corporate data on the Internet for competitive advantage. Commercial transactions that used to be done through face to face meetings and negotiations, for example, can now be done electronically via the Internet—at least in theory. In practice, the more significant the transactions are, and the more sensitive the data involved, the more likely it is that security on the Internet (or any network) becomes a problem. [0002]
  • Ideally, electronic security addresses three requirements: [0003]
  • 1. Confidentiality—the prevention of the unauthorized disclosure of information; [0004]
  • 2. Integrity—the prevention of the unauthorized modification of information; and [0005]
  • 3. Availability—the prevention of the unauthorized withholding of information. [0006]
  • In practice, current methods tend to fall short of the degree of certainty or comfort needed in one or more of these areas for many commercial or higher risk transactions. [0007]
  • Confidentiality, for example, begins by identifying the requestor of confidential information. This, in turn, means not only identifying a valid requester, but also detecting when an imposter or thief is impersonating a valid requester to gain access to confidential information. In many cases it is also true that a valid requestor may only be authorized to have access to a particular level of information. An employee database, for example, which contains salary information may have several different levels of access. An individual employee may only be authorized to access his or her salary information, while the head of the personnel department may have access to all salary data. Non-employees may be denied access to any employee data—hence the importance of identification. [0008]
  • Data integrity is required to safeguard the data being requested. Computer hackers (those who seek to break through security safeguards either for amusement or theft), may try to corrupt data at the host computer by seeding computer viruses (programs that destroy files and data at the host site), corrupting data, replacing data with false information or by depositing “trojans”—software that appears to be useful but in fact does harm. Hackers can also try to intercept and corrupt data as it is being transmitted to a remote site. After transmission, a hacker may try to corrupt the data stored at the remote site. [0009]
  • Availability means simply that information should not be withheld improperly when it is requested. Many factors can affect availability over a network, such as hardware malfunction, software malfunction, data corruption, or the failure or slowing down of communications links. [0010]
  • While there are some existing measures and tools designed to address computer and network security, many of these have significant weaknesses. For example, one of the most popular methods of user identification for computers and networks is the use of a logon name and password. As seen in FIG. 2 (Prior Art), a computer user at a personal computer terminal 00 may want to connect over private network lines 10 to communicate with another user at terminal 02 within the private network. Computer software allows the user at terminal 00 to log onto the computer by using a dialogue screen that requests his or her user name and password. For a hacker to “crack” or break this kind of system thus requires knowledge of a valid user name and password combination. [0011]
  • The logon name and password approach has a number of weaknesses. First, logon names are usually very easy to discover. Many organizations select a standard format for them based on the user's real identity. Fred Smith, for example, may be given a logon name of “fsmith” or “freds”. A hacker familiar with a user's real name may find it easy to deduce this kind of logon name. Many computer systems that require logon names also have default settings that are used when the system is first configured. Many users simply keep these default account names. Thus, a hacker familiar with the NT™ operating system provided by Microsoft, Inc. of Redmond, Wash., might try the ‘Administrator’ account. Default account names and passwords greatly reduce the amount of work required for the hacker to gain illicit entry to a system. Hackers may use software attacks to obtain passwords by copying password files. [0012]
  • Users often reveal their passwords accidentally by writing them down or by being observed during password entry. Some may deliberately disclose their passwords to a colleague so he or she can carry out a task on the user's behalf. Others will use the names of pets, family members, birthdays, etc., in order to make them memorable. [0013]
  • Unfortunately, this also makes them easier for others to guess. Most computer systems allow an administrator to define the type of passwords to be used. However, the more complex the requirements are, the more likely users are to write their passwords down and display them conspicuously near the terminal, simply because they cannot remember them. [0014]
  • Many organizations have relied on the logon name and password approach for their internal networks, because for most of these organizations, most potential hackers are internal employees who are not likely to do significant damage to the corporation. However, as these organizations allow access from outside the company, using the Internet 25 of FIG. 2 (Prior Art)—or other networks—sole reliance on logon names and passwords can ultimately lead to a total breach of security and all its consequences. [0015]
  • Some corporations have also used hardware keys (also known as “dongles”) connected to each computer terminal to identify users and prevent unauthorized access. While this is an improvement over the simple logon name and password approach, these can usually be circumvented fairly easily by a hacker who examines what the hardware key does and emulates it in software. [0016]
  • Digital Identifiers (Digital IDs), Digital Certificates and Trusted Third Party Certificate Authorities (TTPCA) are more sophisticated methods used in the industry to enhance identification and security over the Internet. There are various industry standards associated with this technology, the most notable at this time being the ITU-T X.509 version 3 standard. For the purposes of this discussion, the terms Digital ID and Digital Certificate are used interchangeably. A Digital Certificate is a series of characters containing an identifier and usually other verification information. The certificate or id may be stored in a computer file—as seen in FIG. 2 (Prior Art), at disk 03 connected with a computer terminal 02—or on some other memory device such as a smart card. When the id is read by the appropriate software, it is possible to use that id for identification purposes. Usually these ids are constructed in such a way that if they are tampered with and any of the characters are changed, the reading software will detect this and inform the requesting software. Thus, the techniques currently in use are sophisticated enough to ensure that a certificate is complete and unaltered, and they also provide an excellent basis for encryption of information. [0017]
  • However, digital certificates can be copied from a computer terminal 02 such as the one shown in FIG. 2 (Prior Art), and used to impersonate the user. They can also be stolen remotely while the user is using the Internet. For example, a hacker at terminal 13 can use the Internet 25 and communications networks 30 and 10 to find and copy a certificate stored on a disk at personal computer terminal 02. [0018]
  • Trusted Third Party Certificate Authorities (TTPCA) can be used to create and issue digital certificates for a company. To obtain a certificate from a certificate authority usually requires proof of identity. The certifying authority then uses its own digital certificate to generate one for the requestor. The degree of stringency and cost varies from authority to authority. At the highest levels of security, it can take several months to obtain one, and require high levels of proof of identity as well as expense. Certificates for large corporations for example, can cost as much as $10,000 USD. At the other extreme, some companies will issue them for as little as $10 and require no proof of identity. [0019]
  • If a user holds a certificate and believes it may have been stolen or compromised, the user informs the certificate authority, which will usually revoke the current certificate and issue a new one. Certificates thus offer a higher degree of protection, but are still fairly vulnerable, either through copying or interception of transmissions. In theory, a check should be made with the appropriate Certificate Authority before the customer relies on the certification. The Certificate Authority might have already revoked the certificate. In practice this is a step that many application programs fail to take when certificates are used. Detection of the theft or interception may not take place until after some significant damage has occurred. [0020]
  • As mentioned above, smart cards can also be used to enhance identification. Some of these are similar to magnetic stripe credit cards, which can be read by insertion or swiping in a card reader. Smart cards are usually used in conjunction with some other type of user input, such as a name, password, or Personal Identification Number (PIN). The simplest cards are low cost but may be easily duplicated. More complex smart cards have built-in data storage facilities and even data processing facilities in the form of embedded computer chips allowing additional user information to be stored, thus providing a higher level of user verification. These tend to cost more and be more difficult to duplicate. The most secure cards have very sophisticated verification techniques but carry a higher cost per individual user. [0021]
  • Another method of identification uses simple fixed system component serial numbers. In the example of FIG. 2 (Prior Art), the manufacturer (such as Intel) of a personal computer's processor chip, such as that in terminal 05, may have embedded a serial number in the processor. This number can then be read to identify that particular personal computer terminal 05. While this tends to be much more specific at identifying a terminal, it also raises privacy questions, since the terminal 05 can be identified by anyone using appropriate methods over the Internet. This has led to the creation of a software program that switches off the serial number facility. This approach to identification thus creates some concerns about privacy and also about the ability for the feature to be switched on and off without the user's knowledge. [0022]
  • Along similar lines, Internet Protocol (IP) addresses can be used for identification with systems using the Transmission Control Protocol/Internet Protocol (TCP/IP) communication protocols of the Internet. To be part of such a TCP/IP network requires that each computer have a unique IP address, using a specified format. Each IP computer, in turn, is a member of a domain. Domains can be part of another network, as a subnet, or can even contain subnets. These IP properties are exposed during every network access. Basic firewall systems 15, as seen in FIG. 2 (Prior Art), use these properties to allow or refuse access to a computer system. Computer users of the AMERICA ONLINE™ (AOL™) internet service, from America Online, Inc. of Dulles, Va., for example, are all members of the AOL™ domain. [0023]
  • Many companies and Internet Service Providers (ISP) such as AOL™ only allow Internet access through a proxy server. The IP address that appears when proxies are used will be that of the proxy server machine. For users of AOL, for example, AOL™'s proxy server IP address will be the only IP form of identification for the many millions of users. This is not conducive to discrete identification. [0024]
  • Proxy servers may also be used by hackers to reach a user's computer. Hackers, for example, can impersonate IP addresses until they find one that works for their purposes. [0025]
  • Biometric identification techniques are now becoming available, such as fingerprints, voiceprints, DNA patterns, retinal scans, face recognition, etc. While the technology exists in many cases to use this type of information, it is usually not presently available in a practical form or is too expensive for many applications. Many hackers will simply view it as a challenge to find ways to copy, intercept, or fake these forms of identification. [0026]
  • In addition to the identification problems outlined above, companies seeking to use the Internet and the web for commercial purposes also need to control the creation, modification and deletion of data that is requested or used on a website or network location. In the example shown in FIG. 2 (Prior Art), a corporate website 35 (usually composed of a computer system, operating system software, webserver software and web application software) may have valuable confidential information stored on local memory such as disk 40. [0027]
  • Computers hold programs and data in objects usually called files or data sets. As seen in FIG. 3 (Prior Art), files 72 and 74 are usually organized logically in folders 70 that are, in turn, referenced by directories 65—all of which is stored physically on local memory such as disk 40. In most file structures provided by present day operating systems, folders can also be placed inside other folders or directories, allowing files to be logically grouped together on a disk 40, just as they might be stored in cardboard folders in file cabinets if they were physically kept on paper. The authority to use a computer's files is based on identification of the user and the permissions and rights that have been given to that user, usually by a system administrator. For example, as seen in FIG. 3 (Prior Art), an operating system might have the scope of permissions and rights outlined in table T1. For these files a user might be denied any access, which would be indicated at line 80 of table T1. [0028]
  • If this same security profile typing is applied to web pages, as it is by many websites today, a requestor without the proper permissions receives messages such as those shown in FIG. 3 (Prior Art) at 100 and 105. In some instances, messages such as these may alert a hacker to the kinds of information that require more rights, and provoke him or her into spending more time attempting to gain illicit access. [0029]
  • One approach to data integrity is provided by Virtual Private Networks (VPNs), which were conceived as a method of providing more secure remote access to users. VPN permission levels closely resembled the functionality and permission levels that local users of the computer or network would have had. VPNs create secure links between two (or more) computers, which identify each other and then create encrypted pathways between them using sophisticated encoding techniques. Once the links have been created, they may be regarded as nearly hacker-proof for all practical purposes. However, while VPNs can create secure links between computers and/or terminals on a network, the link may be based on client identification methods that are vulnerable to attack, such as the logon name and password approach, mentioned above. Thus VPNs can be subverted by false identifications into creating confidential sessions with a hacker. [0030]
  • Firewalls are another form of network security that have been developed to address data integrity. Firewalls are usually essential requirements for any computer or computer network which can be accessed remotely. A firewall is typically a computer system placed between two networks and connected to both. One of the networks is usually an internal corporate network which is reasonably secure. The other network is usually a public network, such as the Internet, which may be fraught with peril—at least from a security viewpoint. The software in the firewall computer usually provides protection from certain kinds of intrusions into the internal network by: [0031]
  • denying service, [0032]
  • closing off access to internal ports or computers, [0033]
  • denying access to certain protocols, or [0034]
  • filtering messages (examining the content of a message to determine whether or not to accept it). [0035]
  • In FIG. 2 (Prior Art), computer 15 might be a firewall computer which is placed between terminals 00 and 02 on private network 10 and the Internet 25 and public communication links 30. The internal, private network is considered the “clean” network, and the external, public one the potentially “dirty” one. [0036]
  • While firewalls fend off many attacks, some forms of attack can be difficult to detect, such as file deposition attacks, in which an internal computer or system is gradually filled with unwanted data which will eventually affect performance or even stop the computer or network from working. Since most current firewall technologies will detect and prevent large files being uploaded onto an internal computer or network, a knowledgeable hacker will upload a number of very small files within the size acceptable to the firewall, eventually causing the computer's disk storage space to become insufficient and the system to degrade or fail. [0037]
  • A trojan is an extreme example of a file deposition attack—the file being deposited is a program that appears useful but will in fact damage or compromise data integrity and system security when it is used. [0038]
  • No matter how effective a firewall or VPN connection is, it is likely that a determined intruder can find a way to access a website for nefarious purposes. In a sense, the protocol of the Internet itself abets this, particularly its HTTP (Hypertext Transfer Protocol) and related protocols. This is the method used by websites on the worldwide web to publish pages to a web browser at a user's personal computer terminal. Uniform Resource Locators (URLs) are used to implement this. As seen in FIG. 4A (Prior Art), block 110, the URL describes where to find and how to use a resource on the Internet. In the example of FIG. 4A (Prior Art), the “http://” indicates that the resource must be accessed using the http protocol. “www.w3.org” is the Internet name of the computer on which the page is to be found, along with its web root directory. “Addressing” is a directory found in the web root directory, and “URL” is a directory found in the directory called “Addressing”. The page being published is found in the directory “URL” and the page name is “Overview.html”. This general structure applies to all URLs and enables anyone with a web browser to reach information on the Internet. Thus, any website must at least have a web root directory if it is to be accessed over the Internet. This means that hackers can find any website on the Internet and access web root directories. Once a web root directory is found, hackers can usually use port scanners or other techniques to locate the vulnerable areas of a website and deploy attacks against them or copy them for illicit purposes. [0039]
  • As mentioned above, file deposition attacks can be used to slow down, to subvert an application, or to completely disable a website or system. A hacker, for example, can take an initial, legitimate web page and replace it with a page of the same name that asks for improper actions or allows access to confidential data. This affects the third function of security, namely availability. While a number of technologies such as redundant computer and disk systems have been developed to maintain high availability of systems and networks, sabotage by hackers or others can bring whole systems down. [0040]
  • Most current security systems and methods also embody some assumptions about would-be intruders. For example, many systems will deny access to an intruder once he or she has been detected. While the system designers know that this often does not deter an intruder, but may actually provoke one, an assumption of this approach is that the intruder who has been detected knows he or she will have to work harder and might give up and search for other prey. In present-day cryptography, for example, it is assumed that most ciphers or encryption techniques can be decoded or decrypted, given a sufficient amount of time, money, and computer “horsepower.” In other words, it is extremely difficult to make a security system unbreakable, but it can be made more difficult and costly to break. Implicit in these approaches is a defensive posture that tries to build computer systems and networks that are impregnable fortresses. They often fail to take into account the fact that telling an intruder he or she has been caught and denying access in many cases provides valuable information to the intruder about which of his or her tools and attack plans are ineffective. The intruder who breaks in for amusement may actually regard these measures as a challenge. The criminal can use them for information. [0041]
  • It is an object of the present invention to provide a security system that positively identifies an authorized client. [0042]
  • It is another object of the present invention to provide a system for detecting interlopers. [0043]
  • SUMMARY
  • These and other objects are achieved by a system for providing electronic security over a network through an extensible positive client identifier (EPCI), working with a positive information profiling system (PIPS), pseudo uniform resource locators (PURLs) to assist in providing data integrity, a virtual page publication system (VPPS), and an active security responder (ASR). The extensible positive client identifier (EPCI) system creates a unique client identification key and continually self-evaluates the key based on unique system signature data. The positive information profiling system implements account profiles for all content and clients so that pages of information can be generated and matched to the data requested as well as the requestor. The virtual page publication system VPPS of the invention does not store pages permanently in the root directory of the site but instead creates temporary web pages dynamically containing the level of information resulting from the client identification, PIPS, and PURL evaluations. The virtual page is sent (in encrypted form if this option has been selected or is required by the PIPS profile) to the requestor and exists only for the time necessary to send it. The active security responder (ASR) controls the overall operation of the present invention. [0044]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram of the present invention. [0045]
  • FIG. 2 (Prior Art) is a block diagram of prior art web page technology. [0046]
  • FIG. 3 (Prior Art) is a block diagram of typical directories and error messages of the prior art. [0047]
  • FIG. 4A (Prior Art) is a block diagram of a standard uniform resource locator (URL) of the prior art. [0048]
  • FIG. 4B is a block diagram of a Pseudo Uniform Resource Locator (PURL) of the present invention. [0049]
  • FIG. 5 is a block diagram of the extensible positive client identifier of the invention in operation at a client terminal. [0050]
  • FIG. 6 is a table showing the elements of a Client Identifier Key (CIK) of the present invention. [0051]
  • FIG. 7 is a block diagram showing sample numeric values for a client identifier key (CIK). [0052]
  • FIG. 8 is a flow diagram of the extensible positive client identifier of the invention. [0053]
  • FIG. 9 is a flow diagram of the PURLs processing of the invention. [0054]
  • FIGS. 10 and 11 are flow diagrams of the Virtual Page Publication System (VPPS) processing of the invention. [0055]
  • FIG. 12 is a flow diagram of the setup for the invention at a network location. [0056]
  • FIGS. 13 and 14 are flow diagrams of the Positive Information Profiling System (PIPS) of the present invention. [0057]
  • FIG. 15 is a block diagram of an illustrative screen display used by the PIPS processing of the invention. [0058]
  • FIG. 16 is a block diagram showing the invention configured for use by multiple different entities. [0059]
  • FIG. 17 is a block diagram of typical contents of a storage mechanism at a client terminal site. [0060]
  • FIG. 18 is a block diagram of illustrative Client Identification Keys (CIK) generated by the present invention. [0061]
  • FIG. 19 shows tables illustrating different types of security levels used by the present invention. [0062]
  • DETAILED DESCRIPTION OF THE INVENTION
  • In FIG. 1, an overview of the present invention is shown. User computer terminals 00 and 05 are shown connected by public communications lines 30 to the Internet 25. Also shown is a website 35, which is accessible over the Internet 25. In the embodiment shown, website 35 is a host computer controlled by operating system 38, webserver 37 and the present invention's Active Security Responder (ASR) 36. Disk storage 40 is shown connected to website 35 and containing only a web root 42 and, optionally, dummy website pages. ASR 36 is in communication over private network lines 10 with another computer 39, which is running the present invention's pseudo URLs—PURLs 39 a, its positive information profiling system PIPS 39 b and its virtual page publication system VPPS 39 c. [0063]
  • In the embodiment shown in FIG. 1, ASR 36 controls all the functions of web server 37 (a sketch of this request-handling sequence appears after the list below), including: [0064]
  • intercepting all requests for web pages and web page components; [0065]
  • examining the request for evidence of interception or impersonation; [0066]
  • validating the client and evaluating the client's profile; [0067]
  • evaluating a page profile for the requested PURL; [0068]
  • carrying out any actions associated with the client profile or page profile; [0069]
  • preparing and filing any logging or auditing information as required; [0070]
  • publishing the information selected by the above process as the requested web page using VPPS 39 c; and [0071]
  • using VPPS to examine the web server file system on storage disk 40 for recently created files to be either deleted or stored in an “isolation ward” for further examination. [0072]
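  • The Python sketch below illustrates, under stated assumptions, the intercept/validate/publish cycle listed above. It is not taken from the patent: the client registry, page-profile table, and function names (check_identity, evaluate_purl, handle_request) are invented stand-ins for the described ASR/PIPS/PURLs/VPPS components.
```python
# Hypothetical sketch of the intercept/validate/publish cycle listed above; the
# client registry, page profiles, and function names are illustrative stand-ins,
# not the patented ASR/PIPS/PURLs/VPPS implementation.

KNOWN_CLIENTS = {"cik-valid": {"name": "terminal 00", "level": 2}}   # toy registry

def check_identity(cik, security_log):
    """Examine the request for evidence of interception or impersonation."""
    client = KNOWN_CLIENTS.get(cik)
    if cik is not None and client is None:
        security_log.append(f"suspect key presented: {cik!r}")
    return client or {"name": "public", "level": 0}    # fall back to a public profile

def evaluate_purl(purl, client):
    """Match the page profile for the requested PURL against the client profile."""
    page_levels = {"/reports/salary.html": 2, "/index.html": 0}   # toy page profiles
    required = page_levels.get(purl, 0)
    return "requested content" if client["level"] >= required else "innocuous public page"

def handle_request(purl, cik, audit_log, security_log):
    client = check_identity(cik, security_log)
    body = evaluate_purl(purl, client)
    audit_log.append((client["name"], purl))            # logging / auditing step
    return body                     # VPPS would publish this as a short-lived page

audit, security = [], []
print(handle_request("/reports/salary.html", "cik-valid", audit, security))
print(handle_request("/reports/salary.html", "cik-forged", audit, security))
```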
  • In the embodiments shown, the operating system is Microsoft's WINDOWS NT™ system and the web server is Microsoft's INTERNET INFORMATION SERVER IIS™ server using PC-compatible processors or workstations. Those skilled in the art will appreciate that other computers, storage systems, operating systems and web servers or networking techniques could be used without deviating from the spirit of the invention. [0073]
  • Turning briefly to FIG. 5, it can be seen that a requesting client terminal 00 typically includes a personal computer or workstation controlled by an operating system 01, a web browser 02 and the present invention's extensible positive client identifier software EPCI 03 communicating over public communications lines 30 with Internet 25. Those skilled in the art will appreciate that a terminal 00 or even a host computer 35 can be any device capable of communication over a network to send and receive data—from handheld wireless devices to computer mainframes. [0074]
  • Similarly, those skilled in the art appreciate that the functions provided by operating systems and web browsers can be replaced by other software, such as custom software, without deviating from the spirit of the present invention. Likewise, while the embodiments shown assume the use of a public Internet network, those skilled in the art will appreciate that the present invention can also be used in private, internal networks such as internal Wide Area Networks or Local Area Networks (WANs and LANs), or intranets or extranets. [0075]
  • Returning to FIG. 1, the first request sent to website 35 from a terminal is evaluated by ASR 36 as a public request and the appropriate public web page is returned to the client with an instruction for the client to send its client identifier key (CIK) with the next request for information. In the embodiments shown, every page sent by ASR 36 will contain the instruction to send a CIK identifier. If the client has no EPCI 03 software installed at its terminal 00, then ASR 36's request for the client to send a CIK key is ignored by the other software at terminal 00, and further communication between that terminal 00 and ASR 36 is public, although as mentioned, all the pages sent by ASR 36 will contain a “send your CIK” instruction. [0076]
  • If the EPCI 03 software has been installed at the user terminal 00, ASR 36 of the present invention will generate a public response page for the first request from terminal 00, along with instructions to terminal 00 to send its client identification key—CIK—with the next request. If the client at terminal 00 does have EPCI 03 software, it will examine the request to send a CIK to see if the client at terminal 00 has a relationship with the requesting server ASR 36. [0077]
  • If the EPCI 03 software at terminal 00 determines there is no relationship, then the request to send its CIK identifier is ignored by EPCI 03 and all further communication between ASR 36 and terminal 00 is handled on a public basis, even though in the embodiment shown, ASR 36 continues to send a request for terminal 00 to send its CIK identifier. [0078]
  • If EPCI 03 at terminal 00 determines that there is a relationship (by verifying the server's APK against its own copy in its CIK file), then EPCI 03 validates itself, as described in more detail below, and provides ASR 36 with its CIK, thereby allowing ASR 36 to identify the client at terminal 00. Once both have established that a relationship exists between them, the next and subsequent web pages will be sent to terminal 00 according to the appropriate evaluation of that client's security levels and the levels of the data requested, as will also be described in more detail below. In the embodiment shown, this also means the previously sent public information will be refreshed with the information the security levels entitle that client to receive. [0079]
  • FIG. 6 shows the typical contents of a client identifier key (CIK) generated by the EPCI 03 software. Each field, such as the Authorized Personalization Key (APK) for this relationship or the Authorized Client Activation Key (ACAK) for this relationship, is given a numeric value by the EPCI 03 software. FIG. 7 illustrates a partial hypothetical CIK 120 in numeric form. [0080]
  • Returning to FIG. 1, if the user terminal making the request is a valid one, its EPCI 03 software will self-check its client identifier key (CIK 120), as described in more detail below, and return a newly generated CIK 120 to ASR 36. In the embodiment shown, CIK 120 contains several items that are unique to this particular hardware and software configuration of user terminal 00. The effect of this is that the valid client terminal 00 will check itself and identify itself to ASR 36 as a valid client. If a hacker or interloper has stolen the EPCI 03 software and installed it on terminal 13, that same software will generate a new CIK for terminal 13 which uses fields that are unique to the hardware and software configuration of terminal 13, compare it to the previous CIK created for terminal 00's unique hardware and software configuration, and silently identify itself to ASR 36 as a hacker or impersonator by setting such an indicator in the new CIK it generates. At that point, ASR 36 will treat requests from terminal 13 according to the host server's security policy for impersonators. In the embodiment shown, the security policy selected uses a dummy website containing innocuous public information to satisfy any further requests from the hacker at terminal 13. In this example, it might appear to the impersonator at terminal 13 of FIG. 1 that he or she is in communication with a website 50, which is serving webpages stored on dummy disk 55 or on disk 40. [0081]
  • This makes it appear to the hacker that he or she has been successful, when that is not in fact the case. [0082]
  • The embodiments shown enable an entity to implement security policies which do not reveal the detection of impersonation to the impersonator. In the embodiments shown, the present invention augments security policies that make it appear to an interloper at almost every step that he or she has been successful, when, in fact, the opposite is true. Those skilled in the art will appreciate that some of the more obvious policies, such as informing the intruder that he or she has been detected and denying access could be implemented as well at various stages without deviating from the spirit of the invention. For example, a user might wish to deny access in a manner visible to the interloper without indicating to the interloper that he or she has been detected. [0083]
  • Now turning to FIG. 4A (Prior Art), a standard URL 110 is shown. This particular URL points to the location “overview.html”, which is the address of a web page stored on disk 40 of an ordinary web server. FIG. 4B, in contrast, shows a pseudo-URL, PURL 39 a of the present invention, which appears identical to the standard URL of FIG. 4A (Prior Art) but does not point to any web page at all. Instead, it identifies a list of tasks, stored in a private location, to be performed in response to this request and the user's and the data's profiles. [0084]
  • The present invention includes a positive information profiling system PIPS, which enables the entity using the invention to create an account profile for all content and all clients so that data can be matched to requests for information. PIPS is described in more detail below. For the purposes of FIG. 4B, however, the PURLs evaluation PURLS 39 a interacts with PIPS 39 b to determine what information can be sent to a particular requesting client. If the request from the user's PC terminal is a valid request for employee salary data from a current employee, then the client profile for that employee might indicate that he or she has read-only access to his or her own salary data. If the employee requests the CEO's salary data, PURL 39 a may apply the corporation's profile for that employee to deny access to the CEO's salary, and instead supply the requesting employee's salary data. It would appear to the employee that the CEO and the employee have the same salary, when this is not so. [0085]
  • In the same way, PURLs can be used to select innocuous public data to be returned to a detected interloper, so that the interloper may be led to believe he or she has successfully breached the site. For example, if the interloper requests the CEO's salary data and the entity owning the website is a publicly held company, the latest publicly known data about the CEO's salary might be displayed to the interloper, while the CEO requesting his or her own salary might be given the most current values. [0086]
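  • A minimal sketch of the salary example above, assuming a simple in-memory profile; the records, role names, and the salary_response helper are invented to show how a PURL/PIPS evaluation might quietly substitute permitted or innocuous data for the requested data.
```python
# Toy illustration of the salary example; the records, roles, and the
# salary_response helper are invented, not part of the patent.

SALARIES = {"ceo": 500000, "fsmith": 42000}        # hypothetical confidential data
PUBLIC_SALARIES = {"ceo": 350000}                  # last publicly reported figure

def salary_response(requested_id, client):
    if client["status"] == "interloper":
        # Serve innocuous public data so the interloper believes the breach worked.
        return PUBLIC_SALARIES.get(requested_id, "not available")
    if requested_id == client["employee_id"] or client.get("role") == "personnel_head":
        return SALARIES[requested_id]
    # Out-of-scope request: quietly return the requester's own record, not an error.
    return SALARIES[client["employee_id"]]

print(salary_response("ceo", {"status": "valid", "employee_id": "fsmith"}))  # 42000
print(salary_response("ceo", {"status": "interloper", "employee_id": ""}))   # 350000
```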
  • Turning back to FIG. 1, virtual page publication system VPPS 39 c of the invention provides the pages or content to be served in response to the request as determined by the PURLs 39 a and PIPS 39 b evaluations of the request and the data. VPPS 39 c of the invention generates the proper responses and stores them as temporary pages or data on disk 40, which is accessible to web server 37. The virtual page is sent to the requesting client as the requested URL (and in encrypted form, if appropriate) and deleted from the system 35 and its associated storage disk 40. The source information used to generate virtual pages is stored at a location inaccessible to the web. In the embodiment shown, this is computer 39 of FIG. 1, which is connected by a private network connection 10 to host computer 35. In another embodiment, if the server machine has sufficient Random Access Memory (RAM)—internal memory accessible directly to the central processing unit (CPU)—or a “RAM disk” facility, the information need not be stored on disk at all but can simply be sent from RAM. Those skilled in the art will appreciate that various types of media can be used for storing information other than those mentioned here. Magnetic tapes, for example, or RAID disk systems, writeable CD-ROM disks, flash memory, and so on could be used. [0087]
  • Still in FIG. 1, once the temporary page files have been deleted from disk 40, which is accessible to the Internet, VPPS 39 c keeps two security vigils. First, it checks to see if there are any files, other than the web root and the dummy security pages (if present), that are more than some specified amount of time old. If it finds such a file, VPPS 39 c can be directed by the security policy for that host to either delete such a file completely or move it to an “isolation ward” area specified by the user so it can be checked. This significantly lowers the risk of successful file deposition attacks on the web site. Files other than the ones that are supposed to be there (web root and dummy pages) are either deleted or, in effect, moved into “quarantine” and deleted from disk 40. In the embodiment shown, the invention also checks all files on disk 40 to see if the valid ones have been changed in that predefined interval as well. If they have been changed, they may have been corrupted, so they, too, are either deleted or moved into isolation areas and, if so specified, the last valid content is substituted for them. If the modified files so detected are those belonging to the web root or dummy pages, ASR 36 of the invention can also be guided by the security policies for the website in the handling of them. If no time has been specified by the user for the predefined interval, a default time, such as 60 seconds, is used. [0088]
  • The second type of security vigil carried out by VPPS 39 c is for any new folder or directory created on the relevant storage disk(s). In the embodiments shown, these are deleted or moved to quarantine immediately, as soon as they are detected, without waiting for any interval. [0089]
  • Still in FIG. 1, while computer system 35 here is shown as a single website host computer, the present invention can be used to manage security for several different networks and host computers at one or more locations. This is shown more clearly in FIG. 16. There it can be seen that one computer site 35 f might be a multiple website host which provides web services to companies x, y and z, over private networks 10 x, 10 y and 10 z. In this example, disk 40 contains web roots x, y, and z for the respective companies. Terminal 00 might be a terminal for an employee of company x, and terminal 05 might be one for an employee of company y. [0090]
  • In the embodiment shown, ASR 36 of the present invention establishes security for each company's website by building and managing the security relationships between the company information stored off the Internet and valid clients such as employees at terminal 00 making requests over the Internet. In this embodiment, each company uses its copy of ASR 36 (ASRx, ASRy or ASRz) to define its own security policies, access levels and procedures. While the examples discussed so far are directed to the Internet, those skilled in the art will appreciate that the present invention can also be used in private networks, such as internal corporate networks, or other forms of network systems. ASR 36 can also be installed on several machines at different sites for handling security for just one entity, as well. [0091]
  • In addition, and still in FIG. 16, the installation and use of ASR 36 creates no visible difference to the outside world. The websites or locations using its services will generally appear the same to external clients or requestors as they would if the invention were not installed. At the host web site(s), each entity using the invention does need to allocate security levels to information sources. [0092]
  • In the embodiments shown, this allocation of security levels is done through the profiling system PIPS 39 b. As seen in FIG. 16, each entity x, y, or z is able to create its own secure profile information on its own systems which are not directly exposed to the Internet. [0093]
  • At the outset of use, each entity using the invention installs ASR 36 software at the website host it is using (if the ASR 36 software is not already present) and the invention's EPCI 03 software at each client terminal 00 which is to be allowed access beyond the publicly available data. [0094]
  • With reference now to FIG. 12, a flow diagram of the setup of ASR 36 is illustrated. At block 400, the ASR 36 software is set up for this entity. In the embodiments shown, each entity's copy of ASR 36 is given an authorized server activation key (ASAK). An ASAK is a unique string of 48 or more characters supplied with the product license for that entity. It is used to activate the configuration of the invention for that entity. A request to enter the ASAK is made when the ASR 36 product is first used. A further request is made of the purchasing entity to enter 30 or more characters of the entity's own devising. This is called the client confidence key, or CCK. In the embodiments shown, this could be any kind of key which the corporate entity believes will assist in uniquely identifying it. In the embodiments shown, this client confidence key CCK is encrypted as it is entered and kept in encrypted form by ASR 36. This means that ASR 36 does not “know” the unencrypted form of the CCK, and thus minimizes the risk of “backdoor” access to protected information even by the licensor of the product. [0095]
  • Next, at step 405 of FIG. 12, ASR 36 will generate its own authorized personalization key (APK) for this corporate entity by combining the ASAK and the CCK (a sketch of one such derivation follows the list below). The APK is unique to each installation of ASR 36 of the invention and is used to: [0096]
  • differentiate one ASR 36 server from another ASR 36; [0097]
  • provide a basis for generating unique authorized client activation keys (ACAKs) for clients of that ASR 36; and [0098]
  • provide a unique basis for encryption for communication with clients of that ASR 36, if desired. [0099]
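  • The patent does not specify how the ASAK and the CCK are combined; the sketch below assumes a keyed hash as one plausible way to derive a per-installation APK, and the key values and the derive_apk name are placeholders.
```python
# The text says only that the APK is formed by combining the ASAK and the CCK;
# deriving it with a keyed hash is an assumption made for this sketch, and the
# key values are placeholders.

import hashlib
import hmac

def derive_apk(asak, cck):
    """Combine the server activation key and the client confidence key."""
    return hmac.new(cck.encode(), asak.encode(), hashlib.sha256).hexdigest()

asak = "A" * 48                                            # 48+ character license key
cck = "an entity-chosen secret of thirty or more chars!"   # kept only in encrypted form
print(derive_apk(asak, cck))                               # unique per ASR installation
```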
  • In the embodiments shown, the user entity (here corporation x, y or z) is required to store a copy of its CCK and ASAK for use if there is ever a need to re-install the system. If the APK that is generated for an ASR 36 is ever changed, then none of the clients will have privileged access until they have all been issued with new ACAKs based on the changed APK. [0100]
  • Also in the embodiments shown, an authorized client activation key (ACAK) must be generated for each client and will contain the APK and a unique identifier for that client. A new client is issued client identification key (CIK) generation software and is also given its unique ACAK. When the client first runs the CIK generation software at its client terminal 00, it is prompted to enter its ACAK. The CIK generation software at that client terminal 00 then creates a unique CIK 120 for that client terminal and exits. No further client activation is required. [0101]
  • In the embodiments shown, this procedure may be repeated by the user for relationships with any number of different servers such as ASRy or ASRz of FIG. 16. The same CIK generation software must be used, but the ACAK for each different server ASR 36 must be different. [0102]
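  • As a rough illustration of ACAK issuance and first-run activation, the sketch below assumes the ACAK is simply the APK concatenated with a client identifier and that activation records it in a local CIK file; the format, file name, and helper names are assumptions, not details from the patent.
```python
# Rough sketch of ACAK issuance and first-run activation; the concatenation
# format, file layout, and helper names are assumptions, not patent details.

import hashlib
import json
import pathlib

def issue_acak(apk, client_id):
    # The ACAK is described as containing the APK plus a unique client identifier.
    return f"{apk}:{client_id}"

def activate_client(acak, cik_file="cik.json"):
    apk, client_id = acak.split(":", 1)
    record = {"apk": apk, "client_id": client_id,
              "acak_digest": hashlib.sha256(acak.encode()).hexdigest()}
    pathlib.Path(cik_file).write_text(json.dumps(record))
    # No further activation is required; CIKs are generated per request from here on.

activate_client(issue_acak("apk-demo", "client-0001"), "demo_cik.json")
```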
  • Referring now to FIG. 17, some of the elements that can be used by the present invention's EPCI 03 software installed at terminal 00 to create a CIK 120 are shown. [0103]
  • In this example, it is assumed a client who is using the EPCI 03 software at his or her client terminal 00 is also using a personal computer C00 as his or her terminal 00. In exaggerated form, a disk D00 is shown which is attached to personal computer C00. Disk D00, in turn, contains a directory Dir00, which contains information about this particular computer C00 and its installed hardware and software components. For example, at line 001 information such as the central processor unit (CPU) serial number of computer C00 is stored, along with identification of the latest Read-Only-Memory (ROM) revision made to that CPU. In addition, this example shows at line 002 that computer C00 has 32 megabytes of memory built-in and has configured its memory management to treat that as 128 megabytes of virtual memory. The volume serial number of disk D00 is given at line 003, along with an indicator of its type—HD for hard drive, as distinguished from removable media drives, such as floppy disk drives. Dir00 also indicates at line 004 that this computer is using Operating System version 9.6, which was installed on Nov. 9, 1999. Dir00 also shows, at line 005, that sound capability from ABC sound has been installed, with its version number and serial number. Next, at line 006, the particulars of the type of video display are given. Finally, starting at lines 007 and 008, a list of the software programs installed on that computer, possibly with their serial numbers and installation dates, is stored. Those skilled in the art will appreciate that other such identifying characteristics for a client terminal or client requestor can be used without deviating from the spirit of the invention. [0104]
  • From the example of FIG. 17, it can be seen that a considerable amount of information about this particular computer system C00 is stored on disk D00. As mentioned earlier, it is also probable that the person using this system has his or her own logon name and password, which is also stored somewhere in the system, depending on the type of operating system and local security used. Additionally, most present-day computer systems, whether handheld or mainframe, are capable of keeping track of the current date and time at that computer and making that information available to programs running in that computer. If the computer is connected to a network such as an internal TCP/IP network, it also contains IP addressing information about itself. [0105]
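  • One hypothetical way to assemble a system signature from this kind of locally stored detail is sketched below; the attributes gathered (platform strings, a MAC-derived node id) merely stand in for the CPU serial number, disk volume serial, installation dates, and similar items of FIG. 17.
```python
# A minimal way to hash an extensible combination of machine-specific details
# into a signature; these platform attributes are stand-ins for the CPU serial,
# disk volume serial, installation dates, and other items of FIG. 17.

import hashlib
import json
import platform
import uuid

def collect_system_details():
    return {
        "machine": platform.machine(),
        "processor": platform.processor(),
        "os": f"{platform.system()} {platform.release()}",
        "node": platform.node(),
        "mac": uuid.getnode(),          # proxy for a hardware-bound identifier
    }

def system_signature(details):
    # Hash the combined, locally stored, terminal-specific data.
    return hashlib.sha256(json.dumps(details, sort_keys=True).encode()).hexdigest()

print(system_signature(collect_system_details()))
```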
  • Now turning back to FIG. 6, it can be seen that the CIK generation software of the present invention makes use of some or all of this kind of information and more to create a client identification key, CIK 120, that is unique to this particular user's installation, as illustrated in FIG. 6's Table T2. As mentioned earlier, the first access from this client terminal 00 is usually a public access, in which terminal 00 does not send a CIK. ASR 36 decides whether a relationship with terminal 00 has been established by the setup processes described above. If a relationship has been established, ASR 36 will automatically refresh the web page previously sent as a public page to obtain the correct CIK 120 from the client at terminal 00. [0106]
  • This is shown in more detail in FIG. 8, which is a flow diagram of the processing of EPCI 03 at terminal 00. At step 200, EPCI 03 receives a “send your CIK” request from ASR 36 running on the host/server machine. In the embodiments shown, every request for a page component is answered with an appropriate version of that page component plus a request for the client to send its CIK 120. The request also contains the APK for that particular server ASR 36. At step 205, EPCI 03 checks to see if there is a relationship with that particular server ASR 36. It does so by taking the APK sent with the request and comparing it with its own copy contained in the client's CIK file created by the CIK generation software. A client will have the APK from each server ASR 36 that provided it with an ACAK. If there is no match, and therefore no relationship, EPCI 03 does nothing, at step 210. However, if there is a relationship, then EPCI 03 proceeds to step 215 to carry out any instructions from the host running the requesting ASR 36. It should be noted here that the instructions can range from a simple “reply with CIK” to commands to run several programs or tasks and then reply with the CIK. That is, the step of carrying out instructions from ASR 36 sent from the host server machine can be used to install new software at terminal 00, run other software, delete software or data, and so on. This step is not restricted solely to security checking. This feature provides entities using ASR 36 with significant options for communicating with or controlling the remote terminals. [0107]
  • Still in FIG. 8, once any instructions have been carried out, [0108] EPCI 03 running at terminal 00 prepares a new system signature at step 220. In the embodiments shown, a system signature is some extensible combination of the unique information stored locally at terminal 00. That is, some combination of the information described in FIG. 17 about the particular configuration of terminal 00 is used to create the system signature. For example, and referring back to FIG. 17, the system signature for this kind of computer C00 might include the serial number of the hard drive shown at line 003, the installation date of the operating system shown at line 004, the sound card serial number shown at line 005, and the version number of the DRAWPGM, shown at line 008. Different types of computers might have different system signatures. In the embodiments shown, EPCI 03 provides positive identification of the client machine being used. However, a system administrator might wish to further authenticate the person using that machine by adding a logon and password field to the system signature, for consistency with other internal procedures. In addition, other elements can be used by the present invention to form a unique system signature, such as smart card data or biometric identifiers, and so on.
  • Returning to FIG. 8, EPCI 03 checks at step 225 to see if the new system signature is the same as the old system signature stored in its CIK file. If it is, a new client identification key CIK 120 is created at step 230, indicating that the self-evaluation done by EPCI 03 on this machine passed the test, and at step 240 the newly created CIK 120 is passed to ASR 36. If the system signature is not the same as the old one from a valid client, a new CIK 120 is created which includes a pass or fail indicator (see CIKSTATUS at Table T2 of FIG. 6), and the new CIK 120 is passed to ASR 36 at the host. Note that this self-evaluation does not notify anyone at terminal 00 of the pass or fail status. In other words, the self-evaluation is done “silently,” as it were. [0109]
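  • A compact sketch of this self-evaluation, assuming the CIK is represented as a small dictionary whose fields loosely follow Table T2; the build_cik helper and the field names are illustrative, not the patented key format.
```python
# Sketch of the self-evaluation of FIG. 8: confirm the relationship via the APK,
# rebuild the system signature, compare it with the stored one, and emit a new
# CIK whose status field silently reports pass or fail. The dictionary fields
# loosely follow Table T2 and are assumptions, not the patented key format.

def build_cik(stored, current_signature, server_apk):
    # Steps 205/210: an unknown APK means no relationship, so remain silent.
    if server_apk != stored["apk"]:
        return None
    # Steps 220-235: regenerate the signature and compare it with the stored one.
    status = "PASS" if current_signature == stored["system_signature"] else "FAIL"
    return {"apk": stored["apk"], "acak": stored["acak"],
            "system_signature": current_signature, "cikstatus": status}

stored = {"apk": "apk-demo", "acak": "acak-demo", "system_signature": "sig-original"}
print(build_cik(stored, "sig-original", "apk-demo"))    # PASS on the valid terminal
print(build_cik(stored, "sig-other-box", "apk-demo"))   # FAIL when the files are copied
```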
  • Turning momentarily to FIG. 18, examples of passing and failing CIKs 120 are shown. At CIK 120-1, the volume serial number of the hard drive D00 attached to terminal 00 is shown in system signature S1 as 890765, which matches the one shown in FIG. 17. Assuming all the other portions of the system signature S1 match the original one, a PASS indicator E1 is inserted into new CIK 120-1. [0110]
  • Still in FIG. 18, if a hacker has managed to copy the EPCI 03 software and the key-creation data transmitted from terminal 00, then when the bootlegged copy of EPCI and its files is installed at the bootlegger's terminal, it will create a new CIK 120-2, which uses the serial number S2 of the hard drive attached to the bootlegger's terminal. This will not match the original system signature created for terminal 00 and passed from the host, so EPCI 03 will insert a fail indicator E2 in the new CIK 120-2 it generates. When the stolen software returns its new CIK 120-2, the intruder security policy for that server ASR 36 is activated. [0111]
  • Back in FIG. 8, once the self-checking performed by EPCI 03 has been completed, the new CIK 120 is passed to the host computer 35 at the next communication with ASR 36 on the host 35. Thus, self-evaluation is performed by EPCI 03 every time any data, object, or request from terminal 00 is sent to ASR 36 at host computer 35. [0112]
  • Turning now to FIG. 9, when ASR 36 receives a request for a data locator—in the embodiment shown, in uniform resource locator format—and CIK 120 from terminal 00, it passes that information to pseudo uniform resource locator processing PURLS 39 a. The flow diagram of FIG. 9 illustrates the processing performed by PURLS 39 a. At step 250, PURLS 39 a for this entity receives the client request from terminal 00, in this example. At step 255, PURLS 39 a extracts the client's CIK and passes that to the present invention's PIPS 39 b program. [0113]
  • Referring briefly to FIG. 13, PIPS 39 b at this juncture performs a number of identity checks at decision blocks 450, 455, 460, 480, and 485, checking the CIKSTATUS, system signature, ACAK, Session ID, and APK information. In the embodiments shown, reliance solely on the client CIK may not be sufficient for detecting a skilled hacker who learns how to construct a CIK. Checking other items, such as the session ID, provides additional safeguards. If the information fails any of these checks, PIPS 39 b proceeds to step 465. At 465, since identification has failed on one or more of the checks, the security logs of the present invention are updated. At step 470, the site identity is set to public for this response by PIPS 39 b. Finally, now that an identity problem has been logged, PIPS 39 b carries out the security policy actions which the user has specified for the particular type of error detected. [0114]
  • Still in FIG. 13, if all the checks have shown successful identification, PIPS 39 b at step 490 updates its audit logs, and then evaluates client group, security group and security level data at step 495. [0115]
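  • The identity checks of FIG. 13 might be rendered roughly as below; the field names, the shape of the server and session records, and the string results are assumptions chosen to mirror the described steps (log a security event and fall back to a public identity on failure, log an audit entry and continue on success).
```python
# Hypothetical rendering of the identity checks at blocks 450-485 of FIG. 13;
# field names, record shapes, and return values are assumptions.

def check_identity(cik, session, server, logs):
    checks = [
        cik.get("cikstatus") == "PASS",                                # block 450
        cik.get("system_signature") == session["expected_signature"],  # block 455
        cik.get("acak") in server["issued_acaks"],                     # block 460
        cik.get("session_id") == session["session_id"],                # block 480
        cik.get("apk") == server["apk"],                               # block 485
    ]
    if not all(checks):
        logs.append(("security", cik))   # step 465: update the security logs
        return "public"                  # step 470: treat this response as public
    logs.append(("audit", cik))          # step 490: update the audit logs
    return "identified"                  # step 495: evaluate groups and levels next

server = {"apk": "apk-demo", "issued_acaks": {"acak-1"}}
session = {"expected_signature": "sig-1", "session_id": "s-42"}
good = {"cikstatus": "PASS", "system_signature": "sig-1", "acak": "acak-1",
        "session_id": "s-42", "apk": "apk-demo"}
logs = []
print(check_identity(good, session, server, logs))                          # identified
print(check_identity(dict(good, cikstatus="FAIL"), session, server, logs))  # public
```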
  • Turning briefly to FIG. 19, it can be seen that the present invention allows each page and each client to have a predefined security level. The security level of a page must be matched by that of its intended recipient in order for the page to be published. In FIG. 19, some examples of access levels are shown. Inclusive access table IN00 illustrates a structure in which a higher level of security automatically includes all lower levels. Thus, if a client has access level 2 in table IN00, it will automatically have access to levels 0 and 1 as well. [0116]
  • Another option—exclusive access—is shown in table EX00. There a client may have access to only one or two levels, but not to any others. For example, a client with access level B only has access to page level B. A client with access level AD has access only to page levels A and D. These two types could also be used in various combinations to provide additional security options. [0117]
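  • The two access models can be expressed as simple predicates, as in this sketch; the function names are illustrative, and the levels mirror tables IN00 and EX00.
```python
# The two access models of FIG. 19 expressed as simple predicates; the function
# names are illustrative, and the levels mirror tables IN00 and EX00.

def inclusive_allows(client_level, page_level):
    # A higher client level automatically includes all lower page levels.
    return client_level >= page_level

def exclusive_allows(client_levels, page_level):
    # The client may see only the specific levels it was granted, e.g. {"A", "D"}.
    return page_level in client_levels

print(inclusive_allows(2, 1))             # True: level 2 includes levels 0 and 1
print(exclusive_allows({"A", "D"}, "B"))  # False: level B was not granted
```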
  • Also turning now to FIG. 15, a screen display of an account profile for a client is shown. In a similar fashion, a screen display for each page or section of the website can be used to create an account profile of the data secured by the present invention. [0118]
  • Returning to FIG. 13, after the security levels of the client requestor and the security levels of the requested data have been evaluated at step 495, the results of all this checking and evaluation are passed back to PURLS 39 a at step 500. At this point, the identity of the requestor has been verified (or not) and an appropriate level of response has been indicated, based on the security policy for that entity for that data. [0119]
  • Returning to FIG. 9, at step 265, the requested PURL sent by the client terminal 00 is extracted from the request and, at step 270, is sent back to PIPS 39 b for processing. This portion of PIPS 39 b processing is diagrammed in the flow diagram of FIG. 14. There, at step 505, PIPS 39 b selects the task(s) (contained in the request) that meet the client group, security group and security level data. As mentioned earlier, a PURL in the present invention is not the address of data or pages, as ordinary URLs are, but identifies a list of tasks to be performed. At step 510, PIPS 39 b carries out those task(s) and constructs a list of information components which will eventually be displayed in a web page or similar result and then, at step 515, PIPS 39 b passes this information back to PURLS 39 a. [0120]
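  • A toy rendering of steps 505 through 515, assuming a PURL maps to a list of tasks each gated by a minimum security level; the task table, lambdas, and the components_for helper are invented for the example and are not the patented task format.
```python
# Toy rendering of steps 505-515: a PURL names a task list rather than a file,
# and only the tasks permitted for this client's level are run to build the
# component list. The task table and helper are invented for the example.

PURL_TASKS = {
    "/Addressing/URL/Overview.html": [
        {"min_level": 0, "run": lambda client: "public overview"},
        {"min_level": 2, "run": lambda client: f"confidential figures for {client['id']}"},
    ],
}

def components_for(purl, client):
    tasks = PURL_TASKS.get(purl, [])
    return [t["run"](client) for t in tasks if client["level"] >= t["min_level"]]

print(components_for("/Addressing/URL/Overview.html", {"id": "fsmith", "level": 2}))
print(components_for("/Addressing/URL/Overview.html", {"id": "guest", "level": 0}))
```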
  • Returning again to FIG. 9, PURLS 39 a receives this information and finds this list of valid components at step 275. At step 280, this list is passed to VPPS 39 c of the present invention. [0121]
  • Now referring to FIG. 10, virtual page publication system VPPS 39 c processing is shown. At step 550, the list of valid components is received from PURLS 39 a. Using that data, at step 555, VPPS 39 c prepares a temporary file containing one or more web pages and sends a copy of that temporary file to the web root (web root 42 of disk 40 in FIG. 1). Next, at step 560, VPPS 39 c passes the name of that temporary file to the webserver (webserver 37 in FIG. 1), which causes the page(s) to be delivered to the client as the requested data. Next, at step 565, VPPS 39 c deletes the temporary file from the web root as soon as the data has been sent on its way. Thus, the web page only exists for a few milliseconds on disk 40 of FIG. 1, which is exposed to the Internet. Those skilled in the art will appreciate that the pseudo locators and virtual publication methods of the present invention could be applied to other network and security systems, in which protected information is only to be given to valid requestors of it. [0122]
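  • The publish-and-delete cycle of steps 550 through 565 might look roughly like the sketch below, with the web server hand-off stubbed out; the directory names and the publish_virtual_page helper are assumptions.
```python
# Rough sketch of steps 550-565: write the assembled components to a temporary
# file under the web root, hand its name to the web server, then delete it as
# soon as it has been sent. The send step is stubbed; names are assumptions.

import pathlib
import tempfile

def publish_virtual_page(components, web_root, send):
    web_root.mkdir(exist_ok=True)
    with tempfile.NamedTemporaryFile("w", dir=web_root, suffix=".html",
                                     delete=False) as f:
        f.write("\n".join(components))
        temp_page = pathlib.Path(f.name)
    try:
        send(temp_page)        # web server 37 would deliver it as the requested URL
    finally:
        temp_page.unlink()     # the page exists on disk only while it is being sent

publish_virtual_page(["<html><body>virtual page</body></html>"],
                     pathlib.Path("webroot_demo"), print)
```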
  • Turning next to FIG. 11, another feature of VPPS 39 c processing is shown. Here, VPPS 39 c waits, at step 580, some predefined interval specified by the user. In the embodiments shown, the interval is usually some small multiple of the “ping time” for an average user. Those skilled in the art are aware that the TCP/IP protocol allows a terminal to send data to a server and measure the time it takes to get to the server, usually a few milliseconds. If the ping time is 10 milliseconds, the interval specified by the user to VPPS 39 c might be 20 milliseconds. Usually it is an interval that is just long enough to let a valid message go through and the temporary file stored on the web root to be deleted. Once that interval has expired, VPPS 39 c checks at step 590 of FIG. 11 to see if any new file has been created on the disk containing web root 42. If VPPS 39 c determines that a new file has been created and stored there, it is automatically assumed to be suspect. Depending on the security policy for the website, VPPS 39 c will either delete the file completely at step 595 or move it to a “safe” area, isolated from both the Internet network and the user's internal network. In this way, file deposition attacks can usually be detected and dealt with immediately. VPPS 39 c also checks to see if any of the legitimate files on the web root have been modified. If they have, they, too, can be deleted or moved, and at the discretion of the user, the original state of the file can be restored or not. Those skilled in the art will appreciate that the interval used can be varied as circumstances or risk levels (or both) change. As mentioned above, in the embodiment shown, any new folders or directories are deleted or moved as soon as they are detected. Those skilled in the art will appreciate that different actions could be taken at this point, without deviating from the scope of the invention. [0123]
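  • A hypothetical version of this sweep is sketched below: after a short interval, anything in the web root other than the expected files is quarantined or deleted, and new directories are removed outright. The interval, path names, and quarantine policy are assumptions for illustration.
```python
# Hypothetical sweep corresponding to FIG. 11: after a short interval, any file
# in the web root other than the expected ones is quarantined (or could simply
# be deleted), and new directories are removed outright. The interval, paths,
# and policy choices here are assumptions.

import pathlib
import shutil
import time

def sweep_web_root(web_root, expected, quarantine, interval=0.02):
    time.sleep(interval)                   # e.g. a small multiple of the ping time
    quarantine.mkdir(exist_ok=True)
    for entry in web_root.iterdir():
        if entry.name in expected:
            continue                       # legitimate web root / dummy pages
        if entry.is_dir():
            shutil.rmtree(entry)           # new folders are removed immediately
        else:
            shutil.move(str(entry), str(quarantine / entry.name))   # "isolation ward"

# Example (assumes the directory from the previous sketch exists):
# sweep_web_root(pathlib.Path("webroot_demo"), {"index.html"}, pathlib.Path("isolation"))
```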
  • Thus it can be seen that in the embodiments shown, the present invention ensures that all users of the network or system it monitors are managed and monitored throughout the duration of their sessions with the server at the host computer and that the information provided to them is appropriate to their pre-defined status. EPCI 03 authenticates the CIK at every access and web page component. The system as a whole is scalable for any number of client entities or number of relationships. It also verifies its own integrity, and reports success or failure through the audit and security logs. It can be used in combination with other security measures such as VPNs, Secure Socket Layer (SSL) technology, which encrypts data sent between client and server computers, and X.509 Digital Certificates or Digital IDs. The client identification key is self-checking and aware of its environs, so it ensures that if the client identification key is copied and used on another terminal, it will fail and report the type of failure. [0124]
  • The client identification key responds only to a server with which the client has a known and agreed upon relationship. With the embodiments shown, clients do not need to take special actions such as using logon or passwords. [0125]
  • In the embodiments shown, ASR 36 requires that a client must first be enrolled on the secure system by creating a client account as described above. [0126]
  • In the embodiments shown, the present invention is implemented in the C++, VISUAL BASIC™ (from Microsoft, Inc.), and POWERBASIC™ (from POWERBASIC™ Inc. in Carmel, Calif.) languages, but those skilled in the art will appreciate that it could also be implemented in other languages such as Perl, C, Java and so on. Similarly, while the embodiments shown are implemented in software, part or all of the invention could also be embodied in firmware or circuitry, if desired. Also, as mentioned earlier, while the embodiments shown are directed to use with networks and systems using the TCP/IP protocol, other network or system protocols could be used. Those skilled in the art will also appreciate that the embodiments described above are illustrative only, and that other systems in the spirit of the teachings herein fall within the scope of the invention. [0127]

Claims (1)

What is claimed is:
1. A method for positively identifying a valid client terminal communicating with a host machine, comprising the steps of:
creating a system signature for the client terminal, the system signature including configuration information likely to be unique to that client terminal;
generating a first client identification key containing the system signature and storing the first client identification key at the client terminal;
re-evaluating the system signature each time a communication is purportedly sent from the client terminal by comparing a new system signature unique to the then sending terminal with the system signature stored with the first client identification key; and
generating a second client identification key containing an indicator that silently informs the host system whether the sending terminal is the same as the client terminal.
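For clarity only, the following is a minimal sketch, under assumed data formats, of the two-key flow recited in claim 1: a first key stores a signature derived from the terminal's configuration, a fresh signature is computed from the terminal actually sending each communication, and a second key silently indicates to the host whether the two match. It illustrates the claimed steps and is not the patented implementation.

```cpp
#include <functional>
#include <iostream>
#include <string>

// First key: stored at the client terminal at enrolment.
struct FirstKey  { std::size_t stored_signature; };

// Second key: returned with each communication; the boolean is the
// "silent" indicator the host inspects.
struct SecondKey { bool terminal_matches; };

// System signature: a value derived from configuration information likely
// to be unique to the terminal (the string format here is an assumption).
std::size_t system_signature(const std::string& terminal_config) {
    return std::hash<std::string>{}(terminal_config);
}

FirstKey enroll(const std::string& terminal_config) {
    return FirstKey{system_signature(terminal_config)};
}

SecondKey on_each_communication(const FirstKey& key,
                                const std::string& sending_terminal_config) {
    // Re-evaluate: compute a fresh signature on the terminal that is actually
    // sending, and compare it with the signature stored in the first key.
    const std::size_t fresh = system_signature(sending_terminal_config);
    return SecondKey{fresh == key.stored_signature};
}

int main() {
    const FirstKey key = enroll("HOSTNAME=ws-042;MAC=00:11:22:33:44:55");

    // Genuine terminal: the indicator tells the host the sender matches.
    std::cout << on_each_communication(key, "HOSTNAME=ws-042;MAC=00:11:22:33:44:55")
                     .terminal_matches << '\n';   // prints 1

    // Key copied to another machine: the indicator silently flags the mismatch.
    std::cout << on_each_communication(key, "HOSTNAME=intruder;MAC=66:77:88:99:aa:bb")
                     .terminal_matches << '\n';   // prints 0
}
```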
US10/228,786 1999-10-05 2002-08-26 System and method for extensible positive client identification Abandoned US20030005287A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10/228,786 US20030005287A1 (en) 1999-10-05 2002-08-26 System and method for extensible positive client identification

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/412,242 US6442696B1 (en) 1999-10-05 1999-10-05 System and method for extensible positive client identification
US10/228,786 US20030005287A1 (en) 1999-10-05 2002-08-26 System and method for extensible positive client identification

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/412,242 Continuation US6442696B1 (en) 1999-10-05 1999-10-05 System and method for extensible positive client identification

Publications (1)

Publication Number Publication Date
US20030005287A1 true US20030005287A1 (en) 2003-01-02

Family

ID=23632215

Family Applications (2)

Application Number Title Priority Date Filing Date
US09/412,242 Expired - Fee Related US6442696B1 (en) 1999-10-05 1999-10-05 System and method for extensible positive client identification
US10/228,786 Abandoned US20030005287A1 (en) 1999-10-05 2002-08-26 System and method for extensible positive client identification

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US09/412,242 Expired - Fee Related US6442696B1 (en) 1999-10-05 1999-10-05 System and method for extensible positive client identification

Country Status (2)

Country Link
US (2) US6442696B1 (en)
GB (1) GB2355322A (en)

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010037377A1 (en) * 2000-04-27 2001-11-01 Yumiko Nakano Information searching apparatus and method
US20040078422A1 (en) * 2002-10-17 2004-04-22 Toomey Christopher Newell Detecting and blocking spoofed Web login pages
US20050188215A1 (en) * 2004-02-20 2005-08-25 Imperva, Inc. Method and apparatus for high-speed detection and blocking of zero day worm attacks
US20050278542A1 (en) * 2004-06-14 2005-12-15 Greg Pierson Network security and fraud detection system and method
US20050289084A1 (en) * 2004-06-25 2005-12-29 The Go Daddy Group, Inc. Method for a Web site with a proxy domain name registration to receive a secure socket layer certificate
US20060031492A1 (en) * 2004-06-25 2006-02-09 The Go Daddy Group, Inc. Automated process for a Web site to receive a secure socket layer certificate
US20060168116A1 (en) * 2004-06-25 2006-07-27 The Go Daddy Group, Inc. Methods of issuing a domain name certificate
US20070143288A1 (en) * 2004-02-23 2007-06-21 Kazutoshi Kichikawa Information processing apparatus, and method for retaining security
US20070288482A1 (en) * 2006-06-09 2007-12-13 Bea Systems, Inc. RFID Workflow Client
US20070288520A1 (en) * 2006-06-09 2007-12-13 Bea Systems, Inc. Workflow improvements
US20080104684A1 (en) * 2006-10-25 2008-05-01 Iovation, Inc. Creating and verifying globally unique device-specific identifiers
US8468598B2 (en) 2010-08-16 2013-06-18 Sap Ag Password protection techniques using false passwords
US8522147B2 (en) 2011-09-20 2013-08-27 Go Daddy Operating Company, LLC Methods for verifying person's identity through person's social circle using person's photograph
US8538065B2 (en) 2011-09-20 2013-09-17 Go Daddy Operating Company, LLC Systems for verifying person's identity through person's social circle using person's photograph
US8676684B2 (en) 2010-04-12 2014-03-18 Iovation Inc. System and method for evaluating risk in fraud prevention
US8738604B2 (en) 2012-03-30 2014-05-27 Go Daddy Operating Company, LLC Methods for discovering sensitive information on computer networks
US8738605B2 (en) 2012-03-30 2014-05-27 Go Daddy Operating Company, LLC Systems for discovering sensitive information on computer networks
US8843752B1 (en) * 2011-01-24 2014-09-23 Prima Cimema, Inc. Multi-factor device authentication
US8925080B2 (en) * 2011-12-20 2014-12-30 Sap Se Deception-based network security using false positive responses to unauthorized access requests
US9141789B1 (en) 2013-07-16 2015-09-22 Go Daddy Operating Company, LLC Mitigating denial of service attacks
US9141669B2 (en) 2013-01-22 2015-09-22 Go Daddy Operating Company, LLC Configuring an origin server content delivery using a pulled data list
US9160809B2 (en) 2012-11-26 2015-10-13 Go Daddy Operating Company, LLC DNS overriding-based methods of accelerating content delivery
US9178888B2 (en) 2013-06-14 2015-11-03 Go Daddy Operating Company, LLC Method for domain control validation
US9286331B2 (en) 2010-05-06 2016-03-15 Go Daddy Operating Company, LLC Verifying and balancing server resources via stored usage data
US9384208B2 (en) 2013-01-22 2016-07-05 Go Daddy Operating Company, LLC Configuring a cached website file removal using a pulled data list
US9438493B2 (en) 2013-01-31 2016-09-06 Go Daddy Operating Company, LLC Monitoring network entities via a central monitoring system
US9521138B2 (en) 2013-06-14 2016-12-13 Go Daddy Operating Company, LLC System for domain control validation
US10339278B2 (en) 2015-11-04 2019-07-02 Screening Room Media, Inc. Monitoring nearby mobile computing devices to prevent digital content misuse
US20190281064A1 (en) * 2018-03-09 2019-09-12 Microsoft Technology Licensing, Llc System and method for restricting access to web resources
US10452819B2 (en) 2017-03-20 2019-10-22 Screening Room Media, Inc. Digital credential system

Families Citing this family (188)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6567920B1 (en) * 1999-03-31 2003-05-20 International Business Machines Corporation Data processing system and method for authentication of devices external to a secure network utilizing client identifier
JP3494590B2 (en) * 1999-06-18 2004-02-09 富士通株式会社 Transmission / reception system and transmission device
US9843447B1 (en) * 1999-09-09 2017-12-12 Secure Axcess Llc Authenticating electronic content
US7051368B1 (en) * 1999-11-09 2006-05-23 Microsoft Corporation Methods and systems for screening input strings intended for use by web servers
US6859805B1 (en) * 1999-11-29 2005-02-22 Actuate Corporation Method and apparatus for generating page-level security in a computer generated report
US7143144B2 (en) * 1999-11-30 2006-11-28 Ricoh Company, Ltd. System, method and computer readable medium for certifying release of electronic information on an internet
US7103598B1 (en) * 2000-03-03 2006-09-05 Micron Technology, Inc Software distribution method and apparatus
US7123700B1 (en) * 2000-04-27 2006-10-17 Nortel Networks Limited Configuring user interfaces of call devices
US6968458B1 (en) * 2000-04-28 2005-11-22 Ian Ruddle Apparatus and method for providing secure communication on a network
GB0017479D0 (en) 2000-07-18 2000-08-30 Bit Arts Ltd Transaction verification
US9928508B2 (en) 2000-08-04 2018-03-27 Intellectual Ventures I Llc Single sign-on for access to a central data repository
GB2368151A (en) * 2000-10-19 2002-04-24 One Zone Networks Determining access privilege to electronic data
US6714970B1 (en) * 2000-10-26 2004-03-30 International Business Machines Corporation Protecting open world wide web sites from known malicious users by diverting requests from malicious users to alias addresses for the protected sites
US6961633B1 (en) * 2000-11-13 2005-11-01 Schneider Automation Inc. Remote monitoring of factory automation users
JP3392828B2 (en) * 2001-01-10 2003-03-31 株式会社東芝 Distributed processing system, drawing command transfer method in the system, and computer-readable storage medium
KR100615470B1 (en) * 2001-05-09 2006-08-25 (주)트라이옵스 Cracker tracing and certification System Using for Web Agent and method thereof
US8701170B1 (en) * 2001-05-11 2014-04-15 Kount Inc. System for secure enrollment and secure verification of network users by a centralized identification service
US7426635B1 (en) * 2001-06-28 2008-09-16 Entrust Technologies Limited Bulk certificate lifetime allocation systems, components and methods
FR2827976B1 (en) * 2001-07-25 2004-01-23 Gemplus Card Int PROTECTION OF PERSONAL DATA READ IN A TERMINAL STATION BY A SERVER
US7320033B2 (en) * 2001-07-27 2008-01-15 Intel Corporation Dynamic local drive and printer sharing
US7555364B2 (en) * 2001-08-22 2009-06-30 MMI Controls, L.P. Adaptive hierarchy usage monitoring HVAC control system
US7076797B2 (en) * 2001-10-05 2006-07-11 Microsoft Corporation Granular authorization for network user sessions
US8473321B2 (en) * 2002-09-25 2013-06-25 Hewlett-Packard Development Company, L.P. Method and apparatus for associating privileges with people in an organization
US7356576B2 (en) * 2002-10-01 2008-04-08 Hewlett-Packard Development Company, L.P. Method, apparatus, and computer readable medium for providing network storage assignments
WO2004040408A2 (en) * 2002-10-25 2004-05-13 Grand Virtual, Inc. Fixed client identification system for positive identification of client to server
US7143095B2 (en) * 2002-12-31 2006-11-28 American Express Travel Related Services Company, Inc. Method and system for implementing and managing an enterprise identity management for distributed security
US20110202565A1 (en) * 2002-12-31 2011-08-18 American Express Travel Related Services Company, Inc. Method and system for implementing and managing an enterprise identity management for distributed security in a computer system
US7263609B1 (en) 2003-04-29 2007-08-28 Cisco Technology, Inc. Method and apparatus for packet quarantine processing over a secure connection
US20050039007A1 (en) * 2003-08-13 2005-02-17 Keith Hoene Multi-function product profile downloading after authentication
US7503067B2 (en) * 2004-02-02 2009-03-10 Toshiba Corporation Preset security levels
US20090288147A1 (en) * 2004-02-02 2009-11-19 Michael Yeung System and method for modifying security functions of an associated document processing device
BRPI0400265A (en) * 2004-03-10 2006-02-07 Legitimi Ltd Requesting device hardware and software subscription-based information service access control system
US8793787B2 (en) 2004-04-01 2014-07-29 Fireeye, Inc. Detecting malicious network content using virtual environment components
US8171553B2 (en) 2004-04-01 2012-05-01 Fireeye, Inc. Heuristic based capture with replay to virtual machine
US8881282B1 (en) 2004-04-01 2014-11-04 Fireeye, Inc. Systems and methods for malware attack detection and identification
US8528086B1 (en) 2004-04-01 2013-09-03 Fireeye, Inc. System and method of detecting computer worms
US7587537B1 (en) 2007-11-30 2009-09-08 Altera Corporation Serializer-deserializer circuits formed from input-output circuit registers
US8584239B2 (en) 2004-04-01 2013-11-12 Fireeye, Inc. Virtual machine with dynamic data flow analysis
US9106694B2 (en) 2004-04-01 2015-08-11 Fireeye, Inc. Electronic message analysis for malware detection
US8549638B2 (en) 2004-06-14 2013-10-01 Fireeye, Inc. System and method of containing computer worms
US8566946B1 (en) 2006-04-20 2013-10-22 Fireeye, Inc. Malware containment on connection
US8898788B1 (en) 2004-04-01 2014-11-25 Fireeye, Inc. Systems and methods for malware attack prevention
US8527752B2 (en) * 2004-06-16 2013-09-03 Dormarke Assets Limited Liability Graduated authentication in an identity management system
US8099600B2 (en) * 2004-08-23 2012-01-17 International Business Machines Corporation Content distribution site spoofing detection and prevention
US7657507B2 (en) * 2007-03-02 2010-02-02 Microsoft Corporation Pseudo-anchor text extraction for vertical search
US8782021B2 (en) * 2007-10-20 2014-07-15 Citrix Systems, Inc. Systems and methods for folder redirection
US8412932B2 (en) * 2008-02-28 2013-04-02 Red Hat, Inc. Collecting account access statistics from information provided by presence of client certificates
US8621641B2 (en) * 2008-02-29 2013-12-31 Vicki L. James Systems and methods for authorization of information access
US8850571B2 (en) 2008-11-03 2014-09-30 Fireeye, Inc. Systems and methods for detecting malicious network content
US8997219B2 (en) 2008-11-03 2015-03-31 Fireeye, Inc. Systems and methods for detecting malicious PDF network content
US9047450B2 (en) 2009-06-19 2015-06-02 Deviceauthority, Inc. Identification of embedded system devices
US9047458B2 (en) 2009-06-19 2015-06-02 Deviceauthority, Inc. Network access protection
US8213907B2 (en) 2009-07-08 2012-07-03 Uniloc Luxembourg S. A. System and method for secured mobile communication
US8832829B2 (en) 2009-09-30 2014-09-09 Fireeye, Inc. Network-based binary file extraction and analysis for malware detection
US8726407B2 (en) 2009-10-16 2014-05-13 Deviceauthority, Inc. Authentication of computing and communications hardware
AU2011100168B4 (en) 2011-02-09 2011-06-30 Device Authority Ltd Device-bound certificate authentication
GB2491101B (en) 2011-04-15 2013-07-10 Bluecava Inc Detection of spoofing of remote client system information
AU2011101295B4 (en) 2011-06-13 2012-08-02 Device Authority Ltd Hardware identity in multi-factor authentication layer
AU2011101297B4 (en) 2011-08-15 2012-06-14 Uniloc Usa, Inc. Remote recognition of an association between remote devices
US10572665B2 (en) 2012-12-28 2020-02-25 Fireeye, Inc. System and method to create a number of breakpoints in a virtual machine via virtual machine trapping events
US9195829B1 (en) 2013-02-23 2015-11-24 Fireeye, Inc. User interface with real-time visual playback along with synchronous textual analysis log display and event/time index for anomalous behavior detection in applications
US9176843B1 (en) 2013-02-23 2015-11-03 Fireeye, Inc. Framework for efficient security coverage of mobile software applications
US8990944B1 (en) 2013-02-23 2015-03-24 Fireeye, Inc. Systems and methods for automatically detecting backdoors
US9367681B1 (en) 2013-02-23 2016-06-14 Fireeye, Inc. Framework for efficient security coverage of mobile software applications using symbolic execution to reach regions of interest within an application
US9009823B1 (en) 2013-02-23 2015-04-14 Fireeye, Inc. Framework for efficient security coverage of mobile software applications installed on mobile devices
US9355247B1 (en) 2013-03-13 2016-05-31 Fireeye, Inc. File extraction from memory dump for malicious content analysis
US9143496B2 (en) 2013-03-13 2015-09-22 Uniloc Luxembourg S.A. Device authentication using device environment information
US9104867B1 (en) 2013-03-13 2015-08-11 Fireeye, Inc. Malicious content analysis using simulated user interaction without user involvement
US9626509B1 (en) 2013-03-13 2017-04-18 Fireeye, Inc. Malicious content analysis with multi-version application support within single operating environment
US9565202B1 (en) * 2013-03-13 2017-02-07 Fireeye, Inc. System and method for detecting exfiltration content
US9430646B1 (en) 2013-03-14 2016-08-30 Fireeye, Inc. Distributed systems and methods for automatically detecting unknown bots and botnets
US9311479B1 (en) 2013-03-14 2016-04-12 Fireeye, Inc. Correlation and consolidation of analytic data for holistic view of a malware attack
US9286466B2 (en) 2013-03-15 2016-03-15 Uniloc Luxembourg S.A. Registration and authentication of computing devices using a digital skeleton key
US10713358B2 (en) 2013-03-15 2020-07-14 Fireeye, Inc. System and method to extract and utilize disassembly features to classify software intent
WO2014145805A1 (en) 2013-03-15 2014-09-18 Mandiant, Llc System and method employing structured intelligence to verify and contain threats at endpoints
US9495180B2 (en) 2013-05-10 2016-11-15 Fireeye, Inc. Optimized resource allocation for virtual machines within a malware content detection system
US9635039B1 (en) 2013-05-13 2017-04-25 Fireeye, Inc. Classifying sets of malicious indicators for detecting command and control communications associated with malware
US10133863B2 (en) 2013-06-24 2018-11-20 Fireeye, Inc. Zero-day discovery system
US9300686B2 (en) 2013-06-28 2016-03-29 Fireeye, Inc. System and method for detecting malicious links in electronic messages
US9690936B1 (en) 2013-09-30 2017-06-27 Fireeye, Inc. Multistage system and method for analyzing obfuscated content for malware
US9294501B2 (en) 2013-09-30 2016-03-22 Fireeye, Inc. Fuzzy hash of behavioral results
US10515214B1 (en) 2013-09-30 2019-12-24 Fireeye, Inc. System and method for classifying malware within content created during analysis of a specimen
US9628507B2 (en) 2013-09-30 2017-04-18 Fireeye, Inc. Advanced persistent threat (APT) detection center
US9171160B2 (en) 2013-09-30 2015-10-27 Fireeye, Inc. Dynamically adaptive framework and method for classifying malware using intelligent static, emulation, and dynamic analyses
US9736179B2 (en) 2013-09-30 2017-08-15 Fireeye, Inc. System, apparatus and method for using malware analysis results to drive adaptive instrumentation of virtual machines to improve exploit detection
US9921978B1 (en) 2013-11-08 2018-03-20 Fireeye, Inc. System and method for enhanced security of storage devices
US9756074B2 (en) 2013-12-26 2017-09-05 Fireeye, Inc. System and method for IPS and VM-based detection of suspicious objects
US9747446B1 (en) 2013-12-26 2017-08-29 Fireeye, Inc. System and method for run-time object classification
US9740857B2 (en) 2014-01-16 2017-08-22 Fireeye, Inc. Threat-aware microvisor
US9262635B2 (en) 2014-02-05 2016-02-16 Fireeye, Inc. Detection efficacy of virtual machine-based analysis with application specific events
US9241010B1 (en) 2014-03-20 2016-01-19 Fireeye, Inc. System and method for network behavior detection
US10242185B1 (en) 2014-03-21 2019-03-26 Fireeye, Inc. Dynamic guest image creation and rollback
US9591015B1 (en) 2014-03-28 2017-03-07 Fireeye, Inc. System and method for offloading packet processing and static analysis operations
US9223972B1 (en) 2014-03-31 2015-12-29 Fireeye, Inc. Dynamically remote tuning of a malware content detection system
US9432389B1 (en) 2014-03-31 2016-08-30 Fireeye, Inc. System, apparatus and method for detecting a malicious attack based on static analysis of a multi-flow object
US9594912B1 (en) 2014-06-06 2017-03-14 Fireeye, Inc. Return-oriented programming detection
US9973531B1 (en) 2014-06-06 2018-05-15 Fireeye, Inc. Shellcode detection
US9438623B1 (en) 2014-06-06 2016-09-06 Fireeye, Inc. Computer exploit detection using heap spray pattern matching
US10084813B2 (en) 2014-06-24 2018-09-25 Fireeye, Inc. Intrusion prevention and remedy system
US9398028B1 (en) 2014-06-26 2016-07-19 Fireeye, Inc. System, device and method for detecting a malicious attack based on communcations between remotely hosted virtual machines and malicious web servers
US10805340B1 (en) 2014-06-26 2020-10-13 Fireeye, Inc. Infection vector and malware tracking with an interactive user display
US10002252B2 (en) 2014-07-01 2018-06-19 Fireeye, Inc. Verification of trusted threat-aware microvisor
US9363280B1 (en) 2014-08-22 2016-06-07 Fireeye, Inc. System and method of detecting delivery of malware using cross-customer data
KR101547999B1 (en) * 2014-09-02 2015-08-27 한국전자통신연구원 Apparatus and method for automatically detecting malicious links
US10671726B1 (en) 2014-09-22 2020-06-02 Fireeye Inc. System and method for malware analysis using thread-level event monitoring
US9773112B1 (en) 2014-09-29 2017-09-26 Fireeye, Inc. Exploit detection of malware and malware families
US10027689B1 (en) 2014-09-29 2018-07-17 Fireeye, Inc. Interactive infection visualization for improved exploit detection and signature generation for malware and malware families
US9690933B1 (en) 2014-12-22 2017-06-27 Fireeye, Inc. Framework for classifying an object as malicious with machine learning for deploying updated predictive models
US10075455B2 (en) 2014-12-26 2018-09-11 Fireeye, Inc. Zero-day rotating guest image profile
US9934376B1 (en) 2014-12-29 2018-04-03 Fireeye, Inc. Malware detection appliance architecture
US9838417B1 (en) 2014-12-30 2017-12-05 Fireeye, Inc. Intelligent context aware user interaction for malware detection
US10148693B2 (en) 2015-03-25 2018-12-04 Fireeye, Inc. Exploit detection system
US9690606B1 (en) 2015-03-25 2017-06-27 Fireeye, Inc. Selective system call monitoring
US9438613B1 (en) 2015-03-30 2016-09-06 Fireeye, Inc. Dynamic content activation for automated analysis of embedded objects
US10474813B1 (en) 2015-03-31 2019-11-12 Fireeye, Inc. Code injection technique for remediation at an endpoint of a network
US10417031B2 (en) 2015-03-31 2019-09-17 Fireeye, Inc. Selective virtualization for security threat detection
US9483644B1 (en) 2015-03-31 2016-11-01 Fireeye, Inc. Methods for detecting file altering malware in VM based analysis
US9654485B1 (en) 2015-04-13 2017-05-16 Fireeye, Inc. Analytics-based security monitoring system and method
US9594904B1 (en) 2015-04-23 2017-03-14 Fireeye, Inc. Detecting malware based on reflection
US10454950B1 (en) 2015-06-30 2019-10-22 Fireeye, Inc. Centralized aggregation technique for detecting lateral movement of stealthy cyber-attacks
US10726127B1 (en) 2015-06-30 2020-07-28 Fireeye, Inc. System and method for protecting a software component running in a virtual machine through virtual interrupts by the virtualization layer
US10642753B1 (en) 2015-06-30 2020-05-05 Fireeye, Inc. System and method for protecting a software component running in virtual machine using a virtualization layer
US11113086B1 (en) 2015-06-30 2021-09-07 Fireeye, Inc. Virtual system and method for securing external network connectivity
US10715542B1 (en) 2015-08-14 2020-07-14 Fireeye, Inc. Mobile application risk analysis
US10176321B2 (en) 2015-09-22 2019-01-08 Fireeye, Inc. Leveraging behavior-based rules for malware family classification
US10033747B1 (en) 2015-09-29 2018-07-24 Fireeye, Inc. System and method for detecting interpreter-based exploit attacks
US10210329B1 (en) 2015-09-30 2019-02-19 Fireeye, Inc. Method to detect application execution hijacking using memory protection
US10817606B1 (en) 2015-09-30 2020-10-27 Fireeye, Inc. Detecting delayed activation malware using a run-time monitoring agent and time-dilation logic
US10706149B1 (en) 2015-09-30 2020-07-07 Fireeye, Inc. Detecting delayed activation malware using a primary controller and plural time controllers
US9825989B1 (en) 2015-09-30 2017-11-21 Fireeye, Inc. Cyber attack early warning system
US10601865B1 (en) 2015-09-30 2020-03-24 Fireeye, Inc. Detection of credential spearphishing attacks using email analysis
US9825976B1 (en) 2015-09-30 2017-11-21 Fireeye, Inc. Detection and classification of exploit kits
US10284575B2 (en) 2015-11-10 2019-05-07 Fireeye, Inc. Launcher for setting analysis environment variations for malware detection
US10846117B1 (en) 2015-12-10 2020-11-24 Fireeye, Inc. Technique for establishing secure communication between host and guest processes of a virtualization architecture
US10447728B1 (en) 2015-12-10 2019-10-15 Fireeye, Inc. Technique for protecting guest processes using a layered virtualization architecture
US10108446B1 (en) 2015-12-11 2018-10-23 Fireeye, Inc. Late load technique for deploying a virtualization layer underneath a running operating system
US10133866B1 (en) 2015-12-30 2018-11-20 Fireeye, Inc. System and method for triggering analysis of an object for malware in response to modification of that object
US10050998B1 (en) 2015-12-30 2018-08-14 Fireeye, Inc. Malicious message analysis system
US10621338B1 (en) 2015-12-30 2020-04-14 Fireeye, Inc. Method to detect forgery and exploits using last branch recording registers
US10565378B1 (en) 2015-12-30 2020-02-18 Fireeye, Inc. Exploit of privilege detection framework
US9824216B1 (en) 2015-12-31 2017-11-21 Fireeye, Inc. Susceptible environment detection system
US11552986B1 (en) 2015-12-31 2023-01-10 Fireeye Security Holdings Us Llc Cyber-security framework for application of virtual features
US10581874B1 (en) 2015-12-31 2020-03-03 Fireeye, Inc. Malware detection system with contextual analysis
US10601863B1 (en) 2016-03-25 2020-03-24 Fireeye, Inc. System and method for managing sensor enrollment
US10671721B1 (en) 2016-03-25 2020-06-02 Fireeye, Inc. Timeout management services
US10785255B1 (en) 2016-03-25 2020-09-22 Fireeye, Inc. Cluster configuration within a scalable malware detection system
US10476906B1 (en) 2016-03-25 2019-11-12 Fireeye, Inc. System and method for managing formation and modification of a cluster within a malware detection system
US10893059B1 (en) 2016-03-31 2021-01-12 Fireeye, Inc. Verification and enhancement using detection systems located at the network periphery and endpoint devices
US10169585B1 (en) 2016-06-22 2019-01-01 Fireeye, Inc. System and methods for advanced malware detection through placement of transition events
US10462173B1 (en) 2016-06-30 2019-10-29 Fireeye, Inc. Malware detection verification and enhancement by coordinating endpoint and malware detection systems
US10592678B1 (en) 2016-09-09 2020-03-17 Fireeye, Inc. Secure communications between peers using a verified virtual trusted platform module
US10491627B1 (en) 2016-09-29 2019-11-26 Fireeye, Inc. Advanced malware detection using similarity analysis
US10795991B1 (en) 2016-11-08 2020-10-06 Fireeye, Inc. Enterprise search
US10587647B1 (en) 2016-11-22 2020-03-10 Fireeye, Inc. Technique for malware detection capability comparison of network security devices
US10552610B1 (en) 2016-12-22 2020-02-04 Fireeye, Inc. Adaptive virtual machine snapshot update framework for malware behavioral analysis
US10581879B1 (en) 2016-12-22 2020-03-03 Fireeye, Inc. Enhanced malware detection for generated objects
US10523609B1 (en) 2016-12-27 2019-12-31 Fireeye, Inc. Multi-vector malware detection and analysis
US10904286B1 (en) 2017-03-24 2021-01-26 Fireeye, Inc. Detection of phishing attacks using similarity analysis
US10848397B1 (en) 2017-03-30 2020-11-24 Fireeye, Inc. System and method for enforcing compliance with subscription requirements for cyber-attack detection service
US10798112B2 (en) 2017-03-30 2020-10-06 Fireeye, Inc. Attribute-controlled malware detection
US10791138B1 (en) 2017-03-30 2020-09-29 Fireeye, Inc. Subscription-based malware detection
US10902119B1 (en) 2017-03-30 2021-01-26 Fireeye, Inc. Data extraction system for malware analysis
US10601848B1 (en) 2017-06-29 2020-03-24 Fireeye, Inc. Cyber-security system and method for weak indicator detection and correlation to generate strong indicators
US10855700B1 (en) 2017-06-29 2020-12-01 Fireeye, Inc. Post-intrusion detection of cyber-attacks during lateral movement within networks
US10503904B1 (en) 2017-06-29 2019-12-10 Fireeye, Inc. Ransomware detection and mitigation
US10893068B1 (en) 2017-06-30 2021-01-12 Fireeye, Inc. Ransomware file modification prevention technique
US10747872B1 (en) 2017-09-27 2020-08-18 Fireeye, Inc. System and method for preventing malware evasion
US10805346B2 (en) 2017-10-01 2020-10-13 Fireeye, Inc. Phishing attack detection
US11108809B2 (en) 2017-10-27 2021-08-31 Fireeye, Inc. System and method for analyzing binary code for malware classification using artificial neural network techniques
US11005860B1 (en) 2017-12-28 2021-05-11 Fireeye, Inc. Method and system for efficient cybersecurity analysis of endpoint events
US11271955B2 (en) 2017-12-28 2022-03-08 Fireeye Security Holdings Us Llc Platform and method for retroactive reclassification employing a cybersecurity-based global data store
US11240275B1 (en) 2017-12-28 2022-02-01 Fireeye Security Holdings Us Llc Platform and method for performing cybersecurity analyses employing an intelligence hub with a modular architecture
US10826931B1 (en) 2018-03-29 2020-11-03 Fireeye, Inc. System and method for predicting and mitigating cybersecurity system misconfigurations
US11003773B1 (en) 2018-03-30 2021-05-11 Fireeye, Inc. System and method for automatically generating malware detection rule recommendations
US11558401B1 (en) 2018-03-30 2023-01-17 Fireeye Security Holdings Us Llc Multi-vector malware detection data sharing system for improved detection
US10956477B1 (en) 2018-03-30 2021-03-23 Fireeye, Inc. System and method for detecting malicious scripts through natural language processing modeling
US11314859B1 (en) 2018-06-27 2022-04-26 FireEye Security Holdings, Inc. Cyber-security system and method for detecting escalation of privileges within an access token
US11075930B1 (en) 2018-06-27 2021-07-27 Fireeye, Inc. System and method for detecting repetitive cybersecurity attacks constituting an email campaign
US11228491B1 (en) 2018-06-28 2022-01-18 Fireeye Security Holdings Us Llc System and method for distributed cluster configuration monitoring and management
US11316900B1 (en) 2018-06-29 2022-04-26 FireEye Security Holdings Inc. System and method for automatically prioritizing rules for cyber-threat detection and mitigation
US11182473B1 (en) 2018-09-13 2021-11-23 Fireeye Security Holdings Us Llc System and method for mitigating cyberattacks against processor operability by a guest process
US11763004B1 (en) 2018-09-27 2023-09-19 Fireeye Security Holdings Us Llc System and method for bootkit detection
US11368475B1 (en) 2018-12-21 2022-06-21 Fireeye Security Holdings Us Llc System and method for scanning remote services to locate stored objects with malware
US11258806B1 (en) 2019-06-24 2022-02-22 Mandiant, Inc. System and method for automatically associating cybersecurity intelligence to cyberthreat actors
US11556640B1 (en) 2019-06-27 2023-01-17 Mandiant, Inc. Systems and methods for automated cybersecurity analysis of extracted binary string sets
US11392700B1 (en) 2019-06-28 2022-07-19 Fireeye Security Holdings Us Llc System and method for supporting cross-platform data verification
US11886585B1 (en) 2019-09-27 2024-01-30 Musarubra Us Llc System and method for identifying and mitigating cyberattacks through malicious position-independent code execution
US11637862B1 (en) 2019-09-30 2023-04-25 Mandiant, Inc. System and method for surfacing cyber-security threats with a self-learning recommendation engine

Family Cites Families (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4796220A (en) * 1986-12-15 1989-01-03 Pride Software Development Corp. Method of controlling the copying of software
JPS63301350A (en) * 1987-06-01 1988-12-08 Hitachi Ltd Preventing system for wrong access of host computer information given from terminal equipment
US4933971A (en) * 1989-03-14 1990-06-12 Tandem Computers Incorporated Method for encrypting transmitted data using a unique key
US5337357A (en) * 1993-06-17 1994-08-09 Software Security, Inc. Method of software distribution protection
US5646992A (en) * 1993-09-23 1997-07-08 Digital Delivery, Inc. Assembly, distribution, and use of digital information
US5758257A (en) 1994-11-29 1998-05-26 Herz; Frederick System and method for scheduling broadcast of and access to video programs and other data using customer profiles
US5761305A (en) 1995-04-21 1998-06-02 Certicom Corporation Key agreement and transport protocol with implicit signatures
US5877765A (en) 1995-09-11 1999-03-02 Microsoft Corporation Method and system for displaying internet shortcut icons on the desktop
US5745568A (en) 1995-09-15 1998-04-28 Dell Usa, L.P. Method of securing CD-ROM data for retrieval by one machine
US5757924A (en) 1995-09-18 1998-05-26 Digital Secured Networks Techolognies, Inc. Network security device which performs MAC address translation without affecting the IP address
US5717756A (en) 1995-10-12 1998-02-10 International Business Machines Corporation System and method for providing masquerade protection in a computer network using hardware and timestamp-specific single use keys
US5857021A (en) 1995-11-07 1999-01-05 Fujitsu Ltd. Security system for protecting information stored in portable storage media
US5764906A (en) 1995-11-07 1998-06-09 Netword Llc Universal electronic resource denotation, request and delivery system
US5771291A (en) 1995-12-11 1998-06-23 Newton; Farrell User identification and authentication system using ultra long identification keys and ultra large databases of identification keys for secure remote terminal access to a host computer
US5835718A (en) 1996-04-10 1998-11-10 At&T Corp URL rewriting pseudo proxy server
JPH09305408A (en) * 1996-05-09 1997-11-28 Hitachi Ltd Application executing method
WO1998002815A1 (en) * 1996-07-12 1998-01-22 Glenayre Electronics, Inc. Apparatus and methods for transmission security in a computer network
US5771287A (en) 1996-08-01 1998-06-23 Transcrypt International, Inc. Apparatus and method for secured control of feature set of a programmable device
US5878143A (en) * 1996-08-16 1999-03-02 Net 1, Inc. Secure transmission of sensitive information over a public/insecure communications medium
US5908469A (en) * 1997-02-14 1999-06-01 International Business Machines Corporation Generic user authentication for network computers
US6023762A (en) * 1997-07-09 2000-02-08 Northern Telecom Limited Multi-view personalized communications agent
US6314521B1 (en) * 1997-11-26 2001-11-06 International Business Machines Corporation Secure configuration of a digital certificate for a printer or other network device
US6134659A (en) * 1998-01-07 2000-10-17 Sprong; Katherine A. Controlled usage software
CA2235359C (en) * 1998-03-23 2012-04-10 Certicom Corp. Implicit certificate scheme with ca chaining
US6169976B1 (en) * 1998-07-02 2001-01-02 Encommerce, Inc. Method and apparatus for regulating the use of licensed products

Cited By (54)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010037377A1 (en) * 2000-04-27 2001-11-01 Yumiko Nakano Information searching apparatus and method
US6925456B2 (en) * 2000-04-27 2005-08-02 Fujitsu Limited Information searching apparatus and method for online award entry
US20040078422A1 (en) * 2002-10-17 2004-04-22 Toomey Christopher Newell Detecting and blocking spoofed Web login pages
US20050188215A1 (en) * 2004-02-20 2005-08-25 Imperva, Inc. Method and apparatus for high-speed detection and blocking of zero day worm attacks
US7752662B2 (en) * 2004-02-20 2010-07-06 Imperva, Inc. Method and apparatus for high-speed detection and blocking of zero day worm attacks
US20070143288A1 (en) * 2004-02-23 2007-06-21 Kazutoshi Kichikawa Information processing apparatus, and method for retaining security
US7574440B2 (en) * 2004-02-23 2009-08-11 Dai Nippon Printing Co., Ltd. Information processing apparatus, and method for retaining security
US8776225B2 (en) * 2004-06-14 2014-07-08 Iovation, Inc. Network security and fraud detection system and method
US9118646B2 (en) * 2004-06-14 2015-08-25 Iovation, Inc. Network security and fraud detection system and method
EP1756994A2 (en) * 2004-06-14 2007-02-28 IO Vation, Inc Network security and fraud detection system and method
EP1756994A4 (en) * 2004-06-14 2009-05-27 Iovation Inc Network security and fraud detection system and method
US20050278542A1 (en) * 2004-06-14 2005-12-15 Greg Pierson Network security and fraud detection system and method
US9203837B2 (en) 2004-06-14 2015-12-01 Iovation, Inc. Network security and fraud detection system and method
JP2008503001A (en) * 2004-06-14 2008-01-31 アイオーベイション・インコーポレーテッド Network security and fraud detection system and method
US7702902B2 (en) * 2004-06-25 2010-04-20 The Go Daddy Group, Inc. Method for a web site with a proxy domain name registration to receive a secure socket layer certificate
US7707404B2 (en) * 2004-06-25 2010-04-27 The Go Daddy Group, Inc. Automated process for a web site to receive a secure socket layer certificate
US20100161971A1 (en) * 2004-06-25 2010-06-24 The Go Daddy Group, Inc. Automated process for a web site to receive a secure socket layer certificate
US8086848B2 (en) 2004-06-25 2011-12-27 The Go Daddy Group, Inc. Automated process for a web site to receive a secure socket layer certificate
US20050289084A1 (en) * 2004-06-25 2005-12-29 The Go Daddy Group, Inc. Method for a Web site with a proxy domain name registration to receive a secure socket layer certificate
US20060031492A1 (en) * 2004-06-25 2006-02-09 The Go Daddy Group, Inc. Automated process for a Web site to receive a secure socket layer certificate
US20060168116A1 (en) * 2004-06-25 2006-07-27 The Go Daddy Group, Inc. Methods of issuing a domain name certificate
US20070288520A1 (en) * 2006-06-09 2007-12-13 Bea Systems, Inc. Workflow improvements
US20070288482A1 (en) * 2006-06-09 2007-12-13 Bea Systems, Inc. RFID Workflow Client
US8120489B2 (en) 2006-06-09 2012-02-21 Oracle International Corporation Workflow improvements
US20080104684A1 (en) * 2006-10-25 2008-05-01 Iovation, Inc. Creating and verifying globally unique device-specific identifiers
US8751815B2 (en) 2006-10-25 2014-06-10 Iovation Inc. Creating and verifying globally unique device-specific identifiers
US8676684B2 (en) 2010-04-12 2014-03-18 Iovation Inc. System and method for evaluating risk in fraud prevention
US9286331B2 (en) 2010-05-06 2016-03-15 Go Daddy Operating Company, LLC Verifying and balancing server resources via stored usage data
US8468598B2 (en) 2010-08-16 2013-06-18 Sap Ag Password protection techniques using false passwords
US8843752B1 (en) * 2011-01-24 2014-09-23 Prima Cimema, Inc. Multi-factor device authentication
US8538065B2 (en) 2011-09-20 2013-09-17 Go Daddy Operating Company, LLC Systems for verifying person's identity through person's social circle using person's photograph
US8522147B2 (en) 2011-09-20 2013-08-27 Go Daddy Operating Company, LLC Methods for verifying person's identity through person's social circle using person's photograph
US8925080B2 (en) * 2011-12-20 2014-12-30 Sap Se Deception-based network security using false positive responses to unauthorized access requests
US8738604B2 (en) 2012-03-30 2014-05-27 Go Daddy Operating Company, LLC Methods for discovering sensitive information on computer networks
US8738605B2 (en) 2012-03-30 2014-05-27 Go Daddy Operating Company, LLC Systems for discovering sensitive information on computer networks
US9160809B2 (en) 2012-11-26 2015-10-13 Go Daddy Operating Company, LLC DNS overriding-based methods of accelerating content delivery
US9141669B2 (en) 2013-01-22 2015-09-22 Go Daddy Operating Company, LLC Configuring an origin server content delivery using a pulled data list
US9384208B2 (en) 2013-01-22 2016-07-05 Go Daddy Operating Company, LLC Configuring a cached website file removal using a pulled data list
US9438493B2 (en) 2013-01-31 2016-09-06 Go Daddy Operating Company, LLC Monitoring network entities via a central monitoring system
US9178888B2 (en) 2013-06-14 2015-11-03 Go Daddy Operating Company, LLC Method for domain control validation
US9521138B2 (en) 2013-06-14 2016-12-13 Go Daddy Operating Company, LLC System for domain control validation
US9141789B1 (en) 2013-07-16 2015-09-22 Go Daddy Operating Company, LLC Mitigating denial of service attacks
US10409964B2 (en) 2015-11-04 2019-09-10 Screening Room Media, Inc. Pairing devices to prevent digital content misuse
US10395011B2 (en) 2015-11-04 2019-08-27 Screening Room Media, Inc. Monitoring location of a client-side digital content delivery device to prevent digital content misuse
US10339278B2 (en) 2015-11-04 2019-07-02 Screening Room Media, Inc. Monitoring nearby mobile computing devices to prevent digital content misuse
US10417393B2 (en) 2015-11-04 2019-09-17 Screening Room Media, Inc. Detecting digital content misuse based on digital content usage clusters
US10423762B2 (en) 2015-11-04 2019-09-24 Screening Room Media, Inc. Detecting digital content misuse based on know violator usage clusters
US10430560B2 (en) 2015-11-04 2019-10-01 Screening Room Media, Inc. Monitoring digital content usage history to prevent digital content misuse
US10460083B2 (en) 2015-11-04 2019-10-29 Screening Room Media, Inc. Digital credential system
US11227031B2 (en) 2015-11-04 2022-01-18 Screening Room Media, Inc. Pairing devices to prevent digital content misuse
US11853403B2 (en) 2015-11-04 2023-12-26 Sr Labs, Inc. Pairing devices to prevent digital content misuse
US10452819B2 (en) 2017-03-20 2019-10-22 Screening Room Media, Inc. Digital credential system
US20190281064A1 (en) * 2018-03-09 2019-09-12 Microsoft Technology Licensing, Llc System and method for restricting access to web resources
US11089024B2 (en) * 2018-03-09 2021-08-10 Microsoft Technology Licensing, Llc System and method for restricting access to web resources

Also Published As

Publication number Publication date
US6442696B1 (en) 2002-08-27
GB0020380D0 (en) 2000-10-04
GB2355322A (en) 2001-04-18

Similar Documents

Publication Publication Date Title
US6442696B1 (en) System and method for extensible positive client identification
US5864683A (en) System for providing secure internetwork by connecting type enforcing secure computers to external network for limiting access to data based on user and process access rights
US9338176B2 (en) Systems and methods of identity and access management
Kesh et al. A framework for analyzing e‐commerce security
US8365266B2 (en) Trusted local single sign-on
GB2355324A (en) Transmitting protected information using a temporary file
Binduf et al. Active directory and related aspects of security
WO2001045341A2 (en) System and method for managing pseudo uniform resource locators in a security system
WO2001044904A2 (en) System and method for providing security for a network site
GB2355904A (en) Providing network site security using pseudo uniform resource locators (PURLs)
WO2003034687A1 (en) Method and system for securing computer networks using a dhcp server with firewall technology
Chadwick Threat modelling for active directory
GB2355905A (en) Providing network site security using pseudo uniform resource locators (PURLs)
WO2001044902A2 (en) System and method for extensible positive client identification
WO2001044903A2 (en) Positive information profiling system
WO2001044901A2 (en) System and method for a virtual page publication system
GB2355323A (en) Information security profile and policy system
Steinauer et al. Basic intrusion protection: the first line of defense
Schultz Human factors and information security
Elifoglu Navigating the" information super highway": How accountants can help clients assess and control the risks of Internet-based E-commerce
Cordis et al. Considerations in Mitigating Kerberos Vulnerabilities for Active Directory
Sravani Information Systems: its Security and Control
Kossakowski et al. Securing public web servers
Colby et al. Security and Personalization
Kossakowski et al. SECURITY IMPROVEMENT MODULE CMU/SEI-SIM-011

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION