WO2004027653A2 - Detection of preselected data - Google Patents

Detection of preselected data

Info

Publication number
WO2004027653A2
Authority
WO
WIPO (PCT)
Prior art keywords
data
content
preselected
database
computing device
Prior art date
Application number
PCT/US2003/030178
Other languages
English (en)
French (fr)
Other versions
WO2004027653A3 (en)
Inventor
Kevin T. Rowney
Michael R. Wolfe
Mythili Gopalakrishnan
Vitali Fridman
Joseph Ansanelli
Original Assignee
Vontu, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US10/247,002 external-priority patent/US8661498B2/en
Priority claimed from US10/431,145 external-priority patent/US7673344B1/en
Priority claimed from US10/607,718 external-priority patent/US8041719B2/en
Application filed by Vontu, Inc. filed Critical Vontu, Inc.
Priority to EP03752596A priority Critical patent/EP1540542A2/de
Priority to JP2004568963A priority patent/JP4903386B2/ja
Priority to AU2003270883A priority patent/AU2003270883A1/en
Priority to CA002499508A priority patent/CA2499508A1/en
Publication of WO2004027653A2 publication Critical patent/WO2004027653A2/en
Publication of WO2004027653A3 publication Critical patent/WO2004027653A3/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/10Office automation; Time management
    • G06Q10/107Computer-aided management of electronic mailing [e-mailing]

Definitions

  • the present invention relates to the field of processing data; more particularly, the present invention relates to detecting preselected (e.g., proprietary) data in information content.
  • Relational structures hold data in a fashion that presents naturally intuitive ways to query the data, and has the added advantage of hiding the details of the underlying disk storage system from the user.
  • the typical applications for database systems involve the storage and retrieval of a large number of smaller pieces of data that can be naturally formatted into a table structure.
  • Relational databases have high utility because the types of queries that most people care about can be optimized using the well-known index structures outlined below.
  • B-trees are an abstract data structure based on the binary tree; B-trees must contain some copies of the data that they index; and B-trees are most efficient using the query examples outlined below.
  • Prefix queries of the form A MATCHES s*, where "s" refers to a specific string value and "s*" is a regular expression, e.g., Last_Name MATCHES "Smith*".
  • Information retrieval is a broad field that deals with the storage and retrieval of textual data found in documents. These systems are different from those of database systems chiefly in their focus on standard documents instead of tabular data. Early examples of this system were developed as part of the SMART system at Cornell.
  • the best-known information retrieval applications are web-based search engines like Google, Inktomi, and AltaVista. The typical way to use these systems is to find a reference to a document that is part of a larger set of digital documents.
  • the user experience for these applications usually consists of a series of queries interleaved with browsing of the results. Results of the queries are presented in order of descending relevance, and the user is able to refine the queries after further browsing.
  • the huge popularity of these systems is due to the ability of the underlying indices to deliver quick responses to the types of queries that people find most useful.
  • concordances that are built up from the collection of documents indexed. These concordances contain a data structure that lists, for each word, the location of each occurrence of that word in each of the documents. Such data structures allow quick lookups of all documents that contain a particular term. For user queries that ask for all documents that contain a collection of terms, the index is structured so that it represents a large number of vectors in Euclidean vector space of high dimension. The user's list of query terms is then also re-interpreted as a vector in this space. The query is run by finding which vectors in the document space are nearest to the query vector. This last approach has a variety of different optimizations applied to it for accuracy and speed, and is called the "cosine metric".
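
As a rough illustration of the cosine-metric ranking described above (a minimal sketch only; real information retrieval indices use concordances, term weighting, and many optimizations not shown here), documents and queries can be treated as term-count vectors and ranked by the cosine of the angle between them:

```python
import math
from collections import Counter

def cosine_similarity(doc_terms, query_terms):
    """Cosine of the angle between two term-count vectors."""
    d, q = Counter(doc_terms), Counter(query_terms)
    dot = sum(d[t] * q[t] for t in q)
    norm = math.sqrt(sum(v * v for v in d.values())) * math.sqrt(sum(v * v for v in q.values()))
    return dot / norm if norm else 0.0

docs = {
    "doc1": "database indices speed up queries on database tables".split(),
    "doc2": "dogs are popular pets and dogs are loyal".split(),
}
query = "database indices".split()
# Results are presented in descending order of relevance, as in the systems described above.
for name, terms in sorted(docs.items(), key=lambda kv: -cosine_similarity(kv[1], query)):
    print(name, round(cosine_similarity(terms, query), 3))
```
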
  • Boolean queries like: a) all documents that contain the terms “database” and “indices” b) all documents that contain “database” or “indices” but not “Sybase”
  • Link-based queries like: a) all documents that are linked to by documents that contain the term "dog" b) the most "popular" (i.e., most linked-to) document that contains the word "dog"
  • One of the first significant implementation projects of information retrieval systems is the SMART system at Cornell. This system contains many of the essential components of information retrieval systems still in use today: C. Buckley, "Implementation of the SMART Information Retrieval System", Technical Report TR85-686, Cornell University, 1985.
  • the WAIS project was an early application of the massively parallel super-computer produced by Thinking Machines Inc. This is one of the first fielded information retrieval systems made available over the Internet. The primary reference source for this work is by Brewster Kahle and Art Medlar: "An Information System for Corporate Users: Wide Area Information Servers," Technical Report TMC-199, Thinking Machines, Inc., April 1991, version 3.19.
  • Google's real breakthrough in search accuracy is its ability to harvest data both from the text of the documents that are indexed and from the hyperlink structure. See Sergey Brin, Lawrence Page, "The Anatomy of a Large-Scale Hypertextual Web Search Engine", http://dbpubs.stanford.edu:8090/pub/1998-8
  • File shingling provides a very quick way to look for similarity between two documents. In order to provide protection to a specific document (e.g., a text file), the document is shingled by hashing the document sentence-by-sentence and storing these hashed sentences in a table for quick lookup.
  • the same hash function is applied to each fragment of the test message to see if the fragments appear in a similar order as they do in the copyrighted content.
  • the technique is quick because the time required to lookup an individual fragment can be very fast.
  • file shingling systems are usually set up to process documents automatically and deliver the query results to a user asynchronously.
  • a typical file shingling application might be spam prevention where a set of messages is used to create an index of restricted content that an organization does not want delivered to its email systems. In this scenario, the "query" is just the automatic processing of email messages and appropriate automatic routing.
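
A minimal sketch of the file shingling idea described above, under the assumption that sentence-level SHA-1 hashes stand in for whatever fragment hashing a real system would use; the helper names are illustrative only:

```python
import hashlib

def shingle(sentences):
    """Hash a protected document sentence-by-sentence into a lookup table."""
    return {hashlib.sha1(s.strip().lower().encode()).hexdigest() for s in sentences}

def matching_fragments(test_sentences, shingle_table):
    """Apply the same hash to each fragment of the test message and report hits."""
    return [s for s in test_sentences
            if hashlib.sha1(s.strip().lower().encode()).hexdigest() in shingle_table]

protected = ["This paragraph is restricted content.", "Do not redistribute it."]
message = ["Hello team.", "Do not redistribute it.", "Thanks."]
print(matching_fragments(message, shingle(protected)))  # -> ['Do not redistribute it.']
```
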
  • Examples of file shingling systems include COPS and SCAM.
  • a variety of commercial applications, referred to as content filtering systems, implement protection measures.
  • the main algorithm currently in use is pattern matching against a set of regular expressions for a collection of text fragments that would indicate data misuse.
  • An example might be to restrict all browsing at URLs that contain the text fragment "XXX”.
  • An example for the email content control category is stopping and blocking all email that contains the words "proprietary” and "confidential” but not the words "joke” or "kidding”.
  • a method and apparatus for detecting pre-selected data stored on a personal computing device comprises monitoring messages electronically transmitted over a network for embedded preselected data and performing content searches on the messages to detect the presence of the embedded preselected data using an abstract data structure derived from the preselected data.
  • Figure 1 illustrates one embodiment of a workflow.
  • Figures 2A and 2B illustrate exemplary modes of operation.
  • Figure 3 is a flow diagram of one embodiment of a process for protecting database data.
  • Figure 4 is a flow diagram of one embodiment of a process for indexing database data.
  • Figure 5 is a flow diagram of one embodiment of a process for searching information content for preselected data.
  • Figure 6 is a flow diagram of one embodiment of a process for finding a match for a subset of content fragments in an abstract data structure derived from preselected data.
  • Figures 7A - 7C are flow diagrams of alternate embodiments of a process for searching an incoming message using a hash table index of preselected data.
  • Figure 8 is a block diagram of an exemplary computer system that may perform one or more of the operations described herein.
  • Figure 9 is a block diagram of one embodiment of a system for client-based protection of pre-selected sensitive data.
  • Figure 10 is a flow diagram of one embodiment of a process for client-based protection of pre-selected sensitive data.
  • a system and methodology is described herein to track and monitor the use of sensitive information anywhere on a personal computing device.
  • this monitoring is implemented by performing content searches of data storage media of a personal computing device such as a desktop computer or a portable computer.
  • the monitoring is implemented by performing content searches on messages as they are transmitted from or received by the personal computing device.
  • the monitoring is implemented by performing content searches before, during, and after the use of potentially sensitive information inside any application running on the personal computing device.
  • the system described herein is able to detect this information in a secure and scalable fashion that is capable of handling large amounts of the database data.
  • Database data may comprise any form of tabular-formatted data stored in a variety of systems including, but not limited to, relational databases, spreadsheets, flat files, etc.
  • This apparatus may be specially constructed for the required purposes, or it may comprise a general purpose computer selectively activated or reconfigured by a computer program stored in the computer.
  • a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, or any type of media suitable for storing electronic instructions, each coupled to a computer system bus.
  • a machine-readable medium includes any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer).
  • a machine-readable medium includes read only memory (“ROM”); random access memory (“RAM”); magnetic disk storage media; optical storage media; flash memory devices; electrical, optical, acoustical or other form of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.); etc.
  • the system to perform the detection scheme described herein consists of two main components: a Policy Management System (PMS) and a Message Monitoring System (MMS).
  • the PMS is responsible for accepting user input that determines information security policies for the use and transmission of data (e.g., database data) that is contained inside messages sent over the network or is stored in data storage media of the personal computing devices such as portable computers, desktop computers, Personal Digital Assistants, cell-phones, etc. This data is, thus, preselected.
  • data storage media of a personal computing device refers to any storage within the personal computing device or accessible to the personal computing device that may store, temporarily or permanently, data for the personal computing device.
  • the MMS is responsible for performing content searches on messages sent over the network, data processed by personal computing devices, or data stored on data storage media of personal computing devices, and is responsible for implementing the policy identified to the PMS by the user.
  • both of these systems are coupled to a computer network that communicates using any of the standard protocols for the exchange of information.
  • a user may decide to implement a given policy that restricts the use or transmission of database data by certain individuals and then manually enters this policy into the PMS using a graphical user interface and one or more user input devices (e.g., a mouse, a keyboard, etc.).
  • the user interface receives the input and may be running on a computer system with the PMS or on a separate machine.
  • An example policy might be to stop a given group of individuals in customer service from saving a data file containing pre-selected data to a removable media device attached to a personal computing device.
  • the policy includes the nature of protection desired (e.g., restrict only a subset of employees), the type of data that requires protection (e.g., database data), and the network location (e.g., database table name, IP address of server, server or file name) of the database data that requires protection. Again, all of this information may be specified using a standard graphical user interface that prompts the user to enter the specific information in the correct fields.
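
Purely as an illustration of the kind of policy record described above (the field names and values below are hypothetical and not taken from the patent), a policy entered through the PMS might carry the nature of protection, the type of data, and the network location of that data:

```python
# Hypothetical representation of a single policy record entered through the PMS
# user interface; every field name here is illustrative, not taken from the patent.
example_policy = {
    "protection": {"restricted_group": "customer-service"},  # nature of protection desired
    "data_type": "database",                                 # type of data requiring protection
    "location": {                                            # network location of that data
        "server_ip": "10.0.0.25",
        "database_table": "customers",
    },
    "action": "block",                                       # e.g., intercept, re-route, or log
}
print(example_policy["location"]["database_table"])
```
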
  • the PMS queries the database and extracts copies of the database data that is to be protected and derives from that data an abstract data structure (hereafter called the "index") that is described in more detail below.
  • the PMS then sends this index, along with the particulars on the policy that is to be implemented, to the MMS so that it can begin to enforce that policy.
  • the MMS receives the index from the PMS together with the details on the policy to be enforced.
  • the MMS uses the index and the policy information to enforce the policy specified by the user. In one embodiment, the MMS uses this index to search each of the outgoing messages (e.g., email messages, web mail messages, etc.) for the database data that is to be protected, as will be discussed in greater detail below.
  • the MMS uses this index to search contents of data storage media of a personal computing device and/or the content of interactions between the user and the personal computing device for the database data that is to be protected, as will be discussed in more detail below.
  • the Message Monitoring System can be configured in one of two ways: “surveillance mode", and “enforcement mode”.
  • Figure 2 illustrates two network configurations.
  • In "surveillance mode", the MMS is placed somewhere on the network where it can watch traffic and report on violations of policy, but it is specifically not configured to block messages as they leave. This is shown in Figure 2A, where the PMS has access to information.
  • the PMS is coupled to the Internet via a switch, a tap and a firewall.
  • the MMS monitors the network messages using the tap.
  • In "enforcement mode", the MMS is able to watch traffic and report on violations, but it can also intercept and re-route messages so that their ultimate destination is changed.
  • the MMS monitors traffic using a series of servers and re-routes traffic to, for example, certain servers, if the MMS determines messages are likely to contain preselected information.
  • the MMS may use different servers for each of the various layer protocols.
  • Message re-routing is not mandatory.
  • the MMS can be configured to just intercept and stop the outgoing message.
  • An example policy in "enforcement mode" would be to route all messages that violate a policy to the manager of the person that violates the policy so that appropriate disciplinary action can take place.
  • the MMS is actively parsing messages that are transported using various application layer protocols (e.g., SMTP, HTTP, FTP, AIM, ICQ, SOAP, etc.).
  • the two subsystems run on one system; that is, the PMS and MMS may be incorporated into the same physical or logical system. This consolidated configuration is more appropriate for reasons of controlling the cost of goods required to produce the system.
  • the PMS and MMS may not necessarily reside on the same LAN.
  • the PMS may reside on the same LAN as the database information, but the MMS may reside on a different LAN that is separated from the LAN on which PMS resides.
  • the two distinct LANs may ultimately be coupled together via the Internet but separated by firewalls, routers, and/or other network devices.
  • FIG. 3 is a flow diagram of one embodiment of a process for protecting database data.
  • the process is performed by processing logic that may comprise hardware (circuitry, dedicated logic, etc.), software (such as is run on a general purpose computer system or a dedicated machine), or a combination of both.
  • processing logic monitors messages for pre-selected data (processing block 301). Next, processing logic determines whether a message has pre-selected data (processing block 302). If not, processing transitions to processing block 301. If so, processing logic determines if the individual sending/receiving the message is authorized to send/receive the information in the message (processing block 303). If so, the process ends and processing transitions to processing block 301. If not, processing logic takes one or more actions such as intercepting the message, re-routing the message, logging the message, etc. (processing block 304), and processing transitions to processing block 301.

Client-Based Mode of Operation
  • the client-based mode of operation is directed to monitoring actions taken by a user of a personal computing device to detect user operations that may involve a potential misuse of data. These user operations may include, for example, saving or accessing restricted database data on any storage device on the computing system, using restricted database data in an application, printing restricted database data, using restricted database data in any network communication protocol, etc.
  • the monitoring of user actions is performed by parsing and searching the content that is either accessed or saved onto the local storage system of the personal computing device, or transported using various application layer protocols (e.g., SMTP, HTTP, FTP, AIM, ICQ, SOAP, etc.).
  • the monitoring of user actions is performed by intercepting and interpreting the data exchanged between the user and the personal computing device.
  • Figure 9 is a block diagram of one embodiment of a system for client- based protection of pre-selected sensitive data.
  • a server 902 communicates with client computers 910 over a network 906.
  • the network 906 may be a private network (e.g., a local area network (LAN)) or a public network (e.g., a wide area network (WAN)).
  • the clients 910 are computers belonging to different employees within an organization. Each client 910 may be, for example, a desktop computer, a portable computer (e.g., a laptop), or any other computer that may operate with intermittent network connectivity.
  • a content monitoring system (also referred to herein as message monitoring system or MMS) 912 resides on each client 910 and is responsible for searching contents of data storage media of this client for pre-selected sensitive data and for intercepting and interpreting content exchanged between the user and the client 910.
  • the data storage media may include, for example, a main memory, a static memory, a mass storage memory (e.g., a hard disk), or any other storage device that may store, temporarily or permanently, files or other documents for the client computer.
  • the MMS 912 monitors specific data operations such as file-reads, file- writes, file-updates, and read and writes to removable media devices (e.g., floppy drives, universal serial bus (USB) devices, compact disk recordable (CDR) drives, etc.).
  • the operation of the MMS 912 facilitates the prevention of sensitive data loss via removable and mobile devices.
  • the operation of the MMS 912 may prevent the escape of sensitive data that occurs if the user copies the sensitive data stored on the client 910 to a floppy disk, moves a file with the sensitive data to a USB-based removable memory device, prints or emails the sensitive data from the laptop or desktop computer, uses the sensitive data in an unauthorized application, etc.
  • the server 902 is responsible for configuring the detection scheme described herein within the organization.
  • the server 902 contains a PMS 904 and a message collector 914.
  • the PMS 904 maintains a set of security policies controlling the use of sensitive data.
  • the set of security policies may identify employees whose computers need to be monitored for a potential misuse of sensitive data, specify the sensitive data for which searches are to be performed, and define the scope of the searches (e.g., specific storage medium, data operations, etc.). Based on this information, the PMS 904 instructs each MMS 912 as to whether a corresponding client 910 is to be searched and sends the index that is to be used for searching.
  • the index is derived from the specific sensitive data that is pre-selected for one or more clients 910 based on the security policies.
  • the message collector 914 is responsible for collecting messages received from the MMSes 912 that notify of data misuses by the users of the clients 910.
  • each MMS 912 can operate in a stand-alone fashion when it cannot maintain network contact with the server 902 (e.g., if a laptop 910 is taken home for the weekend, moved to another network, stolen, etc.). For example, if the user disconnects the laptop 910 from the network 906, the MMS 912 running on the laptop 910 may perform periodic content searches of the data storage media of the laptop 910 while the user works on the laptop at home.
  • the MMS 912 may search the local file system of the laptop 910, an email message archive, etc.
  • the MMS 912 may monitor specific data operations (e.g., file-reads, file-writes, file-updates, and reads and writes to removable media devices such as floppy drives) if instructed by the PMS 904.
  • If the MMS 912 detects the pre-selected data on any data storage medium of the client 910, it creates a message containing a notification of the detection of the pre-selected data and places this message into a transmission queue. Subsequently, when network connectivity is re-established, messages from the transmission queue are sent to the message collector 914.
  • the policies maintained by the PMS 904 require that the MMS 912 prevent access to the pre-selected data once the pre-selected data is detected.
  • FIG 10 is a flow diagram of one embodiment of a process for personal computing device-based protection of pre-selected sensitive data.
  • the process is performed by processing logic that may comprise hardware (circuitry, dedicated logic, etc.), software (such as is run on a general purpose computer system or a dedicated machine), or a combination of both.
  • processing logic resides on a personal computing device such as a client 910.
  • processing logic receives instructions defining the scope of content searches that are to be performed on the personal computing device (processing block 1002).
  • the instructions identify data storage media that are required to be searched and the periodicity of the search.
  • the instructions also identify the data operations that are to be monitored for the presence of pre-selected sensitive data.
  • processing logic receives an abstract data structure or index derived from the pre-selected sensitive data (processing block 1004). Some embodiments of the abstract data structure are discussed in greater detail below.
  • processing logic searches the contents of the data storage media of the personal computing device for the pre-selected sensitive data using the abstract data structure.
  • the scope of the content search is defined by the instructions received from the server.
  • the searching may apply to contents of data storage media of this personal computing device data and/or content exchanged between the user and the personal computing device.
  • the content search is performed periodically, at predefined time intervals.
  • processing logic searches volatile memory devices to detect use of pre-selected data by applications running on the personal computing device. Once such use is detected, processing logic identifies the application using the pre-selected data.
  • If processing logic detects the presence of the pre-selected data (or a portion of it) (processing box 1008), it then determines whether policies maintained by the PMS require blocking of access to the pre-selected data (processing box 1009). In one embodiment, access to the detected data is blocked for the application attempting to access this data.
  • processing logic blocks the access to the preselected data (processing block 1010) and further determines whether the personal computing device can maintain network contact with the server or any other designated device (processing box 1011). If this determination is positive, processing logic sends a message containing a notification of the detection to the server (processing block 1012).
  • the notification may identify the personal computing device and the detected data. In one embodiment, the notification identifies the application that was using the pre-selected data when running on the personal computing device.
  • processing logic places this message into a queue for future transmission to the server when the network connectivity is re-established (processing block 1014).
  • the personal computing device-based monitoring allows for surveillance of content stored on and processed by a personal computing device.
  • the content searching described herein addresses the problem of specifically searching for traces of pre-selected database data inside a file-system, memory bank, or data in the process of being accessed by an application.
  • the client-based protection of sensitive data monitors content stored in the personal computing device after this content has been downloaded or otherwise accessed through an access control system.
  • Desktop-based encryption/decryption packages, which typically rely on server-based mechanisms to encrypt the data and desktop-based mechanisms to decrypt it for viewing, help prevent the misuse of data by restricting access to the cryptographic keys that decrypt the data.
  • the client-based protection of sensitive data that is described herein can be used to protect the data that is left "in the clear" outside of a cryptographic envelope and is, therefore, vulnerable to theft by third parties.
  • the client-based protection of sensitive data that is described herein is directed to detecting the presence of pre-selected database data, and not the presence of the hidden code.
  • While driver-filters, which are forms of software written to drive the operation of hardware, can monitor all content sent to the personal computing device using a content filter, they lack the ability to perform searches of data storage media of the personal computing device for pre-selected database data.
  • the security properties of this system are paramount.
  • the chief objective of this system is to enforce security policies that are pertinent to database data. This implies that the system must be very secure in the manner in which it handles database data. If, in the process of protecting the database data, the system opens up new avenues to steal database data, then its ultimate purpose is defeated.
  • the MMS is deployed in such a way as to monitor and/or block the largest number of messages flowing through the network.
  • the MMS may be installed either behind or in front-of one of these points of concentration on the network.
  • Such placement of the system affords it an exceptional view of messages and increases its utility for the organization using the system.
  • such placement also makes the MMS highly vulnerable to network-based attacks (commonly called "hacking") in which a third party uses unauthorized network access to violate the security perimeter surrounding the network to steal the data contained inside the network.
  • the MMS is deployed locally on a personal computing device and is responsible for performing surveillance on the use of local storage media, on the use of classified data by applications running on the personal computing device, and on network communications to and from the device.
  • Such placement of the system affords it an exceptional view of the information accessed and used by the person operating the computing device and increases its utility for the organization using the system.
  • Such placement makes the MMS vulnerable to "hacking" attacks by the same employees who are being monitored by the MMS.
  • the PMS's security concerns are also high in that its software directly queries the information sources in order to build the index that the MMS utilizes.
  • the placement of the MMS on the network, in one embodiment, or on a personal computing device, in another embodiment, makes it exposed to attacks.
  • These attacks can come, in one embodiment, from inside the Local Area Network (LAN) or from outside the LAN via the WAN and/or Internet link that the organization maintains.
  • the attacks can come from users of a personal computing device.
  • the specific security concern here is that the MMS may contain valuable database data from the relational database that it is trying to protect.
  • hackers or users of personal computing devices may try to steal the data from the MMS instead of trying to steal it from the more-thoroughly guarded computer on which the relational database actually runs.
  • a second and related security concern for the application arises in the case when the MMS is deployed at a different LAN from that in which the PMS is deployed. As mentioned above, this may be an important configuration to help implement security policy across two organizations that share database data. Here again, the information stored in the MMS is subjected to information security threats.
  • Various embodiments treat these security threats directly.
  • One novel aspect of the embodiments described herein is that the PMS and MMS exchange indices that contain no copies of the data they seek to protect.
  • the PMS sends abstract data structures derived from the database data to the MMS so that it can enforce policy.
  • One possible approach to achieve this protection is to simply copy the database into the MMS, or (equivalently from a security perspective) allow the MMS to directly query the database in order to check that the content is consistent with policy.
  • the problem with this approach is that it introduces significant security vulnerabilities where there were none before. In this insecure approach, the cure is worse than the disease.
  • the PMS creates an index from the database that contains no copies of the database data, or contains only encrypted or hashed copies of database data.
  • Such an index may be created using a tuple-storage mechanism that provides a data structure for storing multiple tuples associated with fragments of the database data.
  • Examples of the tuple-storage mechanism include a hash table, a vector, an array, a tree, a list, or a table in a relational database management system.
  • the data stored in the indices only retains the relative placement of the elements in the database in relation to other elements.
  • the index may store, for each fragment of the database data (e.g., a data fragment inside a database cell), the fragment's hash code together with its row number, column number and type of the column.
  • the system still carefully avoids storing copies of data of less-common terms of higher value (e.g., credit card numbers, SSNs, uncommon names, etc.).
  • the system avoids storing any copies of sensitive information by storing only hash codes and tuples of information related to the placement of cells in the database.
  • the process of preselected data detection includes two major operations, or phases: indexing, and searching.
  • In the indexing phase, the system builds indices from the preselected data.
  • the preselected data may be any data whose relations allow it to be structured in a tabular format.
  • the preselected data may be stored in a tabular format (e.g., data in a relational database, data in an Excel spreadsheet, etc.) or it may be stored in a non-tabular format but have such relationships as to allow it to be stored in a tabular format (e.g., data stored as comma separated values in a flat file or a password database, relational data in an object-oriented database, etc.).
  • FIG. 4 is a flow diagram of one embodiment of a process for indexing the preselected data.
  • the process is performed by processing logic that may comprise hardware (circuitry, dedicated logic, etc.), software (such as is run on a general purpose computer system or a dedicated machine), or a combination of both.
  • processing logic begins with determining whether the preselected data is stored in a standard tabular format (processing box 402). If not, processing logic converts the preselected data into a standard tabular format (processing block 404). Each cell in the resulting table stores a fragment of the preselected data. In one embodiment, each data fragment is a token.
  • a token may be a single word or a cluster of words (e.g., words enclosed in quotation marks). For example, while a single word may represent a token stored in a database cell, a multi-word phrase may also represent a standalone token if it is stored as a single string in a database cell.
  • processing logic creates a tuple-storage structure derived from the preselected data (processing block 406).
  • a tuple-storage structure provides a mechanism for storing multiple tuples associated with the fragments of the preselected data.
  • Examples of tuple-storage structures include a hash table, a vector, an array, a tree or a list. Each type of the tuple-storage structure is associated with a method for retrieving a set of tuples for any given content fragment (the set of tuples may be empty if no match is found in the tuple-storage structure).
  • processing logic stores information about the position of each data fragment within the database in a corresponding tuple (processing block 408).
  • the information about the position of a data fragment includes the number of a row storing the data fragment in the database. In another embodiment, this information also includes the number of a column storing the data fragment in the database and, optionally, the data type of the column.
  • processing logic sorts the tuples in a predetermined order (e.g., in ascending lexicographic order) (processing block 410).
  • the resulting abstract data structure (i.e., the index) only contains information about the relative placement of data records in the context of the larger whole but does not include any fragments of the preselected data itself.
  • the contents of the index are treated cryptographically
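
A simplified sketch of the indexing phase of Figure 4, assuming the preselected data is already in standard tabular form; the hash function, tuple layout, and sort order are placeholder choices for illustration, not the patent's prescribed ones. Note that the resulting index holds only hash codes and (row, column, type) tuples, never the cell contents themselves:

```python
import hashlib
from collections import defaultdict

def hash_token(token):
    # Placeholder one-way hash of a cell fragment; a real system would pick its own.
    return hashlib.sha256(token.strip().lower().encode()).hexdigest()

def build_index(table, column_types):
    """Build a tuple-storage structure from tabular preselected data: for each cell
    fragment, store a (row number, column number, column type) tuple under the
    fragment's hash code. No plaintext fragments are kept in the index."""
    index = defaultdict(list)
    for row_num, row in enumerate(table):
        for col_num, cell in enumerate(row):
            index[hash_token(cell)].append((row_num, col_num, column_types[col_num]))
    for tuples in index.values():
        tuples.sort()  # sort the tuples in a predetermined order
    return dict(index)

table = [
    ["Smith", "John", "123-45-6789"],
    ["Jones", "Mary", "987-65-4321"],
]
index = build_index(table, ["last_name", "first_name", "ssn"])
```
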
  • FIG 5 is a flow diagram of one embodiment of a process for searching information content for preselected data.
  • the process is performed by processing logic that may comprise hardware (circuitry, dedicated logic, etc.), software
  • processing logic begins with receiving information content (processing block 502).
  • the information content may be included in a file (e.g., an archived email message stored on a hard drive of a computer) or in a block of data transmitted over a network (e.g., an email message transmitted over a network using any type of a network protocol).
  • processing logic detects in the information content a sequence of content fragments that may possibly contain a portion of preselected data (processing block 504).
  • the preselected data may be proprietary database data that needs to be protected or any other kind of data that has an inherent tabular structure. That is, the preselected data may either be stored in a tabular format (e.g., data in a relational database, data in an Excel spreadsheet, etc.) or it may be stored in a non-tabular format but have such relations as to allow it to be stored in a tabular format (e.g., data stored as comma separated values in a flat file or a password database, relational data in an object-oriented database, etc.).
  • the detected sequence of content fragments is a set of adjacent tokens within the information content. Each token may correspond to either a word or a phrase. The detected sequence of content fragments may be a portion of the received information content or the entire information content.
  • processing logic decides that a sequence of content fragments may possibly contain a portion of the preselected data upon determining that the sequence of content fragments resembles column-formatted data. This determination may be made by parsing the received information content to identify separated lines (as may be indicated, for example, by tags such as <cr> or <cr><lf>) and finding that these separated lines contain a similar number of tokens and, optionally, similar data types of the tokens.
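
A rough sketch of the column-formatted-lines heuristic just described (the token typing below is deliberately coarse and purely illustrative): content is split into lines, and adjacent lines whose tokens agree in count and type are flagged as resembling column-formatted data:

```python
def token_type(tok):
    # Deliberately coarse lexical typing, for illustration only.
    if tok.replace("-", "").isdigit():
        return "number"
    if "@" in tok:
        return "email"
    return "word"

def line_signature(line):
    tokens = line.split()
    return len(tokens), tuple(token_type(t) for t in tokens)

def lines_resembling_columns(content):
    """Return indices of lines whose neighbors have the same token count and types."""
    lines = [l for l in content.splitlines() if l.strip()]
    sigs = [line_signature(l) for l in lines]
    return [i for i in range(len(lines))
            if (i > 0 and sigs[i] == sigs[i - 1])
            or (i + 1 < len(sigs) and sigs[i] == sigs[i + 1])]

sample = "Smith John 123-45-6789\nJones Mary 987-65-4321\nRegards, Bob"
print(lines_resembling_columns(sample))  # -> [0, 1]
```
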
  • processing logic decides that a sequence of content fragments may possibly contain a portion of the preselected data upon parsing the entire information content and searching blocks of contiguous tokens for preselected data.
  • the blocks of contiguous tokens are defined based on user- specified parameters such as a user-specified width of each block and a user-specified position of each block within the information content (e.g., the user may require that the two adjacent blocks be separated by a certain number of tokens).
  • processing logic decides that a sequence of content fragments may possibly contain a portion of the preselected data upon finding in the information content an expression of a predefined format.
  • Such expression may be, for example, an account number, a social security number, a credit card number, a phone number, a postal code, an email address, text formatting indicating a monetary or numeric value (e.g., "$" signs together with digits), etc.
  • processing logic decides that a region of text surrounding the expression may possibly contain a portion of the preselected data. The size of this region may be defined by a predetermined number of tokens on each side of the found expression.
  • processing logic decides that a sequence of content fragments may possibly contain a portion of the preselected data upon determining that the word usage or the word distribution in the information content (or in some portion of the information content) resembles a statistical pattern that indicates a possible containment of the preselected data in the information content.
  • processing logic decides that a sequence of content fragments may possibly contain a portion of the preselected data upon determining that certain properties associated with the received information content indicate a possible containment of the preselected data in the information content based on the history of previous violations.
  • these properties may include, for example, the destination of the information content (e.g., a recipient of an electronic message), the origin of the information content, the time of transmission associated with the information content, the size of transmission associated with the information content, the types of files contained in the transmission (e.g., multipurpose Internet mail extension (MIME) types of files), etc.
  • the history of previous violations is maintained by identifying, for each detection of preselected data, the properties of the information content in which the preselected data was detected and recording these properties in a previous violation database. Subsequently, when processing logic decides whether a sequence of content fragments within the new information content may possibly contain a portion of preselected data, processing logic identifies the properties of the new information content and searches the previous violation database for these properties.
  • processing logic determines whether the previous violations associated with the matching property indicate a possible containment of preselected data in the new information content. This indication may be based on the number of previous violations associated with the matching property or the frequency of previous violations associated with the matching property. For example, this indication may be based upon the total number of violations that a particular sender has committed, or the frequency of those violations over a given time period.
  • Afterwards, upon detecting a sequence of content fragments that may possibly contain a portion of the preselected data, processing logic makes a determination as to whether any subset of these content fragments matches a subset of the preselected data (processing block 506). This determination is made using an index (also referred to herein as an abstract data structure) that defines the tabular structure of the preselected data.
  • FIG. 6 is a flow diagram of one embodiment of a process for finding a match for a subset of content fragments in an abstract data structure derived from preselected data.
  • the process is performed by processing logic that may comprise hardware (circuitry, dedicated logic, etc.), software (such as is run on a general purpose computer system or a dedicated machine), or a combination of both.
  • processing logic begins with parsing the sequence of content fragments identified at processing block 504 of Figure 5 into content fragments (e.g., tokens). Then, for each content fragment, processing logic searches the abstract data structure for a set of matching tuples (processing block 602).
  • a word "Smith" contained in the information content may have several occurrences in the preselected data that are reflected in the abstract data structure. Specifically, each of these occurrences has a corresponding tuple in the abstract data structure.
  • processing logic retrieves a set of tuples corresponding to the occurrences of the word "Smith" in the preselected data.
  • Each tuple stores information about the position of this data fragment within a database or a table storing the preselected data.
  • the positional information includes the row number of a cell storing the data fragment.
  • the positional information also includes a column number of this cell and optionally the data type of the column.
  • processing logic combines the matching tuple sets found for all the content fragments (processing block 604) and then groups the combined matching tuple sets by row numbers into groups L (processing block 606).
  • each group L (referred to herein as an accumulator) contains matching tuple sets that all have the same row number, i.e., the matching tuple sets in each group L correspond to fragments of the preselected data that all appear to be from the same row in the database.
  • processing logic sorts the groups L by the number of matching tuple sets contained in each group (processing block 608) and, in one embodiment, selects those groups that have tuple sets with distinct column numbers (processing block 610).
  • processing logic determines whether any of the selected groups has a sufficiently large number of matching tuple sets (processing block 612). For example, if the number of matching tuple sets in one group exceeds "3", then there is a high likelihood that the information content does include data from four or more columns of the same row in the database.
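
Continuing in the same illustrative vein, the match-finding of Figure 6 might look roughly like the sketch below (the hash function, the tiny inline index, and the threshold of three distinct columns are all assumptions for the example): matching tuples are grouped into accumulators by row number, and a group covering enough distinct columns signals a likely database record:

```python
import hashlib
from collections import defaultdict

def hash_token(token):
    # Same placeholder hash as in the indexing sketch above.
    return hashlib.sha256(token.strip().lower().encode()).hexdigest()

def find_row_matches(content_fragments, index, min_columns=3):
    """Group matching tuples by row number ("accumulators") and report rows that
    are covered by tokens from a sufficiently large number of distinct columns."""
    accumulators = defaultdict(set)  # row number -> distinct column numbers seen
    for token in content_fragments:
        for row_num, col_num, _col_type in index.get(hash_token(token), []):
            accumulators[row_num].add(col_num)
    return {row: cols for row, cols in accumulators.items() if len(cols) >= min_columns}

# Tiny index for a single database row: {hash(fragment): [(row, column, type), ...]}
index = {
    hash_token("Smith"): [(0, 0, "last_name")],
    hash_token("John"): [(0, 1, "first_name")],
    hash_token("123-45-6789"): [(0, 2, "ssn")],
}
tokens = "please update Smith John 123-45-6789 in the system".split()
print(find_row_matches(tokens, index))  # -> {0: {0, 1, 2}}
```
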
  • FIGS 7A - 7C are flow diagrams of alternate embodiments of a process for searching an incoming message using a hash table index of preselected data.
  • the process is performed by processing logic that may comprise hardware (circuitry, dedicated logic, etc.), software (such as is run on a general purpose computer system or a dedicated machine), or a combination of both.
  • processing logic begins with parsing an incoming message (processing block 702).
  • processing logic determines whether the parsed portions of the incoming message contain column-formatted data (processing box 704).
  • lexical analysis may be used to identify lines in the parsed portions of the incoming message (e.g., by finding tags such as <cr> or <cr><lf> that are used to separate lines) and then detecting that the tokens found in adjacent lines are identical in number and in type.
  • processing logic stores the type of each token, along with the total number of tokens.
  • processing transitions to processing block 702. Otherwise, processing transitions to processing block 706 where processing logic sets i equal to the first line that resembles column- formatted data.
  • processing logic applies a hash function H(k) to each token in line i (processing block 708), finds a set of tuples at H(k) in the hash table for each token in line i, adds the tuples to list L, and regroups list L into a set of accumulators (processing block 712) in which each individual accumulator's tuples have the same row number value. Further, processing logic sorts that list L by the length of each Ai (processing block 714) and checks for unique occurrences of columns in sorted list L (processing block 716). At processing block 710, optional pre-processing logic may be performed to filter the tokens before insertion into list L so that only those tuples with a type matching the lexical type of the original token k are added to L.
  • tuples are simple "singletons" containing row numbers only (i.e., no column number and no type indicator).
  • Afterwards, if the incoming message contains more lines that resemble column-formatted data (processing box 718), processing logic increments i to the next line that resembles column-formatted data (processing block 722) and the process transitions to processing block 706. Otherwise, processing logic reports lines of text with Ai that exceed the predetermined size and have unique column numbers (processing block 720).
  • processing logic begins with receiving user- specified parameters of "width” (W) and "jump” (J) (processing block 732) and parsing an incoming message (processing block 734).
  • W specifies the number of contiguous tokens in each block of contiguous tokens that is to be searched during a single iteration
  • J specifies the required number of tokens between the two adjacent blocks.
  • processing logic sets the value of the location variable (St) to zero
  • processing logic applies a hash function H(k) to each token in the textblock (processing block 740), finds a set of tuples at H(k) in the hash table for each token in the textblock, adds the tuples that have the same type as the corresponding tokens in the textblock to list L (processing block 742), regroups list L into a set of accumulators (processing block 744), sorts that list L by the length of each Ai (processing block 746), and checks for unique occurrences of columns in sorted list L (processing block 748).
  • Afterwards, processing logic increments St by J number of tokens
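
The width/jump iteration of Figure 7B might be sketched as follows (a simplified illustration of the W and J parameters only, following the flow above in which St is incremented by J; each yielded block would then be searched against the index as in the earlier sketches):

```python
def iterate_textblocks(tokens, width, jump):
    """Yield blocks of `width` contiguous tokens, advancing the start position St
    by `jump` tokens between iterations, per the user-specified W and J values."""
    if width <= 0 or jump <= 0:
        raise ValueError("width and jump must be positive")
    start = 0
    while start < len(tokens):
        yield tokens[start:start + width]
        start += jump

message = "one two three four five six seven eight nine ten".split()
for block in iterate_textblocks(message, width=4, jump=3):
    print(block)  # each block would be searched against the index as in Figure 7A
```
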
  • processing logic begins with parsing an incoming message (processing block 764) and looking for a first expression having a user-specified format (processing block 766).
  • Such expression may be, for example, an account number, a social security number, a credit card number, text formatting indicating a monetary or numeric value (e.g., "$" signs together with digits), etc.
  • the process transitions to processing block 764. Otherwise, the process transitions to processing block 768 where processing logic defines a block ("textblock") to be searched by collecting W contiguous tokens before and after the matching expression.
  • the textblock may consist of 10 tokens immediately preceding the matching expression, the matching expression itself and 10 tokens immediately following the matching expression.
  • processing logic applies a hash function H(k) to each token in the textblock (processing block 770), finds a set of tuples at H(k) in the hash table for each token in the textblock, adds the tuples that have the same type as the corresponding tokens in the textblock to list L (processing block 772), regroups list L into a set of accumulators (processing block 774), sorts that list L by the length of each Ai (processing block 776) and checks for unique occurrences of columns in sorted list L (processing block 778).
  • processing logic determines whether the message has anymore expressions of the user-specified format (processing box 780). If this determination is positive, the process transitions to processing block 768. Otherwise, processing logic reports textblocks with Ai that exceed the predetermined size and have unique column numbers (processing block 782).
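
A rough sketch of the expression-anchored variant of Figure 7C; the regular expression below is a hypothetical stand-in for whatever user-specified formats a deployment would configure (e.g., social security or credit card number patterns), and the resulting textblocks would be searched against the index as before:

```python
import re

SSN_LIKE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")  # hypothetical user-specified format

def expression_textblocks(message, width=10):
    """For each expression of the predefined format, collect `width` contiguous
    tokens before and after the matching expression as the textblock to search."""
    tokens = message.split()
    blocks = []
    for i, tok in enumerate(tokens):
        if SSN_LIKE.search(tok):
            blocks.append(tokens[max(0, i - width): i + width + 1])
    return blocks

msg = "per our call the account holder John Smith with SSN 123-45-6789 should be flagged"
for block in expression_textblocks(msg, width=5):
    print(block)
```
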
  • the PMS is positioned on a corporate network so that secure communications can occur with an organization's database (in which the records reside that require protection.)
  • the MMS is positioned so that it can monitor and/or intercept all outbound email communications of the organization.
  • a collision list holds multiple such records of row#, col #, and type.
  • After the MMS receives the index, it parses the message and re-creates the hash table in memory in the same fashion as it was created in the PMS.
  • As the MMS picks up outbound email messages and parses them, it uses this index in the manner described below to detect if any of these emails contain data from the database. This is done by parsing each individual line of text from the email messages. This may involve decoding the surrounding file types and converting everything into raw text (e.g., stripping all formatting information from a Microsoft Word file and leaving only the text itself). This series of lines of text is then parsed into individual words by looking for separation marks like the "space" character or other forms of punctuation. These words are text tokens.
  • each collision list is itself a set of data elements that store possible row number, column number, and type triplets. If the union of all triplets from all collision lists is taken, and a set of triplets is found all with the same row number but with distinct column numbers, then with high probability this line of text from the email message contains a record from the database.
  • the term "tuple” used herein is not limited to the specific case of the triplets of row number, column number, and type and may refer to data structures that do not contain all of these three parameters. For example, in one embodiment, a tuple contains the row number but not the column number and the type of the database data.
  • Database query mechanisms are significantly different from the teachings described herein.
  • One difference is that B-trees actually contain fragments of the database tables that they index.
  • the reason this is important is that, as mentioned above, the MMS has to have a copy of the index in order to protect the data from escape; however, the MMS is also best deployed in a position in the network where it may be exposed to significant threats. Keeping the index that the MMS uses free of any components of the database data is a key requirement.
  • Such a message will contain plenty of records from the core database that requires protection, e.g., first name, last name, social-security number, etc., but could also contain information not in the core database tables.
  • a typical example is information that is "joined" from other databases.
  • Another example is simple line formatting tokens that separate fields of database data. Because of the possibility of this extra data that's typically found on each of these lines, the standard predicate logic connectives like AND and OR applied to each token on the line of an outgoing message produce either too many hits (as is the case with OR) or zero hits (as is the case with AND).
  • the system is able to detect the presence of n or more tokens that are all from the same row of a database table, even in the case where n is much smaller than the total number of tokens in the line. This is another significant difference between the present invention and the prior art mentioned above for database and document query mechanisms.
  • the indices for these systems contain (inside the concordances) the same terms that are stored in the database that is to be protected.
  • since the system deploys this index into a location on the network that is potentially under hacker threat, this is a definite disadvantage.
  • these query systems run Boolean queries using the forms of predicate logic like AND and OR. As mentioned above, this approach is at a distinct disadvantage for detecting database records that have been possibly "joined" with extraneous data from other tables.
  • the technique of file shingling is similar to, but substantially different from the technique described herein.
  • file shingling In file shingling, the subject of interest is text data (prose, software, outlines, etc.). In the techniques described here, the focus is on protecting database data. One difference is that database data from a given database table may appear with the row order or column order permuted arbitrarily in the test message. These permutations are the simple result of the query mechanisms typically applied to extract database data. A database query could result in a block of database data that comes in arbitrary column order, and arbitrary row order. For this reason, the basic technique of file shingling will not work if applied to database data. File shingling assumes that the same linear sequence is followed between the protected document and the test document.
  • Internet content filtering systems are based on keyword searches.
  • the novel techniques described above build an abstract data structure from the database data that they seek to protect. This abstract data structure does not contain fragments of the text it is trying to protect.
  • a keyword filtering system must contain some representation of the text that it is searching for in order to run its queries.
  • These Internet content filtering systems are not intended to protect database data. Using regular expression matching to detect violations of an organization's privacy policy on database data will also lead to a very inaccurate method of detection.
  • These systems are primarily applied to stop employee abuse of the Internet as it relates to pornographic or abusive content and language. Such systems, if applied to the protection of database data, would use regular expressions to match database records. This would also result in transferring fragments of the database data to the computer on the network where security risks are maximized.
  • FIG. 8 is a block diagram of an exemplary computer system that may perform one or more of the operations described herein.
  • Computer system 800 may comprise an exemplary client 850 or server 800 computer system.
  • Computer system 800 comprises a communication mechanism or bus 811 for communicating information, and a processor 812 coupled with bus 811 for processing information.
  • Processor 812 includes a microprocessor, but is not limited to a microprocessor, such as, for example, Pentium, PowerPC, Alpha, etc.
  • System 800 further comprises a random access memory (RAM), or other dynamic storage device 804 (referred to as main memory) coupled to bus 811 for storing information and instructions to be executed by processor 812.
  • Main memory 804 also may be used for storing temporary variables or other intermediate information during execution of instructions by processor 812.
  • Computer system 800 also comprises a read only memory (ROM) and/or other static storage device 806 coupled to bus 811 for storing static information and instructions for processor 812, and a data storage device 807, such as a magnetic disk or optical disk and its corresponding disk drive. Data storage device 807 is coupled to bus 811 for storing information and instructions.
  • Computer system 800 may further be coupled to a display device 821, such as a cathode ray tube (CRT) or liquid crystal display (LCD), coupled to bus 811 for displaying information to a computer user.
  • An alphanumeric input device 822, including alphanumeric and other keys, may also be coupled to bus 811 for communicating information and command selections to processor 812.
  • Another device that may be coupled to bus 811 is cursor control 823, such as a mouse, trackball, trackpad, stylus, or cursor direction keys, for communicating direction information and command selections to processor 812, and for controlling cursor movement on display 821.
  • A hard copy device 824 may also be coupled to bus 811 and may be used for printing instructions, data, or other information on a medium such as paper, film, or similar types of media.
  • A sound recording and playback device, such as a speaker and/or microphone, may optionally be coupled to bus 811 for audio interfacing with computer system 800.
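
To make the preceding comparison concrete, the first sketch below illustrates in Python the general idea described in the list above: an index built from the protected table stores only hashes of cell values, so no fragments of the protected text are deployed on the network, and a line of an outgoing message is flagged only when n or more of its tokens hash to cells of the same database row, regardless of token order or of interleaved "joined" and formatting data. This is an illustrative sketch only, not the patented implementation; the function names (build_index, find_violations), the SHA-256 hashing, the normalization, and the tokenization rule are all assumptions made for the example.

```python
import hashlib
import re


def _hash_cell(value: str) -> str:
    """Hash a normalized cell value; the index stores hashes, never plaintext."""
    return hashlib.sha256(value.strip().lower().encode("utf-8")).hexdigest()


def build_index(rows):
    """Map each hashed cell value to the set of row IDs in which it occurs."""
    index = {}
    for row_id, row in enumerate(rows):
        for cell in row:
            index.setdefault(_hash_cell(str(cell)), set()).add(row_id)
    return index


def find_violations(line: str, index: dict, n: int):
    """Return IDs of rows contributing n or more tokens to a single line,
    regardless of token order or of extra "joined" or formatting data."""
    counts = {}
    for token in re.split(r"[\s,;|]+", line):
        if not token:
            continue
        for row_id in index.get(_hash_cell(token), ()):
            counts[row_id] = counts.get(row_id, 0) + 1
    return [row_id for row_id, c in counts.items() if c >= n]


# Protected table (two rows) and an outgoing line mixing protected values
# with extraneous data, in arbitrary order.
rows = [("Alice", "Smith", "123-45-6789"),
        ("Bob", "Jones", "987-65-4321")]
index = build_index(rows)
line = "Smith | Alice | Gold-tier customer | 123-45-6789 | renewal 2004"
print(find_violations(line, index, n=3))  # -> [0]
```

Because matching here is per cell rather than per sequence, the same result is obtained if the columns or rows of the dumped data are reordered before transmission.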
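
For contrast, the second sketch (again purely illustrative; the shingle width w = 3 and the sample values are assumptions) shows why the linear-sequence assumption behind file shingling breaks down on database data: permuting the column order of a dumped row destroys essentially all shingle overlap with the original, while the set of cell values, and therefore the per-cell hash matching sketched above, is unaffected.

```python
def shingles(text: str, w: int = 3):
    """w-shingles: sets of w consecutive tokens; order-sensitive by construction."""
    tokens = text.split()
    return {tuple(tokens[i:i + w]) for i in range(len(tokens) - w + 1)}


original = "Alice Smith 123-45-6789 415-555-0100 94105"
permuted = "94105 123-45-6789 Smith Alice 415-555-0100"  # same cells, columns reordered

a, b = shingles(original), shingles(permuted)
print(len(a & b))                                      # 0: shingling sees no overlap
print(set(original.split()) == set(permuted.split()))  # True: the cell values still match
```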

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Strategic Management (AREA)
  • Marketing (AREA)
  • Data Mining & Analysis (AREA)
  • Economics (AREA)
  • Computer Hardware Design (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Storage Device Security (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
PCT/US2003/030178 2002-09-18 2003-09-17 Detection of preselected data WO2004027653A2 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
EP03752596A EP1540542A2 (de) 2002-09-18 2003-09-17 Erkennung von vorher ausgewählten daten
JP2004568963A JP4903386B2 (ja) 2002-09-18 2003-09-17 事前選択されたデータに関し探索可能な情報コンテンツ
AU2003270883A AU2003270883A1 (en) 2002-09-18 2003-09-17 Detection of preselected data
CA002499508A CA2499508A1 (en) 2002-09-18 2003-09-17 Detection of preselected data

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US10/247,002 2002-09-18
US10/247,002 US8661498B2 (en) 2002-09-18 2002-09-18 Secure and scalable detection of preselected data embedded in electronically transmitted messages
US10/431,145 US7673344B1 (en) 2002-09-18 2003-05-06 Mechanism to search information content for preselected data
US10/431,145 2003-05-07
US10/607,718 US8041719B2 (en) 2003-05-06 2003-06-27 Personal computing device-based mechanism to detect preselected data
US10/607,718 2003-06-27

Publications (2)

Publication Number Publication Date
WO2004027653A2 true WO2004027653A2 (en) 2004-04-01
WO2004027653A3 WO2004027653A3 (en) 2004-09-30

Family

ID=32034172

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2003/030178 WO2004027653A2 (en) 2002-09-18 2003-09-17 Detection of preselected data

Country Status (5)

Country Link
EP (1) EP1540542A2 (de)
JP (1) JP4903386B2 (de)
AU (1) AU2003270883A1 (de)
CA (1) CA2499508A1 (de)
WO (1) WO2004027653A2 (de)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2422455A (en) * 2005-01-24 2006-07-26 Hewlett Packard Development Co Securing the privacy of sensitive information in a data-handling system
US7673344B1 (en) 2002-09-18 2010-03-02 Symantec Corporation Mechanism to search information content for preselected data
US7886359B2 (en) 2002-09-18 2011-02-08 Symantec Corporation Method and apparatus to report policy violations in messages
US7996374B1 (en) 2008-03-28 2011-08-09 Symantec Corporation Method and apparatus for automatically correlating related incidents of policy violations
US7996385B2 (en) 2002-09-18 2011-08-09 Symantec Corporation Method and apparatus to define the scope of a search for information from a tabular data source
US7996373B1 (en) 2008-03-28 2011-08-09 Symantec Corporation Method and apparatus for detecting policy violations in a data repository having an arbitrary data schema
US8011003B2 (en) 2005-02-14 2011-08-30 Symantec Corporation Method and apparatus for handling messages containing pre-selected data
US8041719B2 (en) 2003-05-06 2011-10-18 Symantec Corporation Personal computing device-based mechanism to detect preselected data
US8613040B2 (en) 2008-12-22 2013-12-17 Symantec Corporation Adaptive data loss prevention policies
US8661498B2 (en) 2002-09-18 2014-02-25 Symantec Corporation Secure and scalable detection of preselected data embedded in electronically transmitted messages
KR20160005127A (ko) * 2013-03-13 2016-01-13 페이스북, 인크. 짧은 용어 해쉬

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8225371B2 (en) 2002-09-18 2012-07-17 Symantec Corporation Method and apparatus for creating an information security policy based on a pre-configured template
JP2008276580A (ja) * 2007-04-27 2008-11-13 Kddi Corp 電子システム、電子機器、ウィルスパターン管理装置、プログラム、および記録媒体
US8065739B1 (en) 2008-03-28 2011-11-22 Symantec Corporation Detecting policy violations in information content containing data in a character-based language
US8826443B1 (en) 2008-09-18 2014-09-02 Symantec Corporation Selective removal of protected content from web requests sent to an interactive website
US8935752B1 (en) 2009-03-23 2015-01-13 Symantec Corporation System and method for identity consolidation
US10025500B2 (en) 2011-10-28 2018-07-17 Blackberry Limited Systems and methods of using input events on electronic devices
CN107636671A (zh) * 2015-03-26 2018-01-26 诺基亚通信公司 优化通信中的数据检测
US11645368B2 (en) * 2016-12-30 2023-05-09 Google Llc Hash-based dynamic restriction of content on information resources

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5577249A (en) * 1992-07-31 1996-11-19 International Business Machines Corporation Method for finding a reference token sequence in an original token string within a database of token strings using appended non-contiguous substrings
US5796948A (en) * 1996-11-12 1998-08-18 Cohen; Elliot D. Offensive message interceptor for computers
US5835722A (en) * 1996-06-27 1998-11-10 Logon Data Corporation System to control content and prohibit certain interactive attempts by a person using a personal computer
US5884033A (en) * 1996-05-15 1999-03-16 Spyglass, Inc. Internet filtering system for filtering data transferred over the internet utilizing immediate and deferred filtering actions
GB2343030A (en) * 1998-09-04 2000-04-26 Int Computers Ltd Multiple string search using hash value pointer array
US6442607B1 (en) * 1998-08-06 2002-08-27 Intel Corporation Controlling data transmissions from a computer

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5701464A (en) * 1995-09-15 1997-12-23 Intel Corporation Parameterized bloom filters
JP2002189643A (ja) * 2000-08-31 2002-07-05 Lucent Technol Inc 通信トラヒックを走査するための方法および装置

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5577249A (en) * 1992-07-31 1996-11-19 International Business Machines Corporation Method for finding a reference token sequence in an original token string within a database of token strings using appended non-contiguous substrings
US5884033A (en) * 1996-05-15 1999-03-16 Spyglass, Inc. Internet filtering system for filtering data transferred over the internet utilizing immediate and deferred filtering actions
US5835722A (en) * 1996-06-27 1998-11-10 Logon Data Corporation System to control content and prohibit certain interactive attempts by a person using a personal computer
US5796948A (en) * 1996-11-12 1998-08-18 Cohen; Elliot D. Offensive message interceptor for computers
US6442607B1 (en) * 1998-08-06 2002-08-27 Intel Corporation Controlling data transmissions from a computer
GB2343030A (en) * 1998-09-04 2000-04-26 Int Computers Ltd Multiple string search using hash value pointer array

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8661498B2 (en) 2002-09-18 2014-02-25 Symantec Corporation Secure and scalable detection of preselected data embedded in electronically transmitted messages
US7673344B1 (en) 2002-09-18 2010-03-02 Symantec Corporation Mechanism to search information content for preselected data
US7886359B2 (en) 2002-09-18 2011-02-08 Symantec Corporation Method and apparatus to report policy violations in messages
US7996385B2 (en) 2002-09-18 2011-08-09 Symantec Corporation Method and apparatus to define the scope of a search for information from a tabular data source
US9515998B2 (en) 2002-09-18 2016-12-06 Symantec Corporation Secure and scalable detection of preselected data embedded in electronically transmitted messages
US8041719B2 (en) 2003-05-06 2011-10-18 Symantec Corporation Personal computing device-based mechanism to detect preselected data
GB2422455A (en) * 2005-01-24 2006-07-26 Hewlett Packard Development Co Securing the privacy of sensitive information in a data-handling system
US8046592B2 (en) 2005-01-24 2011-10-25 Hewlett-Packard Development Company, L.P. Method and apparatus for securing the privacy of sensitive information in a data-handling system
US8011003B2 (en) 2005-02-14 2011-08-30 Symantec Corporation Method and apparatus for handling messages containing pre-selected data
US7996374B1 (en) 2008-03-28 2011-08-09 Symantec Corporation Method and apparatus for automatically correlating related incidents of policy violations
US7996373B1 (en) 2008-03-28 2011-08-09 Symantec Corporation Method and apparatus for detecting policy violations in a data repository having an arbitrary data schema
US8613040B2 (en) 2008-12-22 2013-12-17 Symantec Corporation Adaptive data loss prevention policies
KR20160005127A (ko) * 2013-03-13 2016-01-13 페이스북, 인크. 짧은 용어 해쉬
KR101962715B1 (ko) 2013-03-13 2019-03-27 페이스북, 인크. 짧은 용어 해쉬
US10318652B2 (en) 2013-03-13 2019-06-11 Facebook, Inc. Short-term hashes

Also Published As

Publication number Publication date
AU2003270883A1 (en) 2004-04-08
JP4903386B2 (ja) 2012-03-28
JP2005539334A (ja) 2005-12-22
EP1540542A2 (de) 2005-06-15
AU2003270883A8 (en) 2004-04-08
WO2004027653A3 (en) 2004-09-30
CA2499508A1 (en) 2004-04-01

Similar Documents

Publication Publication Date Title
US8041719B2 (en) Personal computing device-based mechanism to detect preselected data
US8312553B2 (en) Mechanism to search information content for preselected data
US9515998B2 (en) Secure and scalable detection of preselected data embedded in electronically transmitted messages
US8595849B2 (en) Method and apparatus to report policy violations in messages
US8566305B2 (en) Method and apparatus to define the scope of a search for information from a tabular data source
US8813176B2 (en) Method and apparatus for creating an information security policy based on a pre-configured template
US8011003B2 (en) Method and apparatus for handling messages containing pre-selected data
EP1853976B1 (de) Verfahren und vorrichtung zur handhabung von nachrichten mit vorausgewählten daten
US20060184549A1 (en) Method and apparatus for modifying messages based on the presence of pre-selected data
US9313232B2 (en) System and method for data mining and security policy management
JP4903386B2 (ja) 事前選択されたデータに関し探索可能な情報コンテンツ
US7996373B1 (en) Method and apparatus for detecting policy violations in a data repository having an arbitrary data schema
EP1613020B1 (de) Verfahren und System zum Ermitteln, wenn eine abgehende Kommunikation bestimmte Inhalte enthält
US20090205051A1 (en) Systems and methods for securing data in electronic communications
US20130246338A1 (en) System and method for indexing a capture system
US8365247B1 (en) Identifying whether electronic data under test includes particular information from a database
JP2006155535A (ja) 個人情報探索プログラム,個人情報管理システムおよび個人情報管理機能付き情報処理装置
JP2007280412A (ja) 個人情報探索プログラム
JP2008090867A (ja) 個人情報探索プログラム

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2499508

Country of ref document: CA

Ref document number: 2004568963

Country of ref document: JP

WWE Wipo information: entry into national phase

Ref document number: 2003752596

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 2003752596

Country of ref document: EP

DPE2 Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101)