US20050132197A1 - Method and apparatus for a character-based comparison of documents - Google Patents


Info

Publication number
US20050132197A1
Authority
US
United States
Prior art keywords
document
tokens
hash values
signature
contained
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/845,648
Inventor
Art Medlar
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NortonLifeLock Inc
Original Assignee
Symantec Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Symantec Corp filed Critical Symantec Corp
Priority to US10/845,648
Assigned to SYMANTEC CORPORATION reassignment SYMANTEC CORPORATION MERGER (SEE DOCUMENT FOR DETAILS). Assignors: BRIGHTMAIL, INC.
Publication of US20050132197A1
Legal status: Abandoned

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00: User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/21: Monitoring or handling of messages
    • H04L51/212: Monitoring or handling of messages using filtering or selective blocking
    • H04L51/06: Message adaptation to terminal or network requirements
    • H04L51/063: Content adaptation, e.g. replacement of unsuitable content

Definitions

  • the present invention relates to data processing; more particularly, the present invention relates to a character-based comparison of documents.
  • the Internet is growing in popularity, and more and more people are conducting business over the Internet, advertising their products and services by generating and sending electronic mass mailings.
  • These electronic messages are usually unsolicited and regarded as nuisances by the recipients because they occupy much of the storage space needed for the necessary and important data processing.
  • a mail server may have to reject an important and/or desired email when its storage capacity is filled to the maximum with the unwanted emails containing advertisements.
  • thin client systems such as set top boxes, PDAs, network computers, and pagers all have limited storage capacity. Unwanted emails in any one of such systems can tie up a finite resource for the user.
  • a typical user wastes time by downloading voluminous but useless advertisement information. These unwanted emails are commonly referred to as spam.
  • a spam block method exists which keeps an index list of all spam agents (i.e., companies that generate mass unsolicited e-mails), and provides means to block any e-mail sent from a company on the list.
  • Another “junk mail” filter currently available employs filters which are based on predefined words and patterns as mentioned above. An incoming mail is designated as an unwanted mail if the subject contains a known spam pattern.
  • the method includes dividing a first document into tokens. Each token includes a predefined number of sequential characters from the first document. The method further includes calculating hash values for the tokens and creating, for the first document, a signature including a subset of hash values from the calculated hash values and additional information pertaining to the tokens of the first document. The signature of the first document is subsequently compared with a signature of a second document to determine resemblance between the first document and the second document.
  • FIG. 1 is a block diagram of one embodiment of a system for controlling delivery of spam electronic mail.
  • FIG. 2 is a block diagram of one embodiment of a spam content preparation module.
  • FIG. 3 is a block diagram of one embodiment of a similarity determination module.
  • FIG. 4 is a flow diagram of one embodiment of a process for handling a spam message.
  • FIG. 5 is a flow diagram of one embodiment of a process for filtering email spam based on similarity measures.
  • FIG. 6A is a flow diagram of one embodiment of a process for creating a signature of an email message.
  • FIG. 6B is a flow diagram of one embodiment of a process for detecting spam using a signature of an email message.
  • FIG. 7 is a flow diagram of one embodiment of a process for a character-based comparison of documents.
  • FIG. 8 is a flow diagram of one embodiment of a process for determining whether two documents are similar.
  • FIG. 9 is a flow diagram of one embodiment of a process for reducing noise in an email message.
  • FIG. 10 is a flow diagram of one embodiment of a process for modifying an email message to reduce noise.
  • FIG. 11 is a block diagram of an exemplary computer system.
  • the present invention also relates to apparatus for performing the operations herein.
  • This apparatus may be specially constructed for the required purposes, or it may comprise a general purpose computer selectively activated or reconfigured by a computer program stored in the computer.
  • a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, or any type of media suitable for storing electronic instructions, each coupled to a computer system bus.
  • a machine-readable medium includes any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer).
  • a machine-readable medium includes read only memory (“ROM”); random access memory (“RAM”); magnetic disk storage media; optical storage media; flash memory devices; electrical, optical, acoustical or other form of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.); etc.
  • FIG. 1 is a block diagram of one embodiment of a system for controlling delivery of spam electronic mail (email).
  • the system includes a control center 102 coupled to a communications network 100 such as a public network (e.g., the Internet, a wireless network, etc.) or a private network (e.g., LAN, Intranet, etc.).
  • the control center 102 communicates with multiple network servers 104 via the network 100 .
  • Each server 104 communicates with user terminals 106 using a private or public network.
  • the control center 102 is an anti-spam facility that is responsible for analyzing messages identified as spam, developing filtering rules for detecting spam, and distributing the filtering rules to the servers 104 .
  • a message may be identified as spam because it was sent by a known spam source (as determined, for example, using a “spam probe”, i.e., an email address specifically selected to make its way into as many spammer mailing lists as possible).
  • a server 104 may be a mail server that receives and stores messages addressed to users of corresponding user terminals. Alternatively, a server 104 may be a different server coupled to the mail server 104 . Servers 104 are responsible for filtering incoming messages based on the filtering rules received from the control center 102 .
  • control center 102 includes a spam content preparation module 108 that is responsible for generating data characterizing the content associated with a spam attack and sending this data to the servers 104 .
  • Each server 104 includes a similarity determination module 110 that is responsible for storing spam data received from the control center 102 and identifying incoming email messages resembling the spam content using the stored data.
  • each server 104 hosts both the spam content preparation module 108 that generates data characterizing the content associated with a spam attack and the similarity determination module 110 that uses the generated data to identify email messages resembling the spam content.
  • FIG. 2 is a block diagram of one embodiment of a spam content preparation module 200 .
  • the spam content preparation module 200 includes a spam content parser 202 , a spam data generator 206 , and a spam data transmitter 208 .
  • the spam content parser 202 is responsible for parsing the body of email messages resulting from spam attacks (referred to as spam messages).
  • the spam data generator 206 is responsible for generating data characterizing a spam message.
  • data characterizing a spam message includes a list of hash values calculated for sets of tokens (e.g., characters, words, lines, etc.) composing the spam message.
  • Data characterizing a spam message or any other email message is referred to herein as a message signature.
  • Signatures of spam messages or any other email messages may contain various data identifying the message content and may be created using various algorithms that enable the use of similarity measures in comparing signatures of different email messages.
  • the spam content preparation module 200 also includes a noise reduction algorithm 204 that is responsible for detecting data indicative of noise and removing the noise from spam messages prior to generating signatures of spam messages. Noise represents data invisible to a recipient that was added to a spam message to hide its spam nature.
  • the spam content preparation module 200 also includes a message grouping algorithm (not shown) that is responsible for grouping messages originated from a single spam attack. Grouping may be performed based on specified characteristics of spam messages (e.g., included URLs, message parts, etc.). If grouping is used, the spam data generator 206 may generate a signature for a group of spam messages rather than for each individual message.
  • the spam data transmitter 208 is responsible for distributing signatures of spam messages to participating servers such as servers 104 of FIG. 1 .
  • each server 104 periodically (e.g., every 5 minutes) initiates a connection (e.g., a secure HTTPS connection) with the control center 102 .
  • signatures are transmitted from the control center 102 to the relevant server 104 .
  • FIG. 3 is a block diagram of one embodiment of a similarity determination module 300 .
  • the similarity determination module 300 includes an incoming message parser 302 , a spam data receiver 306 , a message data generator 310 , a resemblance identifier 312 , and a spam database 304 .
  • the incoming message parser 302 is responsible for parsing the body of incoming email messages.
  • the spam data receiver 306 is responsible for receiving signatures of spam messages and storing them in the spam database 304 .
  • the message data generator 310 is responsible for generating signatures of incoming email messages.
  • a signature of an incoming email message includes a list of hash values calculated for sets of tokens (e.g., characters, words, lines, etc.) composing the incoming email message.
  • a signature of an incoming email message includes various other data characterizing the content of the email message (e.g., a subset of token sets composing the incoming email message).
  • signatures of email messages may be created using various algorithms that allow for use of similarity measures in comparing signatures of different email messages.
  • the similarity determination module 300 also includes an incoming message cleaning algorithm 308 that is responsible for detecting data indicative of noise and removing the noise from the incoming email messages prior to generating their signatures, as will be discussed in more detail below.
  • the resemblance identifier 312 is responsible for comparing the signature of each incoming email message with the signatures of spam messages stored in the spam database 304 and determining, based on this comparison, whether an incoming email message is similar to any spam message.
  • the spam database 304 stores signatures generated for spam messages before they undergo the noise reduction process (i.e., noisy spam messages) and signatures generated for these spam messages after they undergo the noise reduction process (i.e., spam messages with reduced noise).
  • the message data generator 310 first generates a signature of an incoming email message prior to noise reduction, and the resemblance identifier 312 compares this signature with the signatures of noisy spam messages. If this comparison indicates that the incoming email message is similar to one of these spam messages, then the resemblance identifier 312 marks this incoming email message as spam. Alternatively, the resemblance identifier 312 invokes the incoming message cleaning algorithm 308 to remove noise from the incoming email message. Then, the message data generator 310 generates a signature for the modified incoming message, which is then compared by the resemblance identifier 312 with the signatures of spam messages with reduced noise.
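The two-stage comparison described above can be outlined as follows. This is an illustrative sketch only: the signature, noise-reduction, and similarity functions are passed in as parameters because the patent does not fix their implementations, and it assumes the alternative branch is taken when the first comparison finds no match.

```python
def check_incoming(message, noisy_spam_sigs, clean_spam_sigs,
                   make_sig, reduce_noise, similar):
    """Compare against signatures of noisy spam first; only if no match is
    found, remove noise from the message and compare against signatures of
    spam messages with reduced noise."""
    sig = make_sig(message)
    if any(similar(sig, s) for s in noisy_spam_sigs):
        return "spam"
    cleaned_sig = make_sig(reduce_noise(message))
    if any(similar(cleaned_sig, s) for s in clean_spam_sigs):
        return "spam"
    return "legitimate"
```

With trivial stand-ins (lower-casing as the "signature" and exact match as the similarity test), `check_incoming("BUY NOW", ["buy now"], [], str.lower, str.strip, lambda a, b: a == b)` returns "spam" on the first stage, without ever invoking noise reduction.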
  • FIG. 4 is a flow diagram of one embodiment of a process 400 for handling a spam message.
  • the process may be performed by processing logic that may comprise hardware (e.g., dedicated logic, programmable logic, microcode, etc.), software (such as run on a general purpose computer system or a dedicated machine), or a combination of both.
  • processing logic resides at a control center 102 of FIG. 1 .
  • process 400 begins with processing logic receiving a spam message (processing block 402 ).
  • processing logic modifies the spam message to reduce noise.
  • One embodiment of a noise reduction algorithm will be discussed in more detail below in conjunction with FIGS. 9 and 10 .
  • processing logic generates a signature of the spam message.
  • a signature of the spam message includes a list of hash values calculated for sets of tokens (e.g., characters, words, lines, etc.) composing the spam message, as will be discussed in more detail below in conjunction with FIG. 6A .
  • a signature of an incoming email message includes various other data characterizing the content of the email message.
  • processing logic transfers the signature of the spam message to a server (e.g., a server 104 of FIG. 1 ), which uses the signature of the spam message to find incoming email messages resembling the spam message (block 410 ).
  • FIG. 5 is a flow diagram of one embodiment of a process 500 for filtering email spam based on similarity measures.
  • the process may be performed by processing logic that may comprise hardware (e.g., dedicated logic, programmable logic, microcode, etc.), software (such as run on a general purpose computer system or a dedicated machine), or a combination of both.
  • processing logic resides at a server 104 of FIG. 1 .
  • process 500 begins with processing logic receiving an incoming email message (processing block 502 ).
  • processing logic modifies the incoming message to reduce noise.
  • One embodiment of a noise reduction algorithm will be discussed in more detail below in conjunction with FIGS. 9 and 10 .
  • processing logic generates a signature of the incoming message based on the content of the incoming message.
  • a signature of an incoming email message includes a list of hash values calculated for sets of tokens (e.g., characters, words, lines, etc.) composing the incoming email message, as will be discussed in more detail below in conjunction with FIG. 6A .
  • a signature of an incoming email message includes various other data characterizing the content of the email message.
  • processing logic compares the signature of the incoming message with signatures of spam messages.
  • processing logic determines whether the resemblance between the signature of the incoming message and a signature of some spam message exceeds a threshold similarity measure.
  • One embodiment of a process for determining the resemblance between two messages is discussed in more detail below in conjunction with FIG. 6B .
  • processing logic marks the incoming email message as spam.
  • FIG. 6A is a flow diagram of one embodiment of a process 600 for creating a signature of an email message.
  • the process may be performed by processing logic that may comprise hardware (e.g., dedicated logic, programmable logic, microcode, etc.), software (such as run on a general purpose computer system or a dedicated machine), or a combination of both.
  • processing logic resides at a server 104 of FIG. 1 .
  • process 600 begins with processing logic dividing an email message into sets of tokens (processing block 602 ).
  • Each set of tokens may include a predefined number of sequential units from the email message. The predefined number may be equal to, or greater than, 1.
  • a unit may represent a character, a word or a line in the email message.
  • each set of tokens is combined with the number of occurrences of this set of tokens in the email message.
  • processing logic calculates hash values for the sets of tokens.
  • a hash value is calculated by applying a hash function to each combination of a set of tokens and a corresponding token occurrence number.
  • processing logic creates a signature for the email message using the calculated hash values.
  • the signature is created by selecting a subset of calculated hash values and adding a parameter characterizing the email message to the selected subset of calculated hash values.
  • the parameter may specify, for example, the size of the email message, the number of calculated hash values, the keyword associated with the email message, the name of an attachment file, etc.
  • a signature for an email message is created using a character-based document comparison mechanism that will be discussed in more detail below in conjunction with FIGS. 7 and 8 .
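The FIG. 6A steps above can be sketched as follows. This is a hedged illustration rather than the patent's prescribed implementation: it assumes words as the units, SHA-256 truncated to 64 bits as the hash function, and "keep the smallest `sketch_size` hash values" as the subset-selection rule, none of which the text specifies.

```python
import hashlib
from collections import Counter

def make_signature(message: str, ngram: int = 3, sketch_size: int = 8) -> dict:
    """Signature per FIG. 6A: a subset of hash values plus a parameter."""
    units = message.lower().split()  # a "unit" here is a word; characters or lines also work
    # Each set of tokens is `ngram` sequential units from the message.
    token_sets = [tuple(units[i:i + ngram]) for i in range(len(units) - ngram + 1)]
    seen = Counter()
    hashes = []
    for ts in token_sets:
        seen[ts] += 1  # occurrence number of this token set so far
        combo = f"{ts}|{seen[ts]}".encode()  # token set combined with its occurrence number
        hashes.append(int.from_bytes(hashlib.sha256(combo).digest()[:8], "big"))
    return {
        "sketch": sorted(hashes)[:sketch_size],  # selected subset of calculated hash values
        "num_tokens": len(token_sets),           # parameter characterizing the message
    }
```

Two identical messages always produce identical signatures, which is what makes comparing signatures instead of full texts meaningful.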
  • FIG. 6B is a flow diagram of one embodiment of a process 650 for detecting spam using a signature of an email message.
  • the process may be performed by processing logic that may comprise hardware (e.g., dedicated logic, programmable logic, microcode, etc.), software (such as run on a general purpose computer system or a dedicated machine), or a combination of both.
  • processing logic resides at a server 104 of FIG. 1 .
  • process 650 compares data in a signature of an incoming email message with data in a signature of each spam message.
  • the signature data includes a parameter characterizing the content of an email message and a subset of hash values generated for the tokens contained in the email message.
  • the parameter may specify, for example, the size of the email message, the number of tokens in the email message, the keyword associated with the email message, the name of an attachment file, etc.
  • Processing logic begins with comparing a parameter in a signature of the incoming email message with a corresponding parameter in a signature of each spam message (processing block 652 ).
  • At decision box 654 , processing logic determines whether any spam message signatures contain a parameter similar to the parameter of the incoming message signature.
  • the similarity may be determined, for example, based on the allowed difference between the two parameters or the allowed ratio of the two parameters.
  • processing logic decides that the incoming email message is legitimate (i.e., it is not spam) (processing block 662 ).
  • processing logic determines whether the signature of the first spam message has hash values similar to the hash values in the signature of the incoming email (decision box 656 ). Based on the similarity threshold, the hash values may be considered similar if, for example, a certain number of them match or the ratio of matched and unmatched hash values exceeds a specified threshold.
  • processing logic decides that the incoming email message is spam (processing block 670 ). Otherwise, processing logic further determines if there are more spam message signatures with the similar parameter (decision box 658 ). If so, processing logic determines whether the next spam message signature has hash values similar to the hash values of the incoming email signature (decision box 656 ). If so, processing logic decides that the incoming email message is spam (processing block 670 ). If not, processing logic returns to decision box 658 .
  • processing logic determines that no other spam message signatures have the similar parameter, then it decides that the incoming mail message is not spam (processing block 662 ).
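The FIG. 6B flow can be sketched as follows, assuming signatures shaped like `{"sketch": [...], "num_tokens": n}` and a ratio test for parameter similarity; both the signature layout and the concrete thresholds are illustrative choices, since the patent leaves them open.

```python
def is_spam(incoming_sig: dict, spam_sigs: list,
            min_param_ratio: float = 0.9, hash_threshold: float = 0.95) -> bool:
    """Cheap parameter comparison first (decision box 654); only signatures
    that pass it get the hash-value comparison (decision box 656)."""
    for spam_sig in spam_sigs:
        a, b = incoming_sig["num_tokens"], spam_sig["num_tokens"]
        if max(a, b) == 0 or min(a, b) / max(a, b) < min_param_ratio:
            continue  # parameter not similar: try the next spam signature
        common = set(incoming_sig["sketch"]) & set(spam_sig["sketch"])
        denom = max(len(incoming_sig["sketch"]), len(spam_sig["sketch"]))
        if denom and len(common) / denom >= hash_threshold:
            return True  # hash values similar: mark message as spam
    return False  # no spam signature matched: message is legitimate
```

Ordering the cheap parameter check before the hash comparison mirrors the flow diagram: most non-matching spam signatures are rejected without any set intersection at all.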
  • FIG. 7 is a flow diagram of one embodiment of a process 700 for a character-based comparison of documents.
  • the process may be performed by processing logic that may comprise hardware (e.g., dedicated logic, programmable logic, microcode, etc.), software (such as run on a general purpose computer system or a dedicated machine), or a combination of both.
  • process 700 begins with processing logic pre-processing a document (processing block 702 ).
  • the document is pre-processed by changing each upper case alphabetic character within the document to a lower case alphabetic character and, as the example suggests, collapsing whitespace and punctuation into a separator character. For example, the message “I am Sam, Sam I am.” may be pre-processed into the expression “i.am.sam.sam.i.am”.
  • processing logic divides the document into tokens, with each token including a predefined number of sequential characters from the document.
  • each token is combined with its occurrence number. This combination is referred to as a labeled shingle. For example, if the predefined number of sequential characters in the token is equal to 3, the expression specified above yields 15 shingles (“i.a”, “.am”, “am.”, “m.s”, “.sa”, “sam”, “am.”, “m.s”, “.sa”, “sam”, “am.”, “m.i”, “.i.”, “i.a”, “.am”), with each repeated shingle labeled by its occurrence number (e.g., “am.” appears as (“am.”, 1), (“am.”, 2), and (“am.”, 3)).
  • the shingles are represented as a histogram.
  • processing logic calculates hash values for the tokens.
  • the hash values are calculated for the labeled shingles, for example by applying a hashing function H(x) to each labeled shingle.
  • processing logic then sorts the hash values.
  • processing logic selects a subset of hash values from the calculated hash values.
  • processing logic creates a signature of the document by adding to the sketch a parameter pertaining to the tokens of the document.
  • the parameter specifies the number of original tokens in the document. In the example above, the number of original tokens is 15 .
  • the signature of the document can then be expressed as the selected subset of sorted hash values (the sketch) together with this parameter.
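The FIG. 7 character-based flow can be sketched end to end with the "I am Sam" example. The pre-processing rule (runs of non-alphanumerics collapse to "."), the "#n" occurrence-label format, and the smallest-k sketch selection are assumptions filled in for illustration; the patent's own example hash values appeared in figures not reproduced here.

```python
import hashlib
import re
from collections import Counter

def char_signature(document: str, shingle_len: int = 3, sketch_size: int = 8) -> dict:
    """FIG. 7 flow: pre-process, shingle, label, hash, sort, select a sketch."""
    # Pre-process: lower-case, then collapse runs of non-alphanumerics to ".",
    # so "I am Sam, Sam I am." becomes "i.am.sam.sam.i.am".
    text = re.sub(r"[^a-z0-9]+", ".", document.lower()).strip(".")
    shingles = [text[i:i + shingle_len] for i in range(len(text) - shingle_len + 1)]
    # Label each shingle with its occurrence number so repeats stay distinct,
    # e.g. "am.#1", "am.#2", "am.#3".
    seen = Counter()
    hashes = []
    for s in shingles:
        seen[s] += 1
        labeled = f"{s}#{seen[s]}".encode()
        hashes.append(int.from_bytes(hashlib.sha256(labeled).digest()[:8], "big"))
    sketch = sorted(hashes)[:sketch_size]  # subset of the sorted hash values
    return {"sketch": sketch, "num_tokens": len(shingles)}
```

For "I am Sam, Sam I am." this yields the 15 original tokens noted above, with the shingle "am." labeled three separate times.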
  • FIG. 8 is a flow diagram of one embodiment of a process 800 for determining whether two documents are similar.
  • the process may be performed by processing logic that may comprise hardware (e.g., dedicated logic, programmable logic, microcode, etc.), software (such as run on a general purpose computer system or a dedicated machine), or a combination of both.
  • process 800 begins with processing logic comparing the token numbers specified in the signatures of documents 1 and 2, and determining whether the token number in the first signature is within the allowed range with respect to the token number from the second signature (decision box 802 ).
  • the allowed range may be a difference of 1 or less or a ratio of 90 percent or higher.
  • processing logic decides that documents 1 and 2 are different (processing block 808 ). Otherwise, if the token number in the first signature is within the allowed range with respect to the token number from the second signature, processing logic determines whether the resemblance between hash values in signatures 1 and 2 exceeds a threshold (e.g., more than 95 percent of hash values are the same) (decision box 804 ). If so, processing logic decides that the two documents are similar (processing block 806 ). If not, processing logic decides that documents 1 and 2 are different (processing block 808 ).
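The FIG. 8 decision can be sketched as follows, using the example thresholds from the text (token counts within a 90 percent ratio, at least 95 percent of hash values shared); the resemblance formula itself is an assumption, since the patent does not define one.

```python
def documents_similar(sig1: dict, sig2: dict,
                      min_token_ratio: float = 0.9,
                      min_hash_resemblance: float = 0.95) -> bool:
    """Decision box 802: token numbers within the allowed range;
    decision box 804: resemblance between hash values exceeds a threshold."""
    a, b = sig1["num_tokens"], sig2["num_tokens"]
    if max(a, b) == 0 or min(a, b) / max(a, b) < min_token_ratio:
        return False  # token numbers differ too much: documents are different
    s1, s2 = set(sig1["sketch"]), set(sig2["sketch"])
    denom = max(len(s1), len(s2))
    if denom == 0:
        return False  # empty sketches: nothing to compare
    return len(s1 & s2) / denom >= min_hash_resemblance
```

The token-count pre-check serves the same purpose as the parameter check in FIG. 6B: documents of very different lengths are declared different before any hash values are compared.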
  • FIG. 9 is a flow diagram of one embodiment of a process 900 for reducing noise in an email message.
  • the process may be performed by processing logic that may comprise hardware (e.g., dedicated logic, programmable logic, microcode, etc.), software (such as run on a general purpose computer system or a dedicated machine), or a combination of both.
  • process 900 begins with processing logic detecting in an email message data indicative of noise (processing block 902 ).
  • noise represents data that is invisible to a recipient of the mail message and was added to the email message to avoid spam filtering.
  • data may include, for example, formatting data (e.g., HTML tags), numeric character references, character entity references, URL data of predefined categories, etc.
  • Numeric character references specify the code position of a character in the document character set.
  • Character entity references use symbolic names so that authors need not remember code positions. For example, the character entity reference &aring; refers to the lowercase “a” character topped with a ring.
  • processing logic modifies the content of the email message to reduce the noise.
  • the content modification includes removing formatting data, translating numeric character references and character entity references to their ASCII equivalents, and modifying URL data.
  • processing logic compares the modified content of the email message with the content of a spam message. In one embodiment, the comparison is performed to identify an exact match. Alternatively, the comparison is performed to determine whether the two documents are similar.
  • FIG. 10 is a flow diagram of one embodiment of a process 1000 for modifying an email message to reduce noise.
  • the process may be performed by processing logic that may comprise hardware (e.g., dedicated logic, programmable logic, microcode, etc.), software (such as run on a general purpose computer system or a dedicated machine), or a combination of both.
  • process 1000 begins with processing logic searching an email message for formatting data (e.g., HTML tags) (processing block 1002 ).
  • processing logic determines whether the found formatting data qualifies as an exception.
  • HTML formatting does not add anything to the information content of a message.
  • exceptions are the tags that contain useful information for further processing of the message (e.g., tags <BODY>, <A>, <IMG>, and <FONT>).
  • tags <BODY>, <A>, <IMG>, and <FONT> are needed for “white on white” text elimination, and the <A> and <IMG> tags typically contain link information that may be used for passing data to other components of the system.
  • the formatting data is extracted from the email message (processing block 1006 ).
  • processing logic converts each numerical character reference and character entity reference into a corresponding ASCII character (processing block 1008 ).
  • numeric character references may take two forms: a decimal form "&#D;" and a hexadecimal form "&#xH;". For example:
  • the string "&#38;" corresponds to the string "&" in ASCII
  • the string "&#35;" corresponds to the string "#" in ASCII
  • the string "&#51;" corresponds to "3" in ASCII
  • the string "&#56;" corresponds to "8" in ASCII
  • the string "&#59;" corresponds to the string ";" in ASCII.
  • processing logic checks whether the converted data still includes numeric character references or character entity references (decision box 1010 ). If the check is positive, processing logic repeats the conversion operation at processing block 1008 . Otherwise, processing logic proceeds to processing block 1012 .
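The convert-and-recheck loop above can be sketched for the decimal form; the hexadecimal form and character entity references are omitted for brevity, and the pass limit is a defensive assumption, not something the patent specifies.

```python
import re

_DEC_REF = re.compile(r"&#(\d+);")  # decimal numeric character reference

def decode_numeric_refs(text: str, max_passes: int = 10) -> str:
    """Repeatedly convert decimal character references to ASCII until none
    remain, so nested encodings are fully unwrapped."""
    for _ in range(max_passes):
        decoded = _DEC_REF.sub(lambda m: chr(int(m.group(1))), text)
        if decoded == text:  # no references were converted on this pass: done
            return decoded
        text = decoded
    return text
```

A doubly-encoded reference such as "&#38;#51;" unwraps in two passes: the first pass yields "&#51;", and the second yields "3".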
  • processing logic modifies URL data of predefined categories. These categories may include, for example, numerical character references contained in the URL that are converted by processing logic into corresponding ASCII characters.
  • URL “password” syntax may be used to add characters before an “@” in the URL hostname. These characters are ignored by the target web server but they add significant amounts of noise information to each URL.
  • Processing logic modifies the URL data by removing these additional characters. Finally, processing logic removes the “query” part of the URL, which follows the “?” character at the end of the URL.
  • Processing logic modifies the above URL data into http://www.foo.com/bar.html.
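The two URL modifications just described (dropping the ignored characters before "@" and the "query" part after "?") can be sketched as follows; the parsing is deliberately naive and the function name is illustrative.

```python
def clean_url(url: str) -> str:
    """Reduce URL noise: remove userinfo before '@' in the hostname and
    remove the query part following '?'."""
    scheme, sep, rest = url.partition("://")
    if not sep:                      # no scheme present
        scheme, rest = "", url
    rest = rest.split("?", 1)[0]     # drop the "query" part after "?"
    host, slash, path = rest.partition("/")
    host = host.rsplit("@", 1)[-1]   # drop the ignored characters before "@"
    return scheme + sep + host + slash + path
```

For instance, `clean_url("http://user:pass@www.foo.com/bar.html?track=123")` yields "http://www.foo.com/bar.html", matching the result given in the text.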
  • FIG. 11 is a block diagram of an exemplary computer system 1100 that may be used to perform one or more of the operations described herein.
  • the machine may comprise a network router, a network switch, a network bridge, a Personal Digital Assistant (PDA), a cellular telephone, a web appliance or any machine capable of executing a sequence of instructions that specify actions to be taken by that machine.
  • the computer system 1100 includes a processor 1102 , a main memory 1104 and a static memory 1106 , which communicate with each other via a bus 1108 .
  • the computer system 1100 may further include a video display unit 1110 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)).
  • the computer system 1100 also includes an alpha-numeric input device 1112 (e.g., a keyboard), a cursor control device 1114 (e.g., a mouse), a disk drive unit 1116 , a signal generation device 1120 (e.g., a speaker) and a network interface device 1122 .
  • the disk drive unit 1116 includes a computer-readable medium 1124 on which is stored a set of instructions (i.e., software) 1126 embodying any one, or all, of the methodologies described above.
  • the software 1126 is also shown to reside, completely or at least partially, within the main memory 1104 and/or within the processor 1102 .
  • the software 1126 may further be transmitted or received via the network interface device 1122 .
  • the term “computer-readable medium” shall be taken to include any medium that is capable of storing or encoding a sequence of instructions for execution by the computer and that cause the computer to perform any one of the methodologies of the present invention.
  • the term “computer-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical and magnetic disks, and carrier wave signals.

Abstract

A method and system for a character-based document comparison are described. In one embodiment, the method includes dividing a first document into tokens. Each token includes a predefined number of sequential characters from the first document. The method further includes calculating hash values for the tokens and creating, for the first document, a signature including a subset of hash values from the calculated hash values and additional information pertaining to the tokens of the first document. The signature of the first document is subsequently compared with a signature of a second document to determine resemblance between the first document and the second document.

Description

    RELATED APPLICATIONS
  • The present application claims priority to U.S. Provisional Application Ser. No. 60/471,242, filed May 15, 2003, which is incorporated herein in its entirety.
  • FIELD OF THE INVENTION
  • The present invention relates to data processing; more particularly, the present invention relates to a character-based comparison of documents.
  • BACKGROUND OF THE INVENTION
  • The Internet is growing in popularity, and more and more people are conducting business over the Internet, advertising their products and services by generating and sending electronic mass mailings. These electronic messages (emails) are usually unsolicited and regarded as nuisances by the recipients because they occupy storage space needed for necessary and important data. For example, a mail server may have to reject an important and/or desired email when its storage capacity is filled with unwanted emails containing advertisements. Moreover, thin client systems such as set-top boxes, PDAs, network computers, and pagers all have limited storage capacity. Unwanted emails in any one of these systems can tie up a finite resource for the user. In addition, a typical user wastes time downloading voluminous but useless advertisement information. These unwanted emails are commonly referred to as spam.
  • Presently, there are products that are capable of filtering out unwanted messages. For example, a spam block method exists which keeps an index list of all spam agents (i.e., companies that generate mass unsolicited e-mails), and provides means to block any e-mail sent from a company on the list.
  • Another “junk mail” filter currently available employs filters based on predefined words and patterns. An incoming mail is designated as unwanted if its subject contains a known spam pattern.
  • However, as spam filtering grows in sophistication, so do the techniques of spammers in avoiding the filters. Examples of tactics employed by the recent generation of spammers include randomization, origin concealment, and filter evasion using HTML.
  • SUMMARY OF THE INVENTION
  • A method and system for a character-based comparison of documents are described. According to one aspect, the method includes dividing a first document into tokens. Each token includes a predefined number of sequential characters from the first document. The method further includes calculating hash values for the tokens and creating, for the first document, a signature including a subset of hash values from the calculated hash values and additional information pertaining to the tokens of the first document. The signature of the first document is subsequently compared with a signature of a second document to determine resemblance between the first document and the second document.
  • Other features of the present invention will be apparent from the accompanying drawings and from the detailed description that follows.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The present invention will be understood more fully from the detailed description given below and from the accompanying drawings of various embodiments of the invention, which, however, should not be taken to limit the invention to the specific embodiments, but are for explanation and understanding only.
  • FIG. 1 is a block diagram of one embodiment of a system for controlling delivery of spam electronic mail.
  • FIG. 2 is a block diagram of one embodiment of a spam content preparation module.
  • FIG. 3 is a block diagram of one embodiment of a similarity determination module.
  • FIG. 4 is a flow diagram of one embodiment of a process for handling a spam message.
  • FIG. 5 is a flow diagram of one embodiment of a process for filtering email spam based on similarity measures.
  • FIG. 6A is a flow diagram of one embodiment of a process for creating a signature of an email message.
  • FIG. 6B is a flow diagram of one embodiment of a process for detecting spam using a signature of an email message.
  • FIG. 7 is a flow diagram of one embodiment of a process for a character-based comparison of documents.
  • FIG. 8 is a flow diagram of one embodiment of a process for determining whether two documents are similar.
  • FIG. 9 is a flow diagram of one embodiment of a process for reducing noise in an email message.
  • FIG. 10 is a flow diagram of one embodiment of a process for modifying an email message to reduce noise.
  • FIG. 11 is a block diagram of an exemplary computer system.
  • DETAILED DESCRIPTION OF THE PRESENT INVENTION
  • A method and apparatus for a character-based comparison of documents are described. In the following description, numerous details are set forth. It will be apparent, however, to one skilled in the art, that the present invention may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form, rather than in detail, in order to avoid obscuring the present invention.
  • Some portions of the detailed descriptions which follow are presented in terms of algorithms and symbolic representations of operations on data bits within a computer memory. These algorithmic descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of steps leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
  • It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussion, it is appreciated that throughout the description, discussions utilizing terms such as “processing” or “computing” or “calculating” or “determining” or “displaying” or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
  • The present invention also relates to apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, or it may comprise a general purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, or any type of media suitable for storing electronic instructions, and each coupled to a computer system bus.
  • The algorithms and displays presented herein are not inherently related to any particular computer or other apparatus. Various general purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the required method steps. The required structure for a variety of these systems will appear from the description below. In addition, the present invention is not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the invention as described herein.
  • A machine-readable medium includes any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer). For example, a machine-readable medium includes read only memory (“ROM”); random access memory (“RAM”); magnetic disk storage media; optical storage media; flash memory devices; electrical, optical, acoustical or other form of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.); etc.
  • Filtering Email Spam Based on Similarity Measures
  • FIG. 1 is a block diagram of one embodiment of a system for controlling delivery of spam electronic mail (email). The system includes a control center 102 coupled to a communications network 100 such as a public network (e.g., the Internet, a wireless network, etc.) or a private network (e.g., LAN, Intranet, etc.). The control center 102 communicates with multiple network servers 104 via the network 100. Each server 104 communicates with user terminals 106 using a private or public network.
  • The control center 102 is an anti-spam facility that is responsible for analyzing messages identified as spam, developing filtering rules for detecting spam, and distributing the filtering rules to the servers 104. A message may be identified as spam because it was sent by a known spam source (as determined, for example, using a “spam probe”, i.e., an email address specifically selected to make its way into as many spammer mailing lists as possible).
  • A server 104 may be a mail server that receives and stores messages addressed to users of corresponding user terminals. Alternatively, a server 104 may be a different server coupled to the mail server 104. Servers 104 are responsible for filtering incoming messages based on the filtering rules received from the control center 102.
  • In one embodiment, the control center 102 includes a spam content preparation module 108 that is responsible for generating data characterizing the content associated with a spam attack and sending this data to the servers 104. Each server 104 includes a similarity determination module 110 that is responsible for storing spam data received from the control center 102 and identifying incoming email messages resembling the spam content using the stored data.
  • In an alternative embodiment, each server 104 hosts both the spam content preparation module 108 that generates data characterizing the content associated with a spam attack and the similarity determination module 110 that uses the generated data to identify email messages resembling the spam content.
  • FIG. 2 is a block diagram of one embodiment of a spam content preparation module 200. The spam content preparation module 200 includes a spam content parser 202, a spam data generator 206, and a spam data transmitter 208.
  • The spam content parser 202 is responsible for parsing the body of email messages resulting from spam attacks (referred to as spam messages).
  • The spam data generator 206 is responsible for generating data characterizing a spam message. In one embodiment, data characterizing a spam message includes a list of hash values calculated for sets of tokens (e.g., characters, words, lines, etc.) composing the spam message. Data characterizing a spam message or any other email message is referred to herein as a message signature. Signatures of spam messages or any other email messages may contain various data identifying the message content and may be created using various algorithms that enable the use of similarity measures in comparing signatures of different email messages.
  • In one embodiment, the spam content preparation module 200 also includes a noise reduction algorithm 204 that is responsible for detecting data indicative of noise and removing the noise from spam messages prior to generating signatures of spam messages. Noise represents data invisible to a recipient that was added to a spam message to hide its spam nature.
  • In one embodiment, the spam content preparation module 200 also includes a message grouping algorithm (not shown) that is responsible for grouping messages originated from a single spam attack. Grouping may be performed based on specified characteristics of spam messages (e.g., included URLs, message parts, etc.). If grouping is used, the spam data generator 206 may generate a signature for a group of spam messages rather than for each individual message.
  • The spam data transmitter 208 is responsible for distributing signatures of spam messages to participating servers such as servers 104 of FIG. 1. In one embodiment, each server 104 periodically (e.g., every 5 minutes) initiates a connection (e.g., a secure HTTPS connection) with the control center 102. Using this pull-based connection, signatures are transmitted from the control center 102 to the relevant server 104.
  • FIG. 3 is a block diagram of one embodiment of a similarity determination module 300. The similarity determination module 300 includes an incoming message parser 302, a spam data receiver 306, a message data generator 310, a resemblance identifier 312, and a spam database 304.
  • The incoming message parser 302 is responsible for parsing the body of incoming email messages.
  • The spam data receiver 306 is responsible for receiving signatures of spam messages and storing them in the spam database 304.
  • The message data generator 310 is responsible for generating signatures of incoming email messages. In some embodiments, a signature of an incoming email message includes a list of hash values calculated for sets of tokens (e.g., characters, words, lines, etc.) composing the incoming email message. In other embodiments, a signature of an incoming email message includes various other data characterizing the content of the email message (e.g., a subset of token sets composing the incoming email message). As discussed above, signatures of email messages may be created using various algorithms that allow for use of similarity measures in comparing signatures of different email messages.
  • In one embodiment, the similarity determination module 300 also includes an incoming message cleaning algorithm 308 that is responsible for detecting data indicative of noise and removing the noise from the incoming email messages prior to generating their signatures, as will be discussed in more detail below.
  • The resemblance identifier 312 is responsible for comparing the signature of each incoming email message with the signatures of spam messages stored in the spam database 304 and determining, based on this comparison, whether an incoming email message is similar to any spam message.
  • In one embodiment, the spam database 304 stores signatures generated for spam messages before they undergo the noise reduction process (i.e., noisy spam messages) and signatures generated for these spam messages after they undergo the noise reduction process (i.e., spam message with reduced noise). In this embodiment, the message data generator 310 first generates a signature of an incoming email message prior to noise reduction, and the resemblance identifier 312 compares this signature with the signatures of noisy spam messages. If this comparison indicates that the incoming email message is similar to one of these spam messages, then the resemblance identifier 312 marks this incoming email message as spam. Alternatively, the resemblance identifier 312 invokes the incoming message cleaning algorithm 308 to remove noise from the incoming email message. Then, the message data generator 310 generates a signature for the modified incoming message, which is then compared by the resemblance identifier 312 with the signatures of spam messages with reduced noise.
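The two-stage comparison described in this embodiment can be sketched as follows. The helper callables (make_sig, reduce_noise, similar) are hypothetical stand-ins for the message data generator 310, the incoming message cleaning algorithm 308, and the resemblance identifier 312:

```python
def classify(message, spam_db, make_sig, reduce_noise, similar):
    """Sketch of the two-stage check: compare the noisy signature first,
    then fall back to a comparison of noise-reduced signatures."""
    # Stage 1: signature of the message as received, compared against
    # signatures of noisy spam messages.
    noisy_sig = make_sig(message)
    if any(similar(noisy_sig, s) for s in spam_db['noisy']):
        return 'spam'
    # Stage 2: remove noise, re-generate the signature, and compare it
    # against signatures of spam messages with reduced noise.
    clean_sig = make_sig(reduce_noise(message))
    if any(similar(clean_sig, s) for s in spam_db['clean']):
        return 'spam'
    return 'legitimate'
```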
  • FIG. 4 is a flow diagram of one embodiment of a process 400 for handling a spam message. The process may be performed by processing logic that may comprise hardware (e.g., dedicated logic, programmable logic, microcode, etc.), software (such as run on a general purpose computer system or a dedicated machine), or a combination of both. In one embodiment, processing logic resides at a control center 102 of FIG. 1.
  • Referring to FIG. 4, process 400 begins with processing logic receiving a spam message (processing block 402).
  • At processing block 404, processing logic modifies the spam message to reduce noise. One embodiment of a noise reduction algorithm will be discussed in more detail below in conjunction with FIGS. 9 and 10.
  • At processing block 406, processing logic generates a signature of the spam message. In one embodiment, a signature of the spam message includes a list of hash values calculated for sets of tokens (e.g., characters, words, lines, etc.) composing the spam message, as will be discussed in more detail below in conjunction with FIG. 6A. In other embodiments, a signature of the spam message includes various other data characterizing the content of the message.
  • At processing block 408, processing logic transfers the signature of the spam message to a server (e.g., a server 104 of FIG. 1), which uses the signature of the spam message to find incoming email messages resembling the spam message (block 410).
  • FIG. 5 is a flow diagram of one embodiment of a process 500 for filtering email spam based on similarity measures. The process may be performed by processing logic that may comprise hardware (e.g., dedicated logic, programmable logic, microcode, etc.), software (such as run on a general purpose computer system or a dedicated machine), or a combination of both. In one embodiment, processing logic resides at a server 104 of FIG. 1.
  • Referring to FIG. 5, process 500 begins with processing logic receiving an incoming email message (processing block 502).
  • At processing block 504, processing logic modifies the incoming message to reduce noise. One embodiment of a noise reduction algorithm will be discussed in more detail below in conjunction with FIGS. 9 and 10.
  • At processing block 506, processing logic generates a signature of the incoming message based on the content of the incoming message. In one embodiment, a signature of an incoming email message includes a list of hash values calculated for sets of tokens (e.g., characters, words, lines, etc.) composing the incoming email message, as will be discussed in more detail below in conjunction with FIG. 6A. In other embodiments, a signature of an incoming email message includes various other data characterizing the content of the email message.
  • At processing block 508, processing logic compares the signature of the incoming message with signatures of spam messages.
  • At processing block 510, processing logic determines that the resemblance between the signature of the incoming message and a signature of some spam message exceeds a threshold similarity measure. One embodiment of a process for determining the resemblance between two messages is discussed in more detail below in conjunction with FIG. 6B.
  • At processing block 512, processing logic marks the incoming email message as spam.
  • FIG. 6A is a flow diagram of one embodiment of a process 600 for creating a signature of an email message. The process may be performed by processing logic that may comprise hardware (e.g., dedicated logic, programmable logic, microcode, etc.), software (such as run on a general purpose computer system or a dedicated machine), or a combination of both. In one embodiment, processing logic resides at a server 104 of FIG. 1.
  • Referring to FIG. 6A, process 600 begins with processing logic dividing an email message into sets of tokens (processing block 602). Each set of tokens may include a predefined number of sequential units from the email message. The predefined number may be equal to, or greater than, 1. A unit may represent a character, a word or a line in the email message. In one embodiment, each set of tokens is combined with the number of occurrences of this set of tokens in the email message.
  • At processing block 604, processing logic calculates hash values for the sets of tokens. In one embodiment, a hash value is calculated by applying a hash function to each combination of a set of tokens and a corresponding token occurrence number.
  • At processing block 606, processing logic creates a signature for the email message using the calculated hash values. In one embodiment, the signature is created by selecting a subset of calculated hash values and adding a parameter characterizing the email message to the selected subset of calculated hash values. The parameter may specify, for example, the size of the email message, the number of calculated hash values, the keyword associated with the email message, the name of an attachment file, etc.
  • In one embodiment, a signature for an email message is created using a character-based document comparison mechanism that will be discussed in more detail below in conjunction with FIGS. 7 and 8.
  • FIG. 6B is a flow diagram of one embodiment of a process 650 for detecting spam using a signature of an email message. The process may be performed by processing logic that may comprise hardware (e.g., dedicated logic, programmable logic, microcode, etc.), software (such as run on a general purpose computer system or a dedicated machine), or a combination of both. In one embodiment, processing logic resides at a server 104 of FIG. 1.
  • Referring to FIG. 6B, process 650 compares data in a signature of an incoming email message with data in a signature of each spam message. The signature data includes a parameter characterizing the content of an email message and a subset of hash values generated for the tokens contained in the email message. The parameter may specify, for example, the size of the email message, the number of tokens in the email message, the keyword associated with the email message, the name of an attachment file, etc.
  • Processing logic begins with comparing a parameter in a signature of the incoming email message with a corresponding parameter in a signature of each spam message (processing block 652).
  • At decision box 654, processing logic determines whether any spam message signatures contain a parameter similar to the parameter of the incoming message signature. The similarity may be determined, for example, based on the allowed difference between the two parameters or the allowed ratio of the two parameters.
  • If none of the spam message signatures contain a parameter similar to the parameter of the incoming message signature, processing logic decides that the incoming email message is legitimate (i.e., it is not spam) (processing block 662).
  • Alternatively, if one or more spam message signatures have a similar parameter, processing logic determines whether the signature of the first spam message has hash values similar to the hash values in the signature of the incoming email (decision box 656). The hash values may be considered similar if, for example, a certain number of them match or the ratio of matched to unmatched hash values exceeds a specified threshold.
  • If the first spam message signature has hash values similar to the hash values of the incoming email signature, processing logic decides that the incoming email message is spam (processing block 670). Otherwise, processing logic further determines whether there are more spam message signatures with the similar parameter (decision box 658). If so, processing logic determines whether the next spam message signature has hash values similar to the hash values of the incoming email signature (decision box 656). If so, processing logic decides that the incoming email message is spam (processing block 670). If not, processing logic returns to decision box 658.
  • If processing logic determines that no other spam message signatures have the similar parameter, then it decides that the incoming mail message is not spam (processing block 662).
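The lookup loop of process 650 may be sketched as follows. The 90 percent parameter ratio and 95 percent hash-match threshold are example values in the spirit of the thresholds discussed above, not mandated by the embodiment:

```python
def is_spam(incoming_sig, spam_sigs, param_ratio=0.9, hash_threshold=0.95):
    """Sketch of process 650: parameter pre-check, then hash comparison.

    A signature is a tuple (param, hash_set), where param characterizes
    the message (e.g., its token count) and hash_set is a subset of the
    hash values computed over its tokens.
    """
    in_param, in_hashes = incoming_sig
    for spam_param, spam_hashes in spam_sigs:
        # Decision box 654: is the parameter within the allowed ratio?
        lo, hi = sorted((in_param, spam_param))
        if hi == 0 or lo / hi < param_ratio:
            continue  # parameters too different; try the next signature
        # Decision box 656: ratio of matched hash values.
        matched = len(in_hashes & spam_hashes)
        if matched / max(len(in_hashes), 1) >= hash_threshold:
            return True  # resemblance exceeds the threshold: spam
    return False  # no spam signature matched: legitimate
```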
  • Character-Based Document Comparison Mechanism
  • FIG. 7 is a flow diagram of one embodiment of a process 700 for a character-based comparison of documents. The process may be performed by processing logic that may comprise hardware (e.g., dedicated logic, programmable logic, microcode, etc.), software (such as run on a general purpose computer system or a dedicated machine), or a combination of both.
  • Referring to FIG. 7, process 700 begins with processing logic pre-processing a document (processing block 702). In one embodiment, the document is pre-processed by changing each upper case alphabetic character within the document to a lower case alphabetic character. For example, the message “I am Sam, Sam I am.” may be pre-processed into an expression “i.am.sam.sam.i.am”.
  • At processing block 704, processing logic divides the document into tokens, with each token including a predefined number of sequential characters from the document. In one embodiment, each token is combined with its occurrence number. This combination is referred to as a labeled shingle. For example, if the predefined number of sequential characters in the token is equal to 3, the expression specified above includes the following set of labeled shingles:
      • i.a1
      • .am1
      • am.1
      • m.s1
      • .sa1
      • sam1
      • am.2
      • m.s2
      • .sa2
      • sam2
      • am.3
      • m.i1
      • .i.1
      • i.a2
      • .am2
  • In one embodiment, the shingles are represented as a histogram.
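The tokenization step can be illustrated with a short sketch. The pre-processing (lower-casing and collapsing separators to ".") follows the "i.am.sam.sam.i.am" example above; the function name is illustrative:

```python
import re
from collections import defaultdict

def labeled_shingles(text, width=3):
    """Divide pre-processed text into overlapping character shingles,
    labeling each with its running occurrence number."""
    # Pre-processing as in the example: lower case, with runs of
    # non-alphanumeric characters collapsed to a single "." separator.
    cleaned = re.sub(r'[^a-z0-9]+', '.', text.lower()).strip('.')
    seen = defaultdict(int)
    shingles = []
    for i in range(len(cleaned) - width + 1):
        s = cleaned[i:i + width]
        seen[s] += 1                   # occurrence number of this shingle
        shingles.append((s, seen[s]))  # the labeled shingle
    return shingles
```

Applied to "I am Sam, Sam I am.", this yields the 15 labeled shingles listed above, from ("i.a", 1) through (".am", 2).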
  • At processing block 706, processing logic calculates hash values for the tokens. In one embodiment, the hash values are calculated for the labeled shingles. For example, if a hashing function H(x) is applied to each labeled shingle illustrated above, the following results are produced:
      • H(i.a1)->458348732
      • H(.am1)->200404023
      • H(am.1)->692939349
      • H(m.s1)->220443033
      • H(.sa1)->554034022
      • H(sam1)->542929292
      • H(am.2)->629292229
      • H(m.s2)->702202232
      • H(.sa2)->322243349
      • H(sam2)->993923828
      • H(am.3)->163393269
      • H(m.i1)->595437753
      • H(.i.1)->843438583
      • H(i.a2)->244485639
      • H(.am2)->493869359
  • In one embodiment, processing logic then sorts the hash values as follows:
      • 163393269
      • 200604023
      • 220643033
      • 246685639
      • 322263369
      • 458368732
      • 493869359
      • 542929292
      • 554034022
      • 595637753
      • 629292229
      • 692939349
      • 702202232
      • 843438583
      • 933923828
  • At processing block 708, processing logic selects a subset of hash values from the calculated hash values. In one embodiment, processing logic selects X smallest values from the sorted hash values and creates from them a “sketch” of the document. For example, for X=4, the sketch can be expressed as follows:
      • [163393269 200404023 220443033 244485639]
  • At processing block 710, processing logic creates a signature of the document by adding to the sketch a parameter pertaining to the tokens of the document. In one embodiment, the parameter specifies the number of original tokens in the document. In the example above, the number of original tokens is 15. Hence, the signature of the document can be expressed as follows:
      • [15 163393269 200404023 220443033 244485639].
        Alternatively, the parameter may specify any other characteristic of the content of the document (e.g., the size of the document, the keyword associated with the document, etc.).
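Blocks 706 through 710 can be sketched end to end as follows. MD5 truncated to 32 bits stands in for the unspecified hashing function H(x), and the input is a list of labeled shingles, i.e., (token, occurrence-number) pairs:

```python
import hashlib

def make_signature(shingles, sketch_size=4):
    """Sketch of blocks 706-710: hash each labeled shingle, sort the
    hash values, keep the X smallest as the sketch, and prepend the
    number of original tokens as the signature parameter."""
    hashes = []
    for shingle, count in shingles:
        digest = hashlib.md5(f"{shingle}{count}".encode()).digest()
        hashes.append(int.from_bytes(digest[:4], "big"))  # 32-bit value
    sketch = sorted(hashes)[:sketch_size]  # the X smallest hash values
    return [len(shingles)] + sketch        # [token count, sketch...]
```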
  • FIG. 8 is a flow diagram of one embodiment of a process 800 for determining whether two documents are similar. The process may be performed by processing logic that may comprise hardware (e.g., dedicated logic, programmable logic, microcode, etc.), software (such as run on a general purpose computer system or a dedicated machine), or a combination of both.
  • Referring to FIG. 8, process 800 begins with processing logic comparing the token numbers specified in the signatures of documents 1 and 2, and determining whether the token number in the first signature is within the allowed range with respect to the token number from the second signature (decision box 802). For example, the allowed range may be a difference of 1 or less or a ratio of 90 percent or higher.
  • If the token number in the first signature is outside of the allowed range with respect to the token number from the second signature, processing logic decides that documents 1 and 2 are different (processing block 808). Otherwise, if the token number in the first signature is within the allowed range with respect to the token number from the second signature, processing logic determines whether the resemblance between hash values in signatures 1 and 2 exceeds a threshold (e.g., more than 95 percent of hash values are the same) (decision box 804). If so, processing logic decides that the two documents are similar (processing block 806). If not, processing logic decides that documents 1 and 2 are different (processing block 808).
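Process 800 reduces to two checks on the signatures, sketched below. The 90 percent ratio and 95 percent resemblance threshold echo the example values in the text, and the Jaccard-style resemblance measure is one illustrative choice:

```python
def documents_similar(sig1, sig2, min_param_ratio=0.9, min_hash_match=0.95):
    """Sketch of process 800: token-count range check, then hash resemblance.

    Each signature is a list [token_count, hash1, hash2, ...].
    """
    n1, n2 = sig1[0], sig2[0]
    # Decision box 802: is the token number within the allowed range?
    if min(n1, n2) / max(n1, n2, 1) < min_param_ratio:
        return False  # token counts too different: documents differ
    # Decision box 804: does the hash resemblance exceed the threshold?
    h1, h2 = set(sig1[1:]), set(sig2[1:])
    resemblance = len(h1 & h2) / max(len(h1 | h2), 1)
    return resemblance >= min_hash_match
```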
  • Email Spam Filtering Using Noise Reduction
  • FIG. 9 is a flow diagram of one embodiment of a process 900 for reducing noise in an email message. The process may be performed by processing logic that may comprise hardware (e.g., dedicated logic, programmable logic, microcode, etc.), software (such as run on a general purpose computer system or a dedicated machine), or a combination of both.
  • Referring to FIG. 9, process 900 begins with processing logic detecting in an email message data indicative of noise (processing block 902). As discussed above, noise represents data that is invisible to a recipient of the mail message and was added to the email message to avoid spam filtering. Such data may include, for example, formatting data (e.g., HTML tags), numeric character references, character entity references, URL data of predefined categories, etc. Numeric character references specify the code position of a character in the document character set. Character entity references use symbolic names so that authors need not remember code positions. For example, the character entity reference &aring refers to the lowercase “a” character topped with a ring.
  • At processing block 904, processing logic modifies the content of the email message to reduce the noise. In one embodiment, the content modification includes removing formatting data, translating numeric character references and character entity references to their ASCII equivalents, and modifying URL data.
  • At processing block 906, processing logic compares the modified content of the email message with the content of a spam message. In one embodiment, the comparison is performed to identify an exact match. Alternatively, the comparison is performed to determine whether the two documents are similar.
  • FIG. 10 is a flow diagram of one embodiment of a process 1000 for modifying an email message to reduce noise. The process may be performed by processing logic that may comprise hardware (e.g., dedicated logic, programmable logic, microcode, etc.), software (such as is run on a general purpose computer system or a dedicated machine), or a combination of both.
  • Referring to FIG. 10, process 1000 begins with processing logic searching an email message for formatting data (e.g., HTML tags) (processing block 1002).
  • At decision box 1004, processing logic determines whether the found formatting data qualifies as an exception. Typically, HTML formatting does not add anything to the information content of a message. However, a few exceptions exist. These exceptions are the tags that contain useful information for further processing of the message (e.g., tags <BODY>, <A>, <IMG>, and <FONT>). For example, the <FONT> and <BODY> tags are needed for “white on white” text elimination, and the <A> and <IMG> tags typically contain link information that may be used for passing data to other components of the system.
  • If the formatting data does not qualify as an exception, the formatting data is extracted from the email message (processing block 1006).
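The tag handling of blocks 1002 through 1006 can be sketched as follows. The exception list contains the four tags named in the text; the regex, function name, and matching details are hypothetical simplifications (a real implementation would use an HTML parser):

```python
import re

# Tags that carry useful information and are kept as exceptions
# (per the text: <BODY>, <A>, <IMG>, and <FONT>); all other tags
# are treated as noise-only formatting and removed.
KEPT_TAGS = {"body", "a", "img", "font"}

# Matches an opening or closing HTML tag and captures its name.
TAG_RE = re.compile(r"</?\s*([a-zA-Z][a-zA-Z0-9]*)[^>]*>")

def strip_formatting(html: str) -> str:
    """Remove HTML tags except the exception list (illustrative sketch)."""
    def repl(match):
        name = match.group(1).lower()
        # Keep the tag verbatim if it is an exception, else drop it.
        return match.group(0) if name in KEPT_TAGS else ""
    return TAG_RE.sub(repl, html)
```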
  • Next, processing logic converts each numerical character reference and character entity reference into a corresponding ASCII character (processing block 1008).
  • In HTML, numeric character references may take two forms:
      • 1. The syntax “&#D;”, where D is a decimal number, refers to the ISO 10646 decimal character number D; and
      • 2. The syntax “&#xH;” or “&#XH;”, where H is a hexadecimal number, refers to the ISO 10646 hexadecimal character number H. Hexadecimal numbers in numeric character references are case-insensitive.
        For example, randomized characters in the body may appear as the following expression:
        Th&#101&#32&#83a&#118&#105n&#103&#115R&#101&#103 is &#116e&#114&#119&#97&#110&#116&#115&#32yo&#117.
        This expression decodes to the phrase "The SavingsRegister wants you."
  • Sometimes the conversion performed at processing block 1008 may need to be repeated. For example, the string "&#38;" corresponds to the string "&" in ASCII, the string "&#35;" corresponds to the string "#" in ASCII, the string "&#51;" corresponds to "3" in ASCII, the string "&#56;" corresponds to "8" in ASCII, and "&#59;" corresponds to the string ";" in ASCII. Hence, the combined string "&#38;&#35;&#51;&#56;&#59;", when converted, results in the string "&#38;", which itself must be converted.
  • Accordingly, after the first conversion operation at processing block 1008, processing logic checks whether the converted data still includes numeric character references or character entity references (decision box 1010). If the check is positive, processing logic repeats the conversion operation at processing block 1008. Otherwise, processing logic proceeds to processing block 1012.
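The decode-and-repeat logic of blocks 1008 and 1010 can be sketched as follows, assuming semicolon-terminated numeric character references of the two HTML forms described above (decimal "&#D;" and hexadecimal "&#xH;"); the function and regex names are illustrative:

```python
import re

# Matches the two HTML forms: "&#D;" (decimal) and "&#xH;"/"&#XH;" (hex).
NUMREF_RE = re.compile(r"&#(?:[0-9]+|[xX][0-9a-fA-F]+);")

def decode_ref(match):
    ref = match.group(0)[2:-1]  # strip the leading "&#" and trailing ";"
    code = int(ref[1:], 16) if ref[0] in "xX" else int(ref)
    return chr(code)

def reduce_numeric_references(text: str) -> str:
    """Repeatedly decode numeric character references until none remain.

    The loop handles the case from the text where one round of decoding
    produces new references, e.g. "&#38;&#35;&#51;&#56;&#59;" -> "&#38;" -> "&".
    """
    while NUMREF_RE.search(text):
        text = NUMREF_RE.sub(decode_ref, text)
    return text
```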
  • At processing block 1012, processing logic modifies URL data of predefined categories. For example, numerical character references contained in the URL are converted by processing logic into corresponding ASCII characters. In addition, the URL "password" syntax may be used to add characters before an "@" in the URL hostname. These characters are ignored by the target web server, but they add significant amounts of noise to each URL; processing logic modifies the URL data by removing these additional characters. Finally, processing logic removes the "query" part of the URL, i.e., the portion following the "?" character at the end of the URL.
  • An example of a URL is as follows:
  • http%3a%2f%2flotsofjunk@www.foo.com%2fbar.html?muchmorejunk
  • Processing logic modifies the above URL data into http://www.foo.com/bar.html.
  • An Exemplary Computer System
  • FIG. 11 is a block diagram of an exemplary computer system 1100 that may be used to perform one or more of the operations described herein. In alternative embodiments, the machine may comprise a network router, a network switch, a network bridge, a Personal Digital Assistant (PDA), a cellular telephone, a web appliance or any machine capable of executing a sequence of instructions that specify actions to be taken by that machine.
  • The computer system 1100 includes a processor 1102, a main memory 1104 and a static memory 1106, which communicate with each other via a bus 1108. The computer system 1100 may further include a video display unit 1110 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)). The computer system 1100 also includes an alpha-numeric input device 1112 (e.g., a keyboard), a cursor control device 1114 (e.g., a mouse), a disk drive unit 1116, a signal generation device 1120 (e.g., a speaker) and a network interface device 1122.
  • The disk drive unit 1116 includes a computer-readable medium 1124 on which is stored a set of instructions (i.e., software) 1126 embodying any one, or all, of the methodologies described above. The software 1126 is also shown to reside, completely or at least partially, within the main memory 1104 and/or within the processor 1102. The software 1126 may further be transmitted or received via the network interface device 1122. For the purposes of this specification, the term “computer-readable medium” shall be taken to include any medium that is capable of storing or encoding a sequence of instructions for execution by the computer and that causes the computer to perform any one of the methodologies of the present invention. The term “computer-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical and magnetic disks, and carrier wave signals.
  • Whereas many alterations and modifications of the present invention will no doubt become apparent to a person of ordinary skill in the art after having read the foregoing description, it is to be understood that any particular embodiment shown and described by way of illustration is in no way intended to be considered limiting. Therefore, references to details of various embodiments are not intended to limit the scope of the claims which in themselves recite only those features regarded as essential to the invention.

Claims (27)

1. A method comprising:
dividing a first document into a plurality of tokens, each token including a predefined number of sequential characters from the first document;
calculating a plurality of hash values for the plurality of tokens; and
creating, for the first document, a signature including a subset of hash values from the plurality of hash values and additional information pertaining to the plurality of tokens of the first document, the signature of the first document being subsequently compared with a signature of a second document to determine resemblance between the first document and the second document.
2. The method of claim 1 further comprising:
prior to dividing the first document into the plurality of tokens, changing each upper case alphabetic character within the first document to a lower case alphabetic character, and changing each non-alphabetic character within the first document to a single predefined non-alphabetic character.
3. The method of claim 1 wherein the first document is a first email message and the second document is a second email message.
4. The method of claim 1 wherein calculating the plurality of hash values for the plurality of tokens comprises:
creating a shingle for each of the plurality of tokens by combining said each of the plurality of tokens with a number of occurrences of said each of the plurality of tokens within the first document; and
applying a hashing function to each created shingle.
5. The method of claim 4 wherein shingles created for the plurality of tokens are represented as a histogram.
6. The method of claim 1 wherein the predefined number of sequential characters in each token is equal to three.
7. The method of claim 1 wherein the additional information pertaining to the plurality of tokens comprises a number of the plurality of tokens contained in the first document.
8. The method of claim 1 wherein creating the signature for the first document comprises:
sorting the plurality of hash values; and
selecting a predefined number of smallest hash values from the sorted plurality of hash values.
9. The method of claim 7 further comprising:
determining whether the number of the plurality of tokens contained in the first document is within an allowed range with respect to a number of a plurality of tokens contained in the second document; and
if the number of the plurality of tokens contained in the first document is not within the defined range from the number of the plurality of tokens contained in the second document, deciding that the first document does not resemble the second document.
10. The method of claim 9 further comprising:
determining that the number of the plurality of tokens contained in the first document is within the defined range from the number of the plurality of tokens contained in the second document;
determining whether the subset of hash values contained in the signature of the first document is similar to a subset of hash values contained in the signature of the second document; and
if the subset of hash values contained in the signature of the first document is similar to the subset of hash values contained in the signature of the second document, deciding that the first document resembles the second document.
11. The method of claim 10 wherein the second email message is a spam email message.
12. The method of claim 11 further comprising:
marking the first email message as spam upon deciding that the first document resembles the second document.
13. A system comprising:
a parser to divide a first document into a plurality of tokens, each token including a predefined number of sequential characters from the first document; and
a message data generator to calculate a plurality of hash values for the plurality of tokens, and to create, for the first document, a signature including a subset of hash values from the plurality of hash values and additional information pertaining to the plurality of tokens of the first document, the signature of the first document being subsequently compared with a signature of a second document to determine resemblance between the first document and the second document.
14. The system of claim 13 wherein the message data generator is further to change each upper case alphabetic character within the first document to a lower case alphabetic character, and to change each non-alphabetic character within the first document to a single predefined non-alphabetic character.
15. The system of claim 13 wherein the first document is a first email message and the second document is a second email message.
16. The system of claim 13 wherein the message data generator is to calculate the plurality of hash values for the plurality of tokens by creating a shingle for each of the plurality of tokens by combining said each of the plurality of tokens with a number of occurrences of said each of the plurality of tokens within the first document, and applying a hashing function to each created shingle.
17. The system of claim 13 wherein the predefined number of sequential characters in each token is equal to three.
18. The system of claim 13 wherein the additional information pertaining to the plurality of tokens comprises a number of the plurality of tokens contained in the first document.
19. The system of claim 13 wherein the message data generator is to create the signature for the first document by sorting the plurality of hash values, and selecting a predefined number of smallest hash values from the sorted plurality of hash values.
20. The system of claim 18 further comprising a resemblance identifier to determine whether the number of the plurality of tokens contained in the first document is within an allowed range with respect to a number of a plurality of tokens contained in the second document, and, if the number of the plurality of tokens contained in the first document is not within the defined range from the number of the plurality of tokens contained in the second document, to decide that the first document does not resemble the second document.
21. The system of claim 20 wherein the resemblance identifier is further to determine that the number of the plurality of tokens contained in the first document is within the defined range from the number of the plurality of tokens contained in the second document, to determine whether the subset of hash values contained in the signature of the first document is similar to a subset of hash values contained in the signature of the second document, and, if the subset of hash values contained in the signature of the first document is similar to the subset of hash values contained in the signature of the second document, to decide that the first document resembles the second document.
22. An apparatus comprising:
means for dividing a first document into a plurality of tokens, each token including a predefined number of sequential characters from the first document;
means for calculating a plurality of hash values for the plurality of tokens; and
means for creating, for the first document, a signature including a subset of hash values from the plurality of hash values and additional information pertaining to the plurality of tokens of the first document, the signature of the first document being subsequently compared with a signature of a second document to determine resemblance between the first document and the second document.
23. The apparatus of claim 22 wherein the predefined number of sequential characters in each token is equal to three.
24. The apparatus of claim 22 wherein the additional information pertaining to the plurality of tokens comprises a number of the plurality of tokens contained in the first document.
25. A computer readable medium comprising executable instructions which when executed on a processing system cause said processing system to perform a method comprising:
dividing a first document into a plurality of tokens, each token including a predefined number of sequential characters from the first document;
calculating a plurality of hash values for the plurality of tokens; and
creating, for the first document, a signature including a subset of hash values from the plurality of hash values and additional information pertaining to the plurality of tokens of the first document, the signature of the first document being subsequently compared with a signature of a second document to determine resemblance between the first document and the second document.
26. The computer readable medium of claim 25 wherein the predefined number of sequential characters in each token is equal to three.
27. The computer readable medium of claim 25 wherein the additional information pertaining to the plurality of tokens comprises a number of the plurality of tokens contained in the first document.
US10/845,648 2003-05-15 2004-05-13 Method and apparatus for a character-based comparison of documents Abandoned US20050132197A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10/845,648 US20050132197A1 (en) 2003-05-15 2004-05-13 Method and apparatus for a character-based comparison of documents

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US47124203P 2003-05-15 2003-05-15
US10/845,648 US20050132197A1 (en) 2003-05-15 2004-05-13 Method and apparatus for a character-based comparison of documents

Publications (1)

Publication Number Publication Date
US20050132197A1 true US20050132197A1 (en) 2005-06-16

Family

ID=33476818

Family Applications (4)

Application Number Title Priority Date Filing Date
US10/845,648 Abandoned US20050132197A1 (en) 2003-05-15 2004-05-13 Method and apparatus for a character-based comparison of documents
US10/845,819 Expired - Fee Related US7831667B2 (en) 2003-05-15 2004-05-13 Method and apparatus for filtering email spam using email noise reduction
US10/846,723 Abandoned US20050108340A1 (en) 2003-05-15 2004-05-13 Method and apparatus for filtering email spam based on similarity measures
US12/941,939 Expired - Fee Related US8402102B2 (en) 2003-05-15 2010-11-08 Method and apparatus for filtering email spam using email noise reduction

Family Applications After (3)

Application Number Title Priority Date Filing Date
US10/845,819 Expired - Fee Related US7831667B2 (en) 2003-05-15 2004-05-13 Method and apparatus for filtering email spam using email noise reduction
US10/846,723 Abandoned US20050108340A1 (en) 2003-05-15 2004-05-13 Method and apparatus for filtering email spam based on similarity measures
US12/941,939 Expired - Fee Related US8402102B2 (en) 2003-05-15 2010-11-08 Method and apparatus for filtering email spam using email noise reduction

Country Status (5)

Country Link
US (4) US20050132197A1 (en)
EP (1) EP1649645A2 (en)
JP (1) JP4598774B2 (en)
TW (1) TWI348851B (en)
WO (1) WO2004105332A2 (en)

TWI569608B (en) * 2015-10-08 2017-02-01 網擎資訊軟體股份有限公司 A computer program product and e-mail transmission method thereof for e-mail transmission in monitored network environment
CN106372202B (en) * 2016-08-31 2020-04-17 北京奇艺世纪科技有限公司 Text similarity calculation method and device
US10657182B2 (en) 2016-09-20 2020-05-19 International Business Machines Corporation Similar email spam detection
JP6533823B2 (en) * 2017-05-08 2019-06-19 デジタルア−ツ株式会社 INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING SYSTEM, PROGRAM, RECORDING MEDIUM, AND INFORMATION PROCESSING METHOD
US11249965B2 (en) * 2018-05-24 2022-02-15 Paypal, Inc. Efficient random string processing
CN112154422A (en) * 2018-06-01 2020-12-29 三菱电机株式会社 Suspicious mail detection device, suspicious mail detection method, and suspicious mail detection program

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5926812A (en) * 1996-06-20 1999-07-20 Mantra Technologies, Inc. Document extraction and comparison method with applications to automatic personalized database searching
US6199103B1 (en) * 1997-06-24 2001-03-06 Omron Corporation Electronic mail determination method and system and storage medium
US6460050B1 (en) * 1999-12-22 2002-10-01 Mark Raymond Pace Distributed content identification system
US20030195937A1 (en) * 2002-04-16 2003-10-16 Kontact Software Inc. Intelligent message screening
US6804667B1 (en) * 1999-11-30 2004-10-12 Ncr Corporation Filter for checking for duplicate entries in database
US7080123B2 (en) * 2001-09-20 2006-07-18 Sun Microsystems, Inc. System and method for preventing unnecessary message duplication in electronic mail

Family Cites Families (83)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0240649A (en) 1988-07-30 1990-02-09 Konica Corp Silver halide color photographic sensitive material
CA1321656C (en) 1988-12-22 1993-08-24 Chander Kasiraj Method for restricting delivery and receipt of electronic message
GB8918553D0 (en) * 1989-08-15 1989-09-27 Digital Equipment Int Message control system
JPH03117940A (en) 1989-09-25 1991-05-20 Internatl Business Mach Corp <Ibm> Method of managing electronic mail
US5822527A (en) * 1990-05-04 1998-10-13 Digital Equipment Corporation Method and apparatus for information stream filtration using tagged information access and action registration
GB2271002B (en) 1992-09-26 1995-12-06 Digital Equipment Int Data processing system
US5634005A (en) * 1992-11-09 1997-05-27 Kabushiki Kaisha Toshiba System for automatically sending mail message by storing rule according to the language specification of the message including processing condition and processing content
TW237588B (en) * 1993-06-07 1995-01-01 Microsoft Corp
JP2837815B2 (en) * 1994-02-03 1998-12-16 インターナショナル・ビジネス・マシーンズ・コーポレイション Interactive rule-based computer system
US5675507A (en) * 1995-04-28 1997-10-07 Bobo, Ii; Charles R. Message storage and delivery system
US5758257A (en) 1994-11-29 1998-05-26 Herz; Frederick System and method for scheduling broadcast of and access to video programs and other data using customer profiles
US5619648A (en) 1994-11-30 1997-04-08 Lucent Technologies Inc. Message filtering techniques
GB2316588B (en) * 1995-05-08 2000-05-31 Compuserve Inc Rules based electronic message management system
US5696898A (en) * 1995-06-06 1997-12-09 Lucent Technologies Inc. System and method for database access control
US5678041A (en) * 1995-06-06 1997-10-14 At&T System and method for restricting user access rights on the internet based on rating information stored in a relational database
US5845263A (en) * 1995-06-16 1998-12-01 High Technology Solutions, Inc. Interactive visual ordering system
US5826269A (en) * 1995-06-21 1998-10-20 Microsoft Corporation Electronic mail interface for a network server
US5889943A (en) * 1995-09-26 1999-03-30 Trend Micro Incorporated Apparatus and method for electronic mail virus detection and elimination
US5862325A (en) 1996-02-29 1999-01-19 Intermind Corporation Computer-based communication system and method using metadata defining a control structure
US5870548A (en) * 1996-04-05 1999-02-09 Sun Microsystems, Inc. Method and apparatus for altering sent electronic mail messages
US5826022A (en) * 1996-04-05 1998-10-20 Sun Microsystems, Inc. Method and apparatus for receiving electronic mail
US5809242A (en) * 1996-04-19 1998-09-15 Juno Online Services, L.P. Electronic mail system for displaying advertisement at local computer received from remote system while the local computer is off-line the remote system
US5884033A (en) * 1996-05-15 1999-03-16 Spyglass, Inc. Internet filtering system for filtering data transferred over the internet utilizing immediate and deferred filtering actions
US5864684A (en) * 1996-05-22 1999-01-26 Sun Microsystems, Inc. Method and apparatus for managing subscriptions to distribution lists
WO1997046962A1 (en) * 1996-06-07 1997-12-11 At & T Corp. Finding an e-mail message to which another e-mail message is a response
US6453327B1 (en) * 1996-06-10 2002-09-17 Sun Microsystems, Inc. Method and apparatus for identifying and discarding junk electronic mail
US5909677A (en) * 1996-06-18 1999-06-01 Digital Equipment Corporation Method for determining the resemblance of documents
US5790789A (en) * 1996-08-02 1998-08-04 Suarez; Larry Method and architecture for the creation, control and deployment of services within a distributed computer environment
US5978837A (en) * 1996-09-27 1999-11-02 At&T Corp. Intelligent pager for remotely managing E-Mail messages
US5930479A (en) * 1996-10-21 1999-07-27 At&T Corp Communications addressing system
US5796948A (en) * 1996-11-12 1998-08-18 Cohen; Elliot D. Offensive message interceptor for computers
JPH10240649A (en) 1996-12-27 1998-09-11 Canon Inc Device and system for processing electronic mail
US6146026A (en) * 1996-12-27 2000-11-14 Canon Kabushiki Kaisha System and apparatus for selectively publishing electronic-mail
US5995597A (en) * 1997-01-21 1999-11-30 Woltz; Robert Thomas E-mail processing system and method
CA2282502A1 (en) 1997-02-25 1998-08-27 Intervoice Limited Partnership E-mail server for message filtering and routing
US6189026B1 (en) * 1997-06-16 2001-02-13 Digital Equipment Corporation Technique for dynamically generating an address book in a distributed electronic mail system
US6023700A (en) * 1997-06-17 2000-02-08 Cranberry Properties, Llc Electronic mail distribution system for integrated electronic communication
JP3148152B2 (en) * 1997-06-27 2001-03-19 日本電気株式会社 Delivery method of broadcast mail using electronic mail system
US7117358B2 (en) * 1997-07-24 2006-10-03 Tumbleweed Communications Corp. Method and system for filtering communication
US6073165A (en) * 1997-07-29 2000-06-06 Jfax Communications, Inc. Filtering computer network messages directed to a user's e-mail box based on user defined filters, and forwarding a filtered message to the user's receiver
US5999967A (en) * 1997-08-17 1999-12-07 Sundsted; Todd Electronic mail filtering by electronic stamp
US6199102B1 (en) * 1997-08-26 2001-03-06 Christopher Alan Cobb Method and system for filtering electronic messages
JP3439330B2 (en) * 1997-09-25 2003-08-25 日本電気株式会社 Email server
US6195686B1 (en) * 1997-09-29 2001-02-27 Ericsson Inc. Messaging application having a plurality of interfacing capabilities
US6393465B2 (en) * 1997-11-25 2002-05-21 Nixmail Corporation Junk electronic mail detector and eliminator
US6381592B1 (en) * 1997-12-03 2002-04-30 Stephen Michael Reuning Candidate chaser
WO1999032985A1 (en) * 1997-12-22 1999-07-01 Accepted Marketing, Inc. E-mail filter and method thereof
US6023723A (en) * 1997-12-22 2000-02-08 Accepted Marketing, Inc. Method and system for filtering unwanted junk e-mail utilizing a plurality of filtering mechanisms
US6052709A (en) * 1997-12-23 2000-04-18 Bright Light Technologies, Inc. Apparatus and method for controlling delivery of unsolicited electronic mail
US5999932A (en) * 1998-01-13 1999-12-07 Bright Light Technologies, Inc. System and method for filtering unsolicited electronic mail messages using data matching and heuristic processing
US5968117A (en) * 1998-01-20 1999-10-19 Aurora Communications Exchange Ltd. Device and system to facilitate accessing electronic mail from remote user-interface devices
US6157630A (en) * 1998-01-26 2000-12-05 Motorola, Inc. Communications system with radio device and server
US6119124A (en) * 1998-03-26 2000-09-12 Digital Equipment Corporation Method for clustering closely resembling data objects
US6161130A (en) * 1998-06-23 2000-12-12 Microsoft Corporation Technique which utilizes a probabilistic classifier to detect "junk" e-mail by automatically updating a training and re-training the classifier based on the updated training set
US6192360B1 (en) * 1998-06-23 2001-02-20 Microsoft Corporation Methods and apparatus for classifying text and for building a text classifier
US6314454B1 (en) * 1998-07-01 2001-11-06 Sony Corporation Method and apparatus for certified electronic mail messages
US6226630B1 (en) * 1998-07-22 2001-05-01 Compaq Computer Corporation Method and apparatus for filtering incoming information using a search engine and stored queries defining user folders
US6275850B1 (en) * 1998-07-24 2001-08-14 Siemens Information And Communication Networks, Inc. Method and system for management of message attachments
US6112227A (en) * 1998-08-06 2000-08-29 Heiner; Jeffrey Nelson Filter-in method for reducing junk e-mail
US6654787B1 (en) * 1998-12-31 2003-11-25 Brightmail, Incorporated Method and apparatus for filtering e-mail
US6732149B1 (en) * 1999-04-09 2004-05-04 International Business Machines Corporation System and method for hindering undesired transmission or receipt of electronic messages
US20040073617A1 (en) * 2000-06-19 2004-04-15 Milliken Walter Clark Hash-based systems and methods for detecting and preventing transmission of unwanted e-mail
US6931433B1 (en) * 2000-08-24 2005-08-16 Yahoo! Inc. Processing of unsolicited bulk electronic communication
US6965919B1 (en) * 2000-08-24 2005-11-15 Yahoo! Inc. Processing of unsolicited bulk electronic mail
US8219620B2 (en) * 2001-02-20 2012-07-10 Mcafee, Inc. Unwanted e-mail filtering system including voting feedback
US7275089B1 (en) * 2001-03-15 2007-09-25 Aws Convergence Technologies, Inc. System and method for streaming of dynamic weather content to the desktop
US20040044791A1 (en) * 2001-05-22 2004-03-04 Pouzzner Daniel G. Internationalized domain name system with iterative conversion
US7076527B2 (en) * 2001-06-14 2006-07-11 Apple Computer, Inc. Method and apparatus for filtering email
US20040204988A1 (en) * 2001-11-16 2004-10-14 Willers Howard Francis Interactively communicating selectively targeted information with consumers over the internet
US8046832B2 (en) * 2002-06-26 2011-10-25 Microsoft Corporation Spam detector with challenges
US20040083270A1 (en) * 2002-10-23 2004-04-29 David Heckerman Method and system for identifying junk e-mail
US6732157B1 (en) * 2002-12-13 2004-05-04 Networks Associates Technology, Inc. Comprehensive anti-spam system, method, and computer program product for filtering unwanted e-mail messages
US8266215B2 (en) * 2003-02-20 2012-09-11 Sonicwall, Inc. Using distinguishing properties to classify messages
US7219148B2 (en) * 2003-03-03 2007-05-15 Microsoft Corporation Feedback loop for spam prevention
US20060168006A1 (en) * 2003-03-24 2006-07-27 Mr. Marvin Shannon System and method for the classification of electronic communication
US7320020B2 (en) * 2003-04-17 2008-01-15 The Go Daddy Group, Inc. Mail server probability spam filter
US7483947B2 (en) * 2003-05-02 2009-01-27 Microsoft Corporation Message rendering for identification of content features
US20050132197A1 (en) 2003-05-15 2005-06-16 Art Medlar Method and apparatus for a character-based comparison of documents
US8145710B2 (en) * 2003-06-18 2012-03-27 Symantec Corporation System and method for filtering spam messages utilizing URL filtering module
US7941490B1 (en) 2004-05-11 2011-05-10 Symantec Corporation Method and apparatus for detecting spam in email messages and email attachments
JP2006293573A (en) * 2005-04-08 2006-10-26 Yaskawa Information Systems Co Ltd Electronic mail processor, electronic mail filtering method and electronic mail filtering program
US7739337B1 (en) 2005-06-20 2010-06-15 Symantec Corporation Method and apparatus for grouping spam email messages
US8010609B2 (en) * 2005-06-20 2011-08-30 Symantec Corporation Method and apparatus for maintaining reputation lists of IP addresses to detect email spam

Cited By (124)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110055343A1 (en) * 2003-05-15 2011-03-03 Symantec Corporation Method and apparatus for filtering email spam using email noise reduction
US8402102B2 (en) 2003-05-15 2013-03-19 Symantec Corporation Method and apparatus for filtering email spam using email noise reduction
US8984289B2 (en) 2003-09-08 2015-03-17 Sonicwall, Inc. Classifying a message based on fraud indicators
US8191148B2 (en) * 2003-09-08 2012-05-29 Sonicwall, Inc. Classifying a message based on fraud indicators
US20100095378A1 (en) * 2003-09-08 2010-04-15 Jonathan Oliver Classifying a Message Based on Fraud Indicators
US8661545B2 (en) 2003-09-08 2014-02-25 Sonicwall, Inc. Classifying a message based on fraud indicators
US8548170B2 (en) 2003-12-10 2013-10-01 Mcafee, Inc. Document de-registration
US8301635B2 (en) 2003-12-10 2012-10-30 Mcafee, Inc. Tag data structure for maintaining relational data over captured objects
US7814327B2 (en) 2003-12-10 2010-10-12 Mcafee, Inc. Document registration
US7984175B2 (en) 2003-12-10 2011-07-19 Mcafee, Inc. Method and apparatus for data capture and analysis system
US7899828B2 (en) 2003-12-10 2011-03-01 Mcafee, Inc. Tag data structure for maintaining relational data over captured objects
US8166307B2 (en) 2003-12-10 2012-04-24 Mcafee, Inc. Document registration
US7774604B2 (en) 2003-12-10 2010-08-10 Mcafee, Inc. Verifying captured objects before presentation
US9374225B2 (en) 2003-12-10 2016-06-21 Mcafee, Inc. Document de-registration
US9092471B2 (en) 2003-12-10 2015-07-28 Mcafee, Inc. Rule parser
US20050132046A1 (en) * 2003-12-10 2005-06-16 De La Iglesia Erik Method and apparatus for data capture and analysis system
US8762386B2 (en) 2003-12-10 2014-06-24 Mcafee, Inc. Method and apparatus for data capture and analysis system
US20050132079A1 (en) * 2003-12-10 2005-06-16 Iglesia Erik D.L. Tag data structure for maintaining relational data over captured objects
US20050177725A1 (en) * 2003-12-10 2005-08-11 Rick Lowe Verifying captured objects before presentation
US20050127171A1 (en) * 2003-12-10 2005-06-16 Ahuja Ratinder Paul S. Document registration
US8656039B2 (en) 2003-12-10 2014-02-18 Mcafee, Inc. Rule parser
US8271794B2 (en) 2003-12-10 2012-09-18 Mcafee, Inc. Verifying captured objects before presentation
US20050131876A1 (en) * 2003-12-10 2005-06-16 Ahuja Ratinder Paul S. Graphical user interface for capture system
US20050166066A1 (en) * 2004-01-22 2005-07-28 Ratinder Paul Singh Ahuja Cryptographic policy enforcement
US7930540B2 (en) 2004-01-22 2011-04-19 Mcafee, Inc. Cryptographic policy enforcement
US8307206B2 (en) 2004-01-22 2012-11-06 Mcafee, Inc. Cryptographic policy enforcement
US20050204005A1 (en) * 2004-03-12 2005-09-15 Purcell Sean E. Selective treatment of messages based on junk rating
US20050273614A1 (en) * 2004-06-07 2005-12-08 Ahuja Ratinder P S Generating signatures over a document
US7434058B2 (en) * 2004-06-07 2008-10-07 Reconnex Corporation Generating signatures over a document
US7962591B2 (en) 2004-06-23 2011-06-14 Mcafee, Inc. Object classification in a capture system
US7660865B2 (en) * 2004-08-12 2010-02-09 Microsoft Corporation Spam filtering with probabilistic secure hashes
US20060036693A1 (en) * 2004-08-12 2006-02-16 Microsoft Corporation Spam filtering with probabilistic secure hashes
US8560534B2 (en) 2004-08-23 2013-10-15 Mcafee, Inc. Database for a capture system
US8707008B2 (en) 2004-08-24 2014-04-22 Mcafee, Inc. File system for a capture system
US7949849B2 (en) 2004-08-24 2011-05-24 Mcafee, Inc. File system for a capture system
US20060047675A1 (en) * 2004-08-24 2006-03-02 Rick Lowe File system for a capture system
US8396897B2 (en) * 2004-11-22 2013-03-12 International Business Machines Corporation Method, system, and computer program product for threading documents using body text analysis
US20060112120A1 (en) * 2004-11-22 2006-05-25 International Business Machines Corporation Method, system, and computer program product for threading documents using body text analysis
US7596700B2 (en) * 2004-12-22 2009-09-29 Storage Technology Corporation Method and system for establishing trusting environment for sharing data between mutually mistrusting entities
US20060136730A1 (en) * 2004-12-22 2006-06-22 Charles Milligan Method and system for establishing trusting environment for sharing data between mutually mistrusting entities
US20090193018A1 (en) * 2005-05-09 2009-07-30 Liwei Ren Matching Engine With Signature Generation
US8171002B2 (en) * 2005-05-09 2012-05-01 Trend Micro Incorporated Matching engine with signature generation
US8001193B2 (en) * 2005-05-17 2011-08-16 Ntt Docomo, Inc. Data communications system and data communications method for detecting unsolicited communications
US20060262867A1 (en) * 2005-05-17 2006-11-23 Ntt Docomo, Inc. Data communications system and data communications method
US7739337B1 (en) * 2005-06-20 2010-06-15 Symantec Corporation Method and apparatus for grouping spam email messages
US20070036156A1 (en) * 2005-08-12 2007-02-15 Weimin Liu High speed packet capture
US8730955B2 (en) 2005-08-12 2014-05-20 Mcafee, Inc. High speed packet capture
US7907608B2 (en) 2005-08-12 2011-03-15 Mcafee, Inc. High speed packet capture
US7818326B2 (en) 2005-08-31 2010-10-19 Mcafee, Inc. System and method for word indexing in a capture system and querying thereof
US20070050334A1 (en) * 2005-08-31 2007-03-01 William Deninger Word indexing in a capture system
US8554774B2 (en) 2005-08-31 2013-10-08 Mcafee, Inc. System and method for word indexing in a capture system and querying thereof
US8176049B2 (en) 2005-10-19 2012-05-08 Mcafee Inc. Attributes of captured objects in a capture system
US8463800B2 (en) 2005-10-19 2013-06-11 Mcafee, Inc. Attributes of captured objects in a capture system
US7730011B1 (en) 2005-10-19 2010-06-01 Mcafee, Inc. Attributes of captured objects in a capture system
US20100185622A1 (en) * 2005-10-19 2010-07-22 Mcafee, Inc. Attributes of Captured Objects in a Capture System
US8200026B2 (en) 2005-11-21 2012-06-12 Mcafee, Inc. Identifying image type in a capture system
US20070116366A1 (en) * 2005-11-21 2007-05-24 William Deninger Identifying image type in a capture system
US7657104B2 (en) 2005-11-21 2010-02-02 Mcafee, Inc. Identifying image type in a capture system
EP1997281A4 (en) * 2006-03-09 2010-08-25 Watchguard Technologies, Inc. Method and system for recognizing desired email
EP1997281A1 (en) * 2006-03-09 2008-12-03 Borderware Technologies Inc. Method and system for recognizing desired email
US8572190B2 (en) 2006-03-09 2013-10-29 Watchguard Technologies, Inc. Method and system for recognizing desired email
EP1837784A1 (en) * 2006-03-23 2007-09-26 Canon Kabushiki Kaisha Document management apparatus, document management system, control method of the apparatus and system, program, and storage medium
US20070223050A1 (en) * 2006-03-23 2007-09-27 Canon Kabushiki Kaisha Document management apparatus, document management system, control method of the apparatus and system, program, and storage medium
US8504537B2 (en) 2006-03-24 2013-08-06 Mcafee, Inc. Signature distribution in a document registration system
US8010689B2 (en) 2006-05-22 2011-08-30 Mcafee, Inc. Locational tagging in a capture system
US8307007B2 (en) 2006-05-22 2012-11-06 Mcafee, Inc. Query generation for a capture system
US8683035B2 (en) 2006-05-22 2014-03-25 Mcafee, Inc. Attributes of captured objects in a capture system
US7689614B2 (en) 2006-05-22 2010-03-30 Mcafee, Inc. Query generation for a capture system
US9094338B2 (en) 2006-05-22 2015-07-28 Mcafee, Inc. Attributes of captured objects in a capture system
US8005863B2 (en) 2006-05-22 2011-08-23 Mcafee, Inc. Query generation for a capture system
US7958227B2 (en) 2006-05-22 2011-06-07 Mcafee, Inc. Attributes of captured objects in a capture system
US20080028468A1 (en) * 2006-07-28 2008-01-31 Sungwon Yi Method and apparatus for automatically generating signatures in network security systems
US20090089266A1 (en) * 2007-09-27 2009-04-02 Microsoft Corporation Method of finding candidate sub-queries from longer queries
US7765204B2 (en) * 2007-09-27 2010-07-27 Microsoft Corporation Method of finding candidate sub-queries from longer queries
US20090089383A1 (en) * 2007-09-30 2009-04-02 Tsuen Wan Ngan System and method for detecting content similarity within emails documents employing selective truncation
US20090089384A1 (en) * 2007-09-30 2009-04-02 Tsuen Wan Ngan System and method for detecting content similarity within email documents by sparse subset hashing
US8037145B2 (en) 2007-09-30 2011-10-11 Symantec Operating Corporation System and method for detecting email content containment
US20090089539A1 (en) * 2007-09-30 2009-04-02 Guy Barry Owen Bunker System and method for detecting email content containment
US8275842B2 (en) * 2007-09-30 2012-09-25 Symantec Operating Corporation System and method for detecting content similarity within email documents by sparse subset hashing
US8601537B2 (en) 2008-07-10 2013-12-03 Mcafee, Inc. System and method for data mining and security policy management
US8635706B2 (en) 2008-07-10 2014-01-21 Mcafee, Inc. System and method for data mining and security policy management
US8205242B2 (en) 2008-07-10 2012-06-19 Mcafee, Inc. System and method for data mining and security policy management
US10367786B2 (en) 2008-08-12 2019-07-30 Mcafee, Llc Configuration management for a capture/registration system
US9253154B2 (en) 2008-08-12 2016-02-02 Mcafee, Inc. Configuration management for a capture/registration system
US8850591B2 (en) 2009-01-13 2014-09-30 Mcafee, Inc. System and method for concept building
US8706709B2 (en) 2009-01-15 2014-04-22 Mcafee, Inc. System and method for intelligent term grouping
US9602548B2 (en) 2009-02-25 2017-03-21 Mcafee, Inc. System and method for intelligent state management
US9195937B2 (en) 2009-02-25 2015-11-24 Mcafee, Inc. System and method for intelligent state management
US8473442B1 (en) 2009-02-25 2013-06-25 Mcafee, Inc. System and method for intelligent state management
US8447722B1 (en) 2009-03-25 2013-05-21 Mcafee, Inc. System and method for data mining and security policy management
US9313232B2 (en) 2009-03-25 2016-04-12 Mcafee, Inc. System and method for data mining and security policy management
US8918359B2 (en) 2009-03-25 2014-12-23 Mcafee, Inc. System and method for data mining and security policy management
US8667121B2 (en) 2009-03-25 2014-03-04 Mcafee, Inc. System and method for managing data and policies
US20100246547A1 (en) * 2009-03-26 2010-09-30 Samsung Electronics Co., Ltd. Antenna selecting apparatus and method in wireless communication system
CN101853260A (en) * 2009-04-01 2010-10-06 赛门铁克公司 System and method for detecting e-mail content
US8458268B1 (en) * 2010-02-22 2013-06-04 Symantec Corporation Systems and methods for distributing spam signatures
US9794254B2 (en) 2010-11-04 2017-10-17 Mcafee, Inc. System and method for protecting specified data combinations
US10313337B2 (en) 2010-11-04 2019-06-04 Mcafee, Llc System and method for protecting specified data combinations
US11316848B2 (en) 2010-11-04 2022-04-26 Mcafee, Llc System and method for protecting specified data combinations
US8806615B2 (en) 2010-11-04 2014-08-12 Mcafee, Inc. System and method for protecting specified data combinations
US10666646B2 (en) 2010-11-04 2020-05-26 Mcafee, Llc System and method for protecting specified data combinations
US9407463B2 (en) * 2011-07-11 2016-08-02 Aol Inc. Systems and methods for providing a spam database and identifying spam communications
US8954458B2 (en) 2011-07-11 2015-02-10 Aol Inc. Systems and methods for providing a content item database and identifying content items
US8700561B2 (en) 2011-12-27 2014-04-15 Mcafee, Inc. System and method for providing data protection workflows in a network environment
US9430564B2 (en) 2011-12-27 2016-08-30 Mcafee, Inc. System and method for providing data protection workflows in a network environment
US8935783B2 (en) * 2013-03-08 2015-01-13 Bitdefender IPR Management Ltd. Document classification using multiscale text fingerprints
US20150089644A1 (en) * 2013-03-08 2015-03-26 Bitdefender IPR Management Ltd. Document Classification Using Multiscale Text Fingerprints
RU2632408C2 (en) * 2013-03-08 2017-10-04 БИТДЕФЕНДЕР АйПиАр МЕНЕДЖМЕНТ ЛТД Classification of documents using multilevel text signatures
US20140259157A1 (en) * 2013-03-08 2014-09-11 Bitdefender IPR Management Ltd. Document Classification Using Multiscale Text Fingerprints
US9203852B2 (en) * 2013-03-08 2015-12-01 Bitdefender IPR Management Ltd. Document classification using multiscale text fingerprints
US10496900B2 (en) * 2013-05-08 2019-12-03 Seagate Technology Llc Methods of clustering computational event logs
US20140334739A1 (en) * 2013-05-08 2014-11-13 Xyratex Technology Limited Methods of clustering computational event logs
US11704583B2 (en) 2014-05-20 2023-07-18 Yahoo Assets Llc Machine learning and validation of account names, addresses, and/or identifiers
US9928465B2 (en) * 2014-05-20 2018-03-27 Oath Inc. Machine learning and validation of account names, addresses, and/or identifiers
US20150339583A1 (en) * 2014-05-20 2015-11-26 Aol Inc. Machine learning and validation of account names, addresses, and/or identifiers
US10789537B2 (en) 2014-05-20 2020-09-29 Oath Inc. Machine learning and validation of account names, addresses, and/or identifiers
US10657267B2 (en) * 2014-12-05 2020-05-19 GeoLang Ltd. Symbol string matching mechanism
US10049208B2 (en) * 2015-12-03 2018-08-14 Bank Of America Corporation Intrusion assessment system
US11128712B2 (en) 2015-12-31 2021-09-21 Axon Enterprise, Inc. Systems and methods for filtering messages
US10594795B2 (en) 2015-12-31 2020-03-17 Axon Enterprise, Inc. Systems and methods for filtering messages
US11553041B2 (en) 2015-12-31 2023-01-10 Axon Enterprise, Inc. Systems and methods for filtering messages
WO2017116741A1 (en) * 2015-12-31 2017-07-06 Taser International, Inc. Systems and methods for filtering messages
EP4020254A1 (en) * 2020-12-23 2022-06-29 Cylance Inc. Statistical data fingerprinting and tracing data similarity of documents
US11430244B2 (en) 2020-12-23 2022-08-30 Cylance Inc. Statistical data fingerprinting and tracing data similarity of documents

Also Published As

Publication number Publication date
WO2004105332A3 (en) 2005-03-10
WO2004105332A2 (en) 2004-12-02
EP1649645A2 (en) 2006-04-26
US7831667B2 (en) 2010-11-09
WO2004105332A9 (en) 2005-12-15
US20050108339A1 (en) 2005-05-19
JP4598774B2 (en) 2010-12-15
US20050108340A1 (en) 2005-05-19
US20110055343A1 (en) 2011-03-03
JP2007503660A (en) 2007-02-22
TW200527863A (en) 2005-08-16
TWI348851B (en) 2011-09-11
US8402102B2 (en) 2013-03-19

Similar Documents

Publication Publication Date Title
US8402102B2 (en) Method and apparatus for filtering email spam using email noise reduction
US8145710B2 (en) System and method for filtering spam messages utilizing URL filtering module
US7739337B1 (en) Method and apparatus for grouping spam email messages
US10042919B2 (en) Using distinguishing properties to classify messages
KR101045452B1 (en) Advanced spam detection techniques
EP1738519B1 (en) Method and system for url-based screening of electronic communications
US7941490B1 (en) Method and apparatus for detecting spam in email messages and email attachments
US20130173562A1 (en) Simplifying Lexicon Creation in Hybrid Duplicate Detection and Inductive Classifier System
US20050015626A1 (en) System and method for identifying and filtering junk e-mail messages or spam based on URL content
US7624274B1 (en) Decreasing the fragility of duplicate document detecting algorithms
US9246860B2 (en) System, method and computer program product for gathering information relating to electronic content utilizing a DNS server
US20050289239A1 (en) Method and an apparatus to classify electronic communication
Kranakis Combating Spam
Nor Improving Antispam Techniques by Embracing Pattern-based Filtering
JP2010092251A (en) Information processing apparatus, information processing method, and information processing program

Legal Events

Date Code Title Description
AS Assignment

Owner name: SYMANTEC CORPORATION, CALIFORNIA

Free format text: MERGER;ASSIGNOR:BRIGHTMAIL, INC.;REEL/FRAME:016331/0026

Effective date: 20040618

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION