EP1606718A2 - Communication filtering and prioritization through use of prior communications - Google Patents

Communication filtering and prioritization through use of prior communications

Info

Publication number
EP1606718A2
Authority
EP
European Patent Office
Prior art keywords
party
prior
trust
communications
communication
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP04715665A
Other languages
English (en)
French (fr)
Other versions
EP1606718A4 (de)
Inventor
Matthias Grossglauser
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
GROSSGLAUSER, MATTIAS
Original Assignee
Grossglauser Mattias
Businger Peter A
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Grossglauser Mattias, Businger Peter A filed Critical Grossglauser Mattias
Publication of EP1606718A2 (de)
Publication of EP1606718A4 (de)
Legal status: Withdrawn

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 51/00 User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L 51/21 Monitoring or handling of messages
    • H04L 51/212 Monitoring or handling of messages using filtering or selective blocking
    • H04L 51/52 User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail for supporting social networking services
    • H04L 51/48 Message addressing, e.g. address format or anonymous messages, aliases

Definitions

  • the invention relates to network communications and, more particularly, to filtering and prioritizing messages being communicated to an addressee.
  • Spammers have become increasingly sophisticated in identifying message targets, legitimately or otherwise, e.g. by "mining" the web including user groups and the like, or even based on mere guesses, yielding email addresses of millions of users. To alleviate the burden on users and networks, measures are sought for inhibiting the spread of spam.
  • blacklists are central databases collecting reports of email addresses at which spam originates. In this regard there has been concern about legitimate addresses being falsely included, disrupting normal email operation. Individual users or organizations may also establish whitelists, i.e. collections of email addresses that they deem legitimate. Whitelists in general do not contain the large number of legitimate email addresses that may at some point communicate with a recipient, such addresses being considered too "remote" for explicit inclusion.
  • a concomitant automated technique prevents an originator from establishing a communication with an intended recipient unless a trust relationship exists between the parties.
  • Establishment of the trust relationship can depend on past communications between originator and recipient, between such parties and third parties, and between third and fourth parties, for example. Other attributes may be taken into account, such as explicit user feedback and message content.
  • Fig. 1 is a trust graph illustrating an instance where a message should be communicated regularly, rather than treated as spam.
  • Fig. 2 is a trust graph illustrating an instance where a message should be rejected as spam.
  • Fig. 3 is a trust graph more generally illustrating establishment of different levels of trust.
  • nodes identified by upper-case letters represent users/email addresses
  • solid lines with an arrow represent past communications in the direction of the arrow.
  • a broken line with an arrow represents a potential/attempted communication on which a decision is to be made.
  • multiple broken-line arrows follow paths through the graph of prior communications, illustrating how multiple independent paths can build additional trust.
  • the figures represent illustrative examples, without limiting the ways in which trust can be derived from past communications.
  • the invention can be appreciated as based on the premise that past communication patterns between parties permit inference of a notion of trust between two or more parties of an attempted communication.
  • the inference can be automated to yield one or more indicators which, on a proposed communication, can be automatically interrogated in deciding whether or not the communication will be allowed to proceed, and under what conditions.
  • the communication, e.g. delivery of an email message or the placing of a phone call, will be permitted to proceed automatically only if sufficient trust exists between the initiator(s) and the recipient(s) of the communication. In case of insufficient trust, effecting the communication may still be provided for, but requires one or several additional action(s) by the initiating party or parties, e.g. payment of a fee.
  • Such a requirement can serve to discourage abusive or annoying communications, e.g. spam email, sales calls and the like.
  • insufficient trust may trigger alteration of the communication in some way, e.g. by compressing to use less space, or by removing attachments.
  • Using past communication patterns for estimating the trust between initiator(s) and recipient(s) of a new communication can reduce the likelihood that such actions will be invoked too frequently.
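The gating decision described above can be sketched as follows; the numeric trust score, the threshold value, and the action names are illustrative assumptions, not part of the disclosure:

```python
# Minimal sketch of trust-gated delivery (hypothetical names and values).
def gate_communication(trust_score: float, threshold: float = 0.5) -> str:
    """Decide the fate of an attempted communication from its trust score."""
    if trust_score >= threshold:
        return "deliver"  # sufficient trust: the communication proceeds
    # insufficient trust: require an additional sender action (e.g. a fee),
    # or alter the message (e.g. truncate it, strip attachments)
    return "require-sender-action"
```

A user-controlled "trust threshold", as discussed further on, simply becomes the `threshold` argument of such a decision function.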
  • in Fig. 1 there are prior communications 1 and 2 between users A and B establishing trust in both directions, and a communication 3 establishing trust from B to C.
  • A may trust a communication from C.
  • the relationship of trust can be appreciated in view of the graph by following contiguous solid lines in the direction opposite to their arrows. It may be noted that, based on the instant situation, C need not trust A, who may still be a spammer.
  • Fig. 3 illustrates how different notions of trust can be defined.
  • multiple broken-line arrows follow trusted paths through the graph of prior communications. Specifically, there is a path of five links from E via J, F, C, B to A, and two paths from I to A, namely I - G - F - C - B - A including five links, and I - G - H - B - A including four links.
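A breadth-first search over the graph of prior communications reproduces the path counts of the Fig. 3 example; the edge set below mirrors the paths named above, while the function and variable names are hypothetical:

```python
from collections import deque

# Directed edges representing the prior communications of the Fig. 3 example.
prior = {
    "E": ["J"], "J": ["F"], "F": ["C"], "C": ["B"], "B": ["A"],
    "I": ["G"], "G": ["F", "H"], "H": ["B"],
}

def shortest_path_links(graph, src, dst):
    """Return the number of links on the shortest path src -> dst, or None."""
    seen, queue = {src}, deque([(src, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == dst:
            return dist
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return None  # no trust path exists
```

For this graph, E reaches A over five links and I reaches A over four (the I - G - H - B - A path), matching the figure; shorter paths, or several independent paths, would indicate stronger trust.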
  • trust between two or several parties depends not only on communications having occurred directly between these parties, but also between these and other parties.
  • a measure of trust can be established between two parties that have never communicated with each other, provided that trust can be established through one or several other intermediate parties. For example, if a user X wishes to send a message to a user Y, where X and Y have never communicated before, the communication may be allowed to proceed if there is information of a third party Z that has communicated with both X and Y in the past.
  • trust between X and Y may depend (i) on the frequency and timing of the past communications, (ii) more generally on the structure of the "trust graph" linking X and Y, and/or (iii) on some attributes and/or the content of the message, e.g. its size, media types, number of concurrent recipients, and the like.
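Point (i), frequency and timing, can be made concrete with a decay-weighted count of past messages between a pair of parties; the exponential form and the half-life are assumed for illustration only:

```python
import math

def pairwise_trust(timestamps, now, half_life_days=30.0):
    """Decay-weighted count of past communications (times in seconds).

    Each past message contributes 1.0 when fresh and half as much after
    every additional half-life, so frequent, recent contact scores highest.
    """
    decay = math.log(2) / (half_life_days * 86400)
    return sum(math.exp(-decay * (now - t)) for t in timestamps)
```

A message sent just now contributes 1.0; one sent a half-life (here 30 days) ago contributes 0.5.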
  • the inferred trust relationships can be used in other ways as well, e.g. to prioritize, order and categorize communications by the sender and/or recipient. For example, email messages can automatically be grouped according to the degree/amount of trust between sender and recipient, e.g. by placement into respective "folders". Trust relationships can be used further to assist communicating parties in searching for addresses and other information related to the parties, e.g. by auto-completing an unrecognized identification such as an address or phone number with a likely, i.e. highly trusted, similar identification.
  • received emails can be displayed according to the strength of the trust relationship, or the relationship can be used to prioritize among several concurrent phone calls.
  • a measure of trust can be used to assist in other decision-making processes, e.g. fraud avoidance in e-commerce, targeting legitimate advertisement or other relevant information, searching for people based on criteria involving trust and personal relationships, allocating resources for future communications, and the like.
  • Indicators of trust relationships can be maintained centrally or in a distributed way, mindful of tradeoffs between control, scalability, the need for cryptographic methods for authentication, and the like. A further consideration is with the amount of control a user is afforded over his trust relationships. Where such control is desired, users may be allowed to explicitly manage and rate trust with other users. For example, users can provide feedback by rating the relevance or quality of messages they receive. Such feedback can be incorporated into the trust graph to further improve the filtering decisions.
  • Email messages can be filtered according to a trust relationship between the sender and the recipient(s) of the message.
  • email messages can be exchanged only between parties in a trust inference system.
  • the system can refuse to deliver a message if the sender does not have a sufficient trust relationship with the recipient. Refusal may be for one or more reasons, e.g. (i) the sender has never sent a message to anybody, (ii) the sender has attempted to send undesired email messages to the present or other recipients before, or (iii) the parties with whom the sender has trust relationships in turn do not possess sufficiently strong trust relationships with the intended recipient.
  • the sender can have an option of a prescribed action to establish trust through other means.
  • the sender can be allowed to call the intended recipient on the phone, contact him/her through a traditional email message, or make a required payment to the recipient or a third party such as an internet service provider, for example.
  • Explicit Trust Manipulation. In the trust system a user can be allowed to explicitly manipulate and modify his/her trust relationships with others, in combination with inference of trust relationships from past communications. Such a feature can simplify the establishment of communication between the one user and the others. For example, a user can establish a list of other users permitted to send him/her email messages.
  • a trust relationship inferred from past communication patterns can be complemented and enhanced by explicit feedback from parties on actual or attempted communications.
  • an email client can offer two ways of deleting a message, (a) a "normal" delete of a message that is not needed any more, and (b) a "reject" of a message that was deemed fraudulent, disruptive, or otherwise undesired.
  • a determination can be made by the intended recipient of the message, e.g. when the recipient is allowed to inspect the attempted communication and decide whether to reject it, to wait for sender action, or to permit delivery.
  • feedback can be generated at different stages, e.g. while a message remains pending or after its delivery.
  • a "reject" can be used to lower the trust between the originator and the recipient of the message, and can even be used to affect the level of trust between other parties.
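As a sketch, such feedback can be folded into the trust graph by penalizing the corresponding edge; the multiplicative penalty and the data layout are hypothetical:

```python
def apply_feedback(trust, sender, recipient, action, penalty=0.5):
    """Lower the sender->recipient trust edge on a "reject"; leave a plain
    delete untouched. `trust` maps (sender, recipient) pairs to scores."""
    edge = (sender, recipient)
    if action == "reject":
        trust[edge] = trust.get(edge, 1.0) * penalty
    return trust
```

Repeated rejects compound, so a persistent spammer's trust decays quickly while an ordinary delete has no effect.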
  • the parties of an attempted communication can be afforded a certain amount of control over the trust relationship with other parties, and over how the trust relationships lead to decisions on whether an attempted communication is accepted or not.
  • a user can be provided with control over a "trust threshold" parameter whose level determines whether or not the system lets attempted communications pass.
  • email messages from non-participating originators can be required to include a prescribed code or "cookie", e.g. in the subject field.
  • the code is generated when the originator first obtains permission to send to one of the participating recipients. Unless the originator includes the code when sending a message to any participating recipient, the message will be rejected.
  • an attacker now needs not only the address of a trusted legitimate originator, but also its code. Further variations include using sequences of codes / one-time passwords, encryption-based authentication methods, and the like.
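One way to realize such per-sender codes is a keyed hash held by the recipient's system, sketched below with HMAC-SHA256; the secret, the code length, and the function names are assumptions, not part of the disclosure:

```python
import hashlib
import hmac

SECRET = b"recipient-side secret"  # assumed to be held by the trust system

def issue_code(sender: str) -> str:
    """Code handed out when the sender first obtains permission."""
    return hmac.new(SECRET, sender.encode(), hashlib.sha256).hexdigest()[:12]

def accept(sender: str, presented_code: str) -> bool:
    """Accept a message only if it carries the sender's valid code."""
    return hmac.compare_digest(issue_code(sender), presented_code)
```

Because the code is derived from a secret, an attacker who knows only a trusted originator's address still cannot forge an acceptable message.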
  • a decentralized implementation can offer improved scaling and greater robustness to failures and attacks.
  • a decentralized implementation can extend an existing email environment, consisting of email clients, a message transfer agent such as the sendmail program, and the like, with software for interacting with its counterparts in other locations or domains to implement the same or similar functionality as described above for the centralized case.
  • parties can establish a-priori trust relationships by providing so-called "whitelists" of potential senders a recipient wishes to allow, and "blacklists" of senders to be blocked.
  • Such lists may be shared, e.g. as there are organizations collecting and distributing blacklists, and lists may be updated/modified based on user feedback on attempted or occurred communications.
  • Such trust relationships can be combined with trust inferred by observing past communications, e.g. as follows: if two users A and B who trust each other share their whitelists, and if user A allows messages from a sender C, then B would automatically receive messages from C without himself having to whitelist C.
  • Altering Communication Based on Trust.
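The A, B, C whitelist-sharing example can be sketched as follows; the data structures and function name are hypothetical:

```python
# Hypothetical whitelist store and mutual-trust relation for the example:
# A whitelists C; A and B trust each other and share whitelists.
whitelists = {"A": {"C"}, "B": set()}
mutual_trust = {("A", "B"), ("B", "A")}

def allowed(recipient: str, sender: str) -> bool:
    """Accept if the sender is on the recipient's whitelist, or on the
    whitelist of any party the recipient mutually trusts."""
    if sender in whitelists.get(recipient, set()):
        return True
    return any(sender in whitelists.get(peer, set())
               for r, peer in mutual_trust if r == recipient)
```

B accepts messages from C via A's shared whitelist without ever having whitelisted C himself.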
  • a message can be restricted, e.g. so that larger messages are truncated, large attachments are cut off, execution of an attached program is denied, several messages from the sender are combined into one, and/or the like.
  • a message can be blocked and a request sent to the sender for action on his part, e.g. for reduction of the size of the message, providing authentication and/or the like.
  • Trust Policy. While a trust policy may be set by an individual communicating party, there can be circumstances where the policy, at least in part, is set by another. In a company, for example, management can exercise a measure of control over how trust is established between its employees, and between outsiders and its employees.
  • a measure of trust can be determined between classes of parties, based on one or several attributes of the parties. For example, if the attribute is the user's domain, then trust can be inferred between a user A at one domain, D1, and a user B at another domain, D2, provided other users in domains D1 and D2 have communicated before.
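The domain example can be sketched as follows; the pair store and function names are hypothetical illustrations:

```python
def domain(addr: str) -> str:
    """Extract the class attribute, here the user's mail domain."""
    return addr.split("@", 1)[1]

# Assumed record of domain pairs between which users have communicated.
prior_domain_pairs = {("d1.example", "d2.example")}

def class_trusted(sender: str, recipient: str) -> bool:
    """Infer trust between users whose domains have communicated before."""
    pair = (domain(sender), domain(recipient))
    return pair in prior_domain_pairs or pair[::-1] in prior_domain_pairs
```

The same pattern extends to the other class attributes mentioned below, e.g. geographic location or group membership, by swapping out the `domain` function.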
  • a measure of trust can be shared by members of a class, e.g. the employees of a company.
  • attributes for determining a class for trust sharing are geographic location, e.g. of cell/mobile phones and location-driven services, membership in a group or community, and the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Data Exchanges In Wide-Area Networks (AREA)
  • Information Transfer Between Computers (AREA)
EP04715665A 2003-02-27 2004-02-27 Communication filtering and prioritization through use of prior communications Withdrawn EP1606718A4 (de)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US45057903P 2003-02-27 2003-02-27
US450579P 2003-02-27
PCT/US2004/005867 WO2004077710A2 (en) 2003-02-27 2004-02-27 Minimizing unsolicited e-mail based on prior communications

Publications (2)

Publication Number Publication Date
EP1606718A2 (de) 2005-12-21
EP1606718A4 EP1606718A4 (de) 2009-03-25

Family

ID=32927671

Family Applications (1)

Application Number Title Priority Date Filing Date
EP04715665A Withdrawn EP1606718A4 (de) Communication filtering and prioritization through use of prior communications

Country Status (2)

Country Link
EP (1) EP1606718A4 (de)
WO (1) WO2004077710A2 (de)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7519674B2 (en) * 2006-09-01 2009-04-14 Nuxo Technologies, Inc. Method and apparatus for filtering electronic messages
GB2458094A (en) 2007-01-09 2009-09-09 Surfcontrol On Demand Ltd URL interception and categorization in firewalls
GB0709527D0 (en) * 2007-05-18 2007-06-27 Surfcontrol Plc Electronic messaging system, message processing apparatus and message processing method
US8255987B2 (en) 2009-01-15 2012-08-28 Microsoft Corporation Communication abuse prevention
US9241259B2 (en) 2012-11-30 2016-01-19 Websense, Inc. Method and apparatus for managing the transfer of sensitive information to mobile devices
US9798877B2 (en) * 2015-06-04 2017-10-24 Accenture Global Services Limited Security risk-based resource allocation

Citations (5)

Publication number Priority date Publication date Assignee Title
US6192114B1 (en) * 1998-09-02 2001-02-20 Cbt Flint Partners Method and apparatus for billing a fee to a party initiating an electronic mail communication when the party is not on an authorization list associated with the party to whom the communication is directed
US20020083136A1 (en) * 2000-12-22 2002-06-27 Whitten William B. Method of authorizing receipt of instant messages by a recipient user
US20020124053A1 (en) * 2000-12-28 2002-09-05 Robert Adams Control of access control lists based on social networks
WO2002080512A1 (en) * 2001-03-30 2002-10-10 Elisa Communications Oyj Controlling method for contact requests using recommendations by approved contact-requesting parties
US20020199095A1 (en) * 1997-07-24 2002-12-26 Jean-Christophe Bandini Method and system for filtering communication

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
US6675153B1 (en) * 1999-07-06 2004-01-06 Zix Corporation Transaction authorization system
US20030009698A1 (en) * 2001-05-30 2003-01-09 Cascadezone, Inc. Spam avenger
US20030131063A1 (en) * 2001-12-19 2003-07-10 Breck David L. Message processor
US6842807B2 (en) * 2002-02-15 2005-01-11 Intel Corporation Method and apparatus for deprioritizing a high priority client
US20030216982A1 (en) * 2002-05-17 2003-11-20 Tyler Close Messaging gateway for incentivizing collaboration
US7219148B2 (en) * 2003-03-03 2007-05-15 Microsoft Corporation Feedback loop for spam prevention


Non-Patent Citations (1)

Title
See also references of WO2004077710A2 *

Also Published As

Publication number Publication date
WO2004077710A3 (en) 2005-03-24
EP1606718A4 (de) 2009-03-25
WO2004077710A2 (en) 2004-09-10

Similar Documents

Publication Publication Date Title
US9083695B2 (en) Control and management of electronic messaging
EP1675333B1 (de) Detection of unwanted electronic messages (spam)
EP1523837B1 (de) Method and device for processing messages in a communication network
US20040024823A1 (en) Email authentication system
AU782333B2 (en) Electronic message filter having a whitelist database and a quarantining mechanism
US7644274B1 (en) Methods of protecting against spam electronic mail
JP2005518173A (ja) E-mail management service
Leiba et al. A Multifaceted Approach to Spam Reduction.
EP1606718A2 (de) Communication filtering and prioritization through use of prior communications
JP4659096B2 (ja) System and method for preventing delivery of unsolicited and undesired electronic messages by key generation and comparison
KR100996709B1 (ko) Apparatus and method for blocking spam over IP applications
JP2003018324A (ja) User filtering system and method in communication services
KR20020030704A (ko) Spam mail prevention service system using virtual e-mail addresses and method therefor
KR20100013989A (ko) Turing-test-based spam blocking apparatus and method in a VoIP environment
US11916873B1 (en) Computerized system for inserting management information into electronic communication systems
Park et al. Spam Detection: Increasing Accuracy with A Hybrid Solution.
CN102598009A (zh) Method and device for screening information
Helman Spam-a-Lot: The States' Crusade against Unsolicited E-Mail in Light of the Can-Spam Act and the Overbreadth Doctrine
JP2009505216A (ja) System and method for detecting and filtering unsolicited and undesired electronic messages
Whitworth et al. Channel e-mail: a sociotechnical response to spam
WO2008122409A1 (en) Trust manager and method for enhanced protection against spam
Chaisamran et al. Trust-based SPIT detection by using call duration and social reliability
Shah et al. FLeSMA: a firewall level spam mitigation approach through a genetic classifier model
JP2003169095A (ja) Electronic mail system and electronic mail delivery method
Gasmi et al. E-Mail Security as Cooperation Problem

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20050927

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL LT LV MK

DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20090223

RIC1 Information provided on ipc code assigned before grant

Ipc: H04L 12/58 20060101AFI20090217BHEP

17Q First examination report despatched

Effective date: 20111216

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: GROSSGLAUSER, MATTIAS

Owner name: GRUETER, RETO

GRAJ Information related to disapproval of communication of intention to grant by the applicant or resumption of examination proceedings by the epo deleted

Free format text: ORIGINAL CODE: EPIDOSDIGR1

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

INTG Intention to grant announced

Effective date: 20170518

RIN1 Information on inventor provided before grant (corrected)

Inventor name: GROSSGLAUSER, MATTHIAS

GRAJ Information related to disapproval of communication of intention to grant by the applicant or resumption of examination proceedings by the epo deleted

Free format text: ORIGINAL CODE: EPIDOSDIGR1

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

INTC Intention to grant announced (deleted)
INTG Intention to grant announced

Effective date: 20171012


STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20180901