US8832437B2 - Stateless human detection for real-time messaging systems - Google Patents


Info

Publication number
US8832437B2
US8832437B2
Authority
US
United States
Prior art keywords
challenge
response
message
source
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US13/589,743
Other versions
US20120324535A1
Inventor
Jeremy T. Buch
Vlad Eminovici
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US11/275,854 (granted as US8261071B2)
Application filed by Microsoft Corp
Priority to US13/589,743 (granted as US8832437B2)
Publication of US20120324535A1
Application granted
Publication of US8832437B2
Assigned to MICROSOFT TECHNOLOGY LICENSING, LLC reassignment MICROSOFT TECHNOLOGY LICENSING, LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MICROSOFT CORPORATION
Application status: Active
Anticipated expiration


Classifications

    • G06F 21/30: Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31: User authentication
    • G06Q 10/107: Computer aided management of electronic mail
    • G06Q 20/401: Transaction verification
    • H04L 51/04: Real-time or near real-time messaging, e.g. instant messaging [IM]
    • H04L 51/12: User-to-user messaging in packet-switching networks with filtering and selective blocking capabilities

Abstract

Stateless human detection for real-time systems allows a real-time messaging system to challenge incoming messages suspected of being generated by an automated application. When a suspect message is detected, a challenge is presented to the sender of the message. The challenge is designed to require human intervention to provide a correct answer. A challenge packet is sent with the challenge and includes a challenge answer and, possibly, a server identifier, a challenge identifier and/or a time stamp that can be used to prevent attacks on the challenge. The challenge packet is encrypted so that the sender cannot access its contents. When the sender provides a response to the challenge, the sender returns the challenge packet. The challenge packet is decrypted and the challenge answer is compared to the sender answer. If the answers match, the sender is allowed subsequent access to the messaging system.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a Continuation of and claims benefit from U.S. patent application Ser. No. 11/275,854 that was filed on Jan. 31, 2006, and that is incorporated herein by reference in its entirety.

BACKGROUND

Messaging systems, such as e-mail systems, instant messaging systems, and the like, are susceptible to unwanted attacks in the form of spam, phishing, viruses, etc. Most of these attacks are carried out by automated systems that programmatically generate thousands of messages to legitimate systems, bombarding them with advertisements, messages containing viruses or deceptive information gathering messages, etc.

Non-real-time systems (e.g. e-mail systems) have sufficient time to adequately filter messages and can detect and delete most undesirable messages. But the problem becomes even more pronounced in real-time systems, such as an instant messaging system, because the real-time nature of the system prevents implementation of rigorous authorization logic.

One method that has been used to prevent automated attacks in non-real-time systems is to provide a challenge to a sender of a message that requires human intervention to answer. In some instances, the challenge is a graphic that includes a word jumbled to a point where automated character recognition cannot recognize the word, but a human can. A human can provide an appropriate response to the challenge for authorization where a machine cannot.

However, providing a challenge to suspect users in a real-time system that typically processes thousands of messages per second requires a prohibitive amount of overhead, and processing in-band challenges can adversely affect the real-time performance required in such systems.

SUMMARY

The present description relates to stateless human detection for real-time systems. Implementations are described that allow a real-time system to provide challenges that ensure a human user has originated a message sent to the system. Once a challenge is sent to a sender, no state related to the challenge is maintained by the system, thereby overcoming the real-time overhead issues (i.e., memory and processing time) that have previously prevented implementation of such a challenge scheme in a real-time system.

When a real-time messaging system determines to issue a challenge to a sender of an incoming message, the system creates a challenge packet that includes the challenge and an encrypted portion that can only be decrypted by the system that originates the challenge. The encrypted portion includes at least an answer to the challenge and may also include other data.

The sender of the original message responds to the challenge and the challenge packet is returned to the system with the sender response. When the real-time system receives the response to the challenge, the encrypted portion of the challenge packet is decrypted. Measures may be taken to ensure that the challenge originated from the system and that the original challenge has not been tampered with. If the challenge is a valid challenge, the sender response is compared to the challenge answer to determine if the challenge was successfully answered. If so, the sender is allowed to access the system.

In the following description, it will be seen that no state is saved once a challenge has been issued. The information required to identify the challenge and determine if the challenge was answered correctly is included in the challenge packet that is exchanged between the real-time system and the sender. Therefore, the information required to process the challenge response is received from the sender in the response to a challenge.

DESCRIPTION OF THE DRAWINGS

The present description references the following figures.

FIG. 1 is an exemplary prior art CAPTCHA (Completely Automated Public Turing test to tell Computers and Humans Apart) challenge.

FIG. 2 is a diagram that shows an exemplary generalized data flow for the stateless human detection techniques described herein.

FIG. 3 is a block diagram showing an exemplary client-server configuration that implements a stateless human detection system.

FIG. 4 is a flow diagram that depicts an exemplary methodological implementation of a stateless human detection technique.

FIG. 5 is a block diagram depicting an exemplary computing environment in which the presently described techniques may be implemented.

DETAILED DESCRIPTION

The following description relates generally to real-time messaging systems. For discussion purposes, the following description relates specifically to an instant messaging system that is used to dynamically authorize and bridge federated networks. However, it is noted that one or more techniques described herein may be applied to one or more other types of real-time and/or non-real-time systems.

Exemplary Challenge

In the following description, challenges are presented that are designed to require human interaction to provide an answer to a challenge. A particular example utilized herein is a challenge known as a CAPTCHA, or Completely Automated Public Turing test to tell Computers and Humans Apart. A CAPTCHA is well known in the art and comprises a graphic that has a meaning decipherable only to a human being. The graphic is typically a word or series of alphanumeric characters that are skewed in alignment, dispersed among arbitrary markings, displayed over a distracting background, etc. so that optical character recognition techniques cannot determine what the characters are.

FIG. 1 depicts a prior art example of a CAPTCHA. In the example shown in FIG. 1, the CAPTCHA includes a string of alphanumeric characters consisting of the following characters: U95E. However, automated character recognition techniques would not be able to determine this particular string from the exemplary CAPTCHA because of the distortion of the image and the characters.

Other challenges may be utilized with the techniques described herein without departing from the scope of the present description and the claims appended hereto. As long as a challenge can be relied upon to require human interaction to provide a correct answer to the challenge, the challenge can be used as described herein.

Exemplary Generalized Data Flow

FIG. 2 is a diagram 200 that shows an exemplary generalized data flow for the stateless human detection techniques described herein. The diagram 200 is meant to provide an overview of the data transactions that generally occur in the stateless human detection techniques described herein. Further details of the stateless human detection techniques will be provided below, with respect to subsequent figures.

The techniques described herein contemplate a client 202 that attempts to access an instant messaging system server 204 by sending an IM (Instant Message) message 206 to the server 204. In the present example, the instant messaging system is a real-time messaging system that may be used to dynamically authorize and bridge federated networks. The client 202 may be a client within a same network as the server 204 or it may be a remote client 202 that is unknown to the server 204. Generally, the client 202 may be any computing device configured to transmit an electronic message to the server 204.

The server 204 is configured to determine whether to issue a challenge in response to the IM Message 206 (block 208). In some instances, it may be desirable for the server 204 to challenge the client 202 to ensure that a message received from the client is not a message generated by an automated application. In other words, in an effort to prevent spam, viruses and the like, the server 204 may be configured to challenge messages that are not generated by a human user.

For example, if a message includes a URL (Uniform Resource Locator), the server 204 may be configured to challenge the sender of the message, since viruses can be propagated through the use of URLs that leverage a client API (Application Programming Interface) of the instant messaging system to re-send the URL to all of the users in a target's address book.

Likewise, the server 204 may be configured to challenge any message received from a previously unknown network, since spam can come in the form of automated systems that attempt to send messages to enterprises that would otherwise not receive messages from that particular source. However, because dynamic means of federation are enabled, the sending system may be allowed.

Other message sources may be challenged by the server 204 for other reasons in accordance with the purposes of the systems, articles and methods described and claimed herein.

When the server 204 determines that a challenge should be issued in response to the reception of the message 206, the server 204 selects a challenge to transmit to the client 202 (block 210). In at least one implementation which is discussed herein, a challenge is selected from a stored challenge library. The challenge library is a fixed size data structure that stores at least a plurality of challenges and corresponding answers thereto. In one or more implementations described herein, the challenge library may also include a challenge identifier that is used to identify a particular challenge within the challenge library.

As will be seen as the present discussion progresses, the only data that the server stores to implement the challenge process is the data in the challenge library. No state is saved regarding a particular challenge or a message that caused the challenge to be sent. Therefore, the storage required for the described implementations is a fixed amount that is easier to manage than memory that is dynamically allocated.
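The fixed-size challenge library described above might be sketched as follows. This is an illustrative Python sketch only; the entry layout, the sample answers, and the `select_challenge` helper are invented for this example and are not taken from the patent.

```python
import secrets

# Hypothetical fixed-size challenge library: each entry pairs a CAPTCHA
# graphic with its correct answer and a library-wide identifier.
CHALLENGE_LIBRARY = [
    {"id": i, "graphic": f"captcha_{i:03d}.png", "answer": answer}
    for i, answer in enumerate(["U95E", "K2RF", "9QXD", "WM41"])
]

def select_challenge():
    """Pick a challenge at random; only the graphic is ever shown to the sender."""
    entry = secrets.choice(CHALLENGE_LIBRARY)
    return entry["id"], entry["graphic"], entry["answer"]
```

Because the library is the only data the server keeps, its memory footprint stays fixed regardless of how many challenges are outstanding.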

A challenge packet that includes the challenge and any challenge parameters that may be transmitted with the challenge is assembled (block 212). The challenge parameters are encrypted by the server so that the client 202 cannot access the challenge parameters. The challenge parameters that may be included in the challenge packet include a challenge answer, a challenge identifier (as referenced above), a server identifier, a time stamp and the like. It is noted that different implementations may include different combinations of these or other challenge parameters. The server identifier uniquely identifies the server 204 so that a returned challenge can be verified as having originated with the server 204. The time stamp may be included to help prevent attacks on the challenge (e.g. replay attacks).

The challenge packet is transmitted to the client 202 (IM with Challenge Packet 214). The only portion of the challenge packet that is accessible at the client 202 is the challenge itself and a user interface that explains why the challenge is being presented and provides instructions for answering the challenge. The user interface may be a part of the client 202 or it may be included in the IM with Challenge Packet (214).

A human response (216) is necessary to provide a sender answer to the challenge. An IM with the challenge packet and the sender answer is then transmitted back to the server 204 as a sender response (IM Response with Challenge, Challenge Packet and Sender Answer 218). It is noted that, since the server 204 did not retain any state regarding the challenge, the information that the server requires to validate the challenge is contained in the sender response (218).

The server receives the sender response that includes the original challenge, the still-encrypted challenge parameters and the sender answer to the challenge (block 220). From the sender response, the server can decrypt the encrypted information and compare the challenge answer (now unencrypted) to the sender answer. If the challenge packet includes a server identifier, the source of the challenge can be validated (block 222) to prevent false challenge responses that are manufactured to attack a system. If the server is validated and the challenge answer matches the sender answer (block 224), then it can be reliably determined that the sender is not an automated application but is a human user.
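The issue-and-verify round trip of blocks 210 through 224 can be sketched in Python as below. This is a minimal illustration, not the patent's implementation: the function names are invented, and the toy SHA-256 keystream with an HMAC tag merely stands in for whatever authenticated encryption a production server would use (a vetted AEAD cipher would be the sensible choice). What it demonstrates is that the sealed packet itself carries the answer, the server identifier, and the time stamp, so the server can verify the response without having stored any per-challenge state.

```python
import hashlib
import hmac
import json
import secrets
import time

SERVER_KEY = secrets.token_bytes(32)   # private to the server; never leaves it
SERVER_ID = "im-server-01"             # hypothetical server identifier

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Toy stream-cipher keystream; a stand-in for real authenticated encryption."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def seal_packet(params: dict) -> bytes:
    """Encrypt-then-MAC the challenge parameters so the sender cannot read them."""
    plain = json.dumps(params).encode()
    nonce = secrets.token_bytes(16)
    cipher = bytes(a ^ b for a, b in zip(plain, _keystream(SERVER_KEY, nonce, len(plain))))
    tag = hmac.new(SERVER_KEY, nonce + cipher, hashlib.sha256).digest()
    return nonce + tag + cipher

def open_packet(blob: bytes) -> dict:
    nonce, tag, cipher = blob[:16], blob[16:48], blob[48:]
    expect = hmac.new(SERVER_KEY, nonce + cipher, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expect):
        raise ValueError("packet tampered with or not originated here")
    plain = bytes(a ^ b for a, b in zip(cipher, _keystream(SERVER_KEY, nonce, len(cipher))))
    return json.loads(plain)

def issue_challenge(challenge_id: int, answer: str) -> bytes:
    """Blocks 210-214: build the sealed packet; no state is kept afterwards."""
    params = {"answer": answer, "challenge_id": challenge_id,
              "server_id": SERVER_ID, "timestamp": time.time()}
    return seal_packet(params)

def verify_response(blob: bytes, sender_answer: str, max_age: float = 60.0) -> bool:
    """Blocks 220-224: everything needed to validate comes back in the packet."""
    try:
        params = open_packet(blob)
    except ValueError:
        return False
    if params["server_id"] != SERVER_ID:
        return False
    if time.time() - params["timestamp"] > max_age:
        return False
    return hmac.compare_digest(params["answer"], sender_answer)
```

A tampered packet fails the HMAC check before the answer is even compared, which is the property that protects the scheme against manufactured challenge responses.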

The general steps outlined above are described in greater detail below, with respect to subsequent figures.

Exemplary Client-Server Configuration

FIG. 3 is a block diagram showing an exemplary client-server configuration 300 that implements a stateless human detection system as described herein. The client-server configuration 300 is a simplified diagram of a server 302 and a client 304 that communicate over a network 306. The exemplary diagram is not intended to show every element in a client-server configuration that is necessary to provide an operational system. Rather, the client-server configuration 300 shows particular elements that are relevant to the present discussion. Those skilled in the art will realize other elements that may be included in the configuration or other arrangements of the configuration that may function similarly to the implementations described herein.

It is noted that although the client-server configuration 300 is shown with certain functionality attributed to particular elements, functions described herein may be allocated differently among the same or different elements than are actually depicted. The generalized configuration shown in FIG. 3 is meant only to describe a particular implementation of the systems and methods shown and described herein.

The server 302 includes a communications stack 308, such as a SIP (Session Initiation Protocol) stack, a system clock 310 and a messaging module 312. The messaging module 312 is configured to send messages to and receive messages from one or more other computing devices within an enterprise network in which the server 302 is situated, or remote from the enterprise network.

The server 302 also includes a challenge library 314 that stores a plurality of challenges 316 and corresponding challenge answers 318. The challenge library 314 is a data structure that has a fixed size. The exact size of the challenge library 314 is an implementation detail that depends on a design of the system and how many unique challenges 316 are deemed necessary, given the traffic expected in the system, the ratio of suspect messages to trusted messages, etc.

In at least one implementation in accordance with the present description, the challenge library 314 also includes a sequence table 320 that serves as a challenge identifier to track which challenges 316 have been issued. Although described as a sequence table 320, this module can be configured in any of several ways. In one implementation, the sequence table 320 is a binary number that has a number of bits equal to a number of challenges 316 included in the challenge library.

When a challenge is issued, a sequence identifier corresponding to the selected challenge is included in a challenge packet that is sent to the sender of the original message. For example, if the sequence table 320 is a 128-bit binary number, a 128-bit number may be used as a sequence identifier where only one of the bits is a “1” and the other bits are “0”. The location of the “1” corresponds to a particular challenge in the challenge library and thus identifies the challenge.

When a response to a challenge is received, a bit in the sequence table that corresponds to a sequence identifier included in the challenge response is set. For example, if a sequence identifier is a 128-bit binary number that begins with “1000 . . . ” then a high-order bit in the sequence table is set (i.e. changed from “0” to “1”). If that bit is already set, then it indicates that this is a second response to the challenge, which could indicate an attack on the system (e.g. repeatedly sending response to a challenge to try to get one correct by chance).

Challenge responses may also be invalidated if they are received too late, i.e. after a certain period of time has elapsed since the challenge was issued. A delayed response could indicate that a hacker or a computer had spent some time trying to illegitimately formulate a response to a challenge. Timing out a delayed response can prevent this from occurring.

A time stamp can be utilized with the sequence table to allow the sequence table to be reset and challenges to be re-used. For example, if the system is configured to time-out challenges after sixty seconds, sequence table processing can be configured to allow a challenge to be re-used after more than sixty seconds have elapsed since the challenge was originally issued. This can be done in one of many ways, such as by resetting a bit set in the sequence table after an expiration time associated with a challenge corresponding to that bit has elapsed.
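Under the assumptions above (a 128-entry library and a sixty-second time-out), the sequence table could be sketched as the following Python class. The class and method names are invented for illustration; the patent describes only the underlying idea of a fixed-width bit table with per-challenge expiry.

```python
import time

class SequenceTable:
    """Fixed-width bit table: bit i is set once a response to challenge i arrives."""

    def __init__(self, size: int = 128, timeout: float = 60.0):
        self.size = size
        self.timeout = timeout
        self.bits = 0                    # the 128-bit binary number
        self.issued_at = [0.0] * size    # when each challenge was last issued

    def sequence_identifier(self, index: int) -> int:
        """Return a number with exactly one bit set, identifying challenge `index`."""
        self.issued_at[index] = time.time()
        return 1 << (self.size - 1 - index)   # index 0 maps to the high-order bit

    def record_response(self, identifier: int) -> bool:
        """Set the responding challenge's bit; False flags a duplicate response."""
        index = self.size - identifier.bit_length()
        expired = time.time() - self.issued_at[index] > self.timeout
        if self.bits & identifier and expired:
            self.bits &= ~identifier     # expiry allows the challenge to be re-used
        if self.bits & identifier:
            return False                 # second response: possible replay attack
        self.bits |= identifier
        return True
```

A response whose bit is already set, and not yet expired, is rejected, which defeats repeated guessing against a single challenge while keeping the table a fixed size.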

The server 302 also includes a cryptographic module 322 that is configured to encrypt and decrypt messages, or portions thereof, with one or more keys 324 (or certificates) associated therewith. For example, when the server 302 issues a challenge to the client 304, the challenge is included in a challenge packet that also includes other data that is encrypted. In at least one implementation, the other data includes a correct answer to the challenge. Since the purpose of issuing the challenge would be defeated if the client could access the answer, the cryptographic module 322 of the server 302 encrypts the other data in the challenge packet (i.e. the answer) so that the client 304 cannot access the other data.

In one or more other implementations, the entire challenge packet—including the challenge itself—may be encrypted. It is also noted that the cryptographic module 322 may also be used to encrypt other types of messages.

In addition to encryption, the cryptographic module 322 is also configured to decrypt messages or portions of messages. For example, since the stateless nature of the techniques described herein does not provide for saving any state related to a challenge or a challenged message, challenge parameters are included in a challenge packet that is transmitted to the client 304. When the challenge parameters are returned in a subsequent message, the cryptographic module 322 decrypts the challenge parameters.

The client 304 (also referred to herein as a sender) includes a communications stack 326, a clock 328 and a messaging module 330 that serve basically the same functions as the similar elements in the server 302. The client 304 also includes a cryptographic module 332 that is used to decrypt challenges received from the server 302, in the event that challenges from the server 302 are encrypted.

In at least one implementation according to the present description, the challenge packet sent from the server 302 to the client 304 is encrypted. Such encryption provides an additional layer of security that protects the challenge itself as well as the challenge parameters included in the challenge packet. It is noted that the challenge parameters are encrypted with a private key that is unavailable to the client 304 and, therefore, this part of the message cannot be decrypted by the client 304. The cryptographic module 332 is configured to decrypt the challenge portion of the challenge packet if the challenge portion is encrypted; in that case, the server 302 makes a certificate, or key, available to the client 304 that allows the client to decrypt the challenge.

The client 304 also includes a display 334 that is configured to display a user interface 336 and a challenge graphic 338. In at least one implementation, the challenge graphic is a CAPTCHA. The user interface 336 displays text associated with the challenge graphic and explains to a user where the challenge originates, why it is being sent and instructions for processing the challenge. The user interface 336 may be a custom user interface designed specifically to receive such challenges, or it may be a generic user interface that can display text and/or graphics that are sent from the server 302.

The example shown in FIG. 3 depicts exemplary messages that are sent between the server 302 and the client 304. A first exemplary message 340, sent from the server 302 to the client 304 in response to the server 302 receiving a message from the client 304, includes a challenge and a set of challenge parameters. The set of challenge parameters is encrypted with a private certificate associated with the server 302. The challenge parameters include a sequence number associated with the challenge, a challenge answer, a time stamp associated with the message and a server identifier.

Although not all of the listed challenge parameters are required in all implementations of the described techniques, one or more of the challenge parameters shown in the example are used to prevent unauthorized attacks on the challenge. For example, the sequence number may be used to verify a particular challenge when it is returned. Furthermore, the time stamp may be used to verify that the challenge is answered within a predefined time period that is short enough to prevent a user from taking the challenge and figuring out a way to circumvent the challenge. Also, the server identifier may be used to verify that a challenge did indeed originate from the server 302.

A second exemplary message 342 is sent from the client 304 to the server 302 in response to the first exemplary message 340. The second exemplary message 342 includes the still-encrypted challenge parameters and a sender answer (i.e. a client user's response to the challenge). In at least one alternative implementation, the challenge is also returned to the server 302 in this message.

The second exemplary message 342 contains the information required for the server 302 to verify the challenge answer and approve or deny authorization for the client 304 to access the server system. Unlike present systems, which would require the server 302 to maintain state related to issued challenges, the described system allows the server 302 to proceed without storing any state related to the challenge and/or the messages 340, 342. The server 302 can utilize the challenge parameters included with a message returned from the client 304 to verify: (1) that the challenge was answered correctly; (2) that the challenge originated with the server 302 and was not created by a hacker; (3) that the message 342 is a timely response to a server 302 challenge; and (4) that the challenge is not an old challenge being used in a replay attack on the server 302.

Further details of the server 302 and client 304, their elements and the functionality associated therewith are described below.

Exemplary Methodological Implementation: Stateless Human Detection

FIG. 4 is a flow diagram 400 that depicts an exemplary methodological implementation of stateless human detection. It is noted that the described implementation is exemplary only and that the steps described in relation to the flow diagram 400 may be implemented in a different order than shown, or with more or fewer steps than shown, without departing from the spirit and scope of the claims appended hereto.

In the following discussion, continuing reference is made to the elements and reference numerals shown and described with respect to one or more previous figures.

At block 402, the server 302 receives an instant message (IM) message from the client 304. The message is checked to determine whether the message includes a response to a previous challenge (block 404). In at least one implementation, all incoming messages are checked to determine if they contain a response to a challenge. It is noted that decryption of at least a portion of the message may be required to make this determination. However, some implementations may not require decryption at this point.

If the message does not include a response to a previous challenge (“No” branch, block 404), then the server 302 makes a determination as to whether the message requires a challenge (block 406).

As previously discussed, the server 302 is configured to present a challenge to certain messages. Although not required, the logic for handling this determination can typically be implemented in the communications stack 308 of the server 302. The server 302 can be configured to send a challenge upon receipt of a message that is from an unknown source (i.e. a client in a remote network from which the server has not previously received a message), that includes a URL, that is randomly selected to be challenged, or that is suspected for any reason of being automatically generated wherever prevention of automated messages is desirable.

If the server 302 determines that a challenge to the message is not required (“No” branch, block 406), then the message is forwarded to an intended recipient or otherwise processed normally. If the server 302 determines that a challenge is required, the server 302 creates a challenge packet at block 408.

As previously described, the challenge packet (212, FIG. 2) includes at least a challenge 316 selected from the challenge library 314 and a challenge answer 318 associated with the selected challenge 316. The challenge answer 318 is encrypted with a private certificate 324 associated with the server 302 so that the challenge answer 318 cannot be decrypted by any other party.

The challenge packet 212 may also include other challenge parameters, such as one or more of: a challenge identifier (e.g. a sequence identifier as described above); a time stamp; a server identifier; etc.

A time stamp may be included and challenge responses may be required to be completed within a certain time period. This can prevent a hacker from having sufficient time to manually figure out an answer to a challenge and resubmit the challenge to gain access to the server. The time period must be defined according to the practical properties of the system so that premature time-outs do not cause unwarranted rejections. Sufficient time should be given to allow for message transmission and user interaction without providing enough time for a hacker to manipulate the system.

A server identifier that uniquely identifies the server may be included in the challenge packet. When verifying a challenge response, the server 302 may use the server identifier to verify that the challenge originated with the server. The server identifier, as previously noted, is encrypted and therefore cannot be tampered with. This prevents a hacker from presenting a challenge response that did not originate with the server so that the hacker can present an illegitimate challenge response to gain access to the server 302.

After creating the challenge packet, the server 302 transmits the challenge packet to the client at block 410.

If the incoming message includes a response to a previous challenge (“Yes” branch, block 404), then the cryptographic module 320 of the server 302 decrypts any encrypted portion(s) of the message at block 412 and attempts to validate the challenge and a sender (client) response to the challenge at block 414.

The validation may include from one to several steps. In its simplest form, the validation consists merely of comparing the challenge answer 318 with a sender answer received from the client. As previously discussed, a sequence number, a time stamp, a server identifier, and/or other validation means may be used to validate the challenge and answer.

If the response is a valid response (i.e., the challenge is valid and originated with the server, the response was returned within the predefined time period, etc.) (“Yes” branch, block 414), then the IM message is processed as requested at block 416. If the response includes an invalid challenge or an incorrect answer to the challenge, the IM message is rejected (block 418).
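The validation at blocks 412-414 can be sketched in a self-contained form as follows. The keyed HMAC token stands in for the certificate-encrypted answer, and the names and time window are assumptions for the sketch:

```python
import hashlib
import hmac
import time

SERVER_KEY = b"example-private-key"  # stands in for private certificate 324
SERVER_ID = "server-302"             # illustrative unique server identifier
MAX_AGE = 120  # seconds; tuned to transmission and user-interaction delays

def expected_token(answer, challenge_id, timestamp):
    """Recompute the opaque token for a claimed answer and challenge."""
    msg = f"{answer}|{challenge_id}|{timestamp}|{SERVER_ID}".encode()
    return hmac.new(SERVER_KEY, msg, hashlib.sha256).hexdigest()

def validate_response(response, now=None):
    """A response echoes the challenge_id, timestamp, and token from the
    challenge packet plus the sender's answer; no server state is needed."""
    now = time.time() if now is None else now
    if now - response["timestamp"] > MAX_AGE:
        return False  # outside the predefined time period
    return hmac.compare_digest(
        response["token"],
        expected_token(response["answer"], response["challenge_id"],
                       response["timestamp"]),
    )
```

A correct answer reproduces the token; a wrong answer, a forged token, or a stale time stamp fails, all without the server having stored anything about the outstanding challenge.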

Exemplary Operating Environment

FIG. 5 and the following discussion are intended to provide a brief, general description of a suitable computing environment in which all or a portion of the stateless human detection system may be implemented. The operating environment of FIG. 5 is only one example of a suitable operating environment and is not intended to suggest any limitation as to the scope of use or functionality of the operating environment. Other well-known computing systems, environments, and/or configurations that may be suitable for use with the system described herein include, but are not limited to, personal computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, programmable consumer electronics, network personal computers, server computers, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like.

Although not required, the stateless human detection system will be described in the general context of computer-executable instructions, such as program modules, being executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types. Typically, the functionality of the program modules may be combined or distributed as desired in various environments. In a distributed environment, program modules may be located in both local and remote computer-storage media, including memory-storage devices.

With reference to FIG. 5, an exemplary system for implementing the stateless human detection system includes a computing device, such as computing device 500. In its most basic configuration, computing device 500 typically includes at least one processing unit 502 and memory 504. Depending on the exact configuration and type of computing device, memory 504 may be volatile (such as RAM), non-volatile (such as ROM, flash memory, etc.), or some combination of the two. This most basic configuration is illustrated in FIG. 5 by dashed line 506. Additionally, device 500 may have additional features and/or functionality. For example, device 500 may also include additional storage (e.g., removable and/or non-removable) including, but not limited to, magnetic or optical disks or tape. Such additional storage is illustrated in FIG. 5 by removable storage 508 and non-removable storage 510. Computer storage media includes volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data. Memory 504, removable storage 508, and non-removable storage 510 are all examples of computer storage media. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVDs) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by device 500. Any such computer storage media may be part of device 500.

Device 500 may also contain communication connection(s) 512 that allow the device 500 to communicate with other computing devices, such as other nodes within a computing system network 511. Communications connection(s) 512 is an example of communication media. Communication media typically embodies computer readable instructions, data structures or program modules. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency, infrared, and other wireless media.

Device 500 may also have input device(s) 514 such as keyboard, mouse, pen, voice input device, touch input device, laser range finder, infra-red cameras, video input devices, and/or any other input device. Output device(s) 516 such as display, speakers, printer, and/or any other output device may also be included.

In the description that follows, the present invention is described with reference to acts and symbolic representations of operations that are performed by one or more computing devices, unless indicated otherwise. As such, it will be understood that such acts and operations, which are at times referred to as being computer-executed, include the manipulation by the processing unit of the computing device of electrical signals representing data in a structured form. This manipulation transforms the data or maintains them at locations in the memory system of the computing device, which reconfigures or otherwise alters the operation of the device in a manner well understood by those skilled in the art. While the description proceeds in the foregoing context, it is not meant to be limiting, as those of skill in the art will appreciate that various of the acts and operations described hereinafter may also be implemented in hardware. For example, by utilizing conventional techniques known to those skilled in the art, all or a portion of the software instructions may be carried out by a dedicated circuit, such as a DSP, programmable logic array, or the like.

The data structures where data are maintained are physical locations of the memory that have particular properties defined by the format of the data. Those skilled in the art will realize that storage devices utilized to store program instructions can be distributed across a network. For example, a remote computer may store an example of the process described as software. A local or terminal computer may access the remote computer and download a part or all of the software to run the program. Alternatively, the local computer may download pieces of the software as needed, or distributively process by executing some software instructions at the local terminal and some at the remote computer (or computer network).

CONCLUSION

While exemplary implementations and applications of the stateless human detection techniques have been illustrated and described, it is to be understood that the invention is not limited to the precise configuration and resources described above. Various modifications, changes, and variations apparent to those skilled in the art may be made in the arrangement, operation, and details of the methods and systems of the present invention disclosed herein without departing from the scope of the invention, as both described above and claimed below.

Claims (20)

We claim:
1. A method performed on a computing device, the method comprising:
receiving a message from a source;
determining if the received message is a response to a previously transmitted challenge message;
if the received message is the response, then processing the received message; and
if the received message is not the response, then:
creating, by the computing device based on the received message, a challenge message comprising an unencrypted portion that can be accessed by the source and that includes a challenge directed to the source, and that further comprises an encrypted portion that cannot be accessed by the source and that includes a correct answer to the challenge and an identifier of the computing device, and
transmitting the created challenge message to the source.
2. The method of claim 1 further comprising decrypting, if the received message is the response, an encrypted portion of the response that matches an encrypted portion of the previously transmitted challenge message.
3. The method of claim 2 where the determining that the response is a valid response comprises determining that a first identifier of the decrypted portion identifies the computing device.
4. The method of claim 2 where the determining that the response is a valid response comprises determining that a second identifier of the decrypted portion identifies the previously transmitted challenge message.
5. The method of claim 2 where the determining that the response is a valid response comprises determining that an answer indicated by the response matches a correct answer indicated by the decrypted portion.
6. The method of claim 1 where the creating is further based on the source being an unknown source to the computing device.
7. The method of claim 1 further comprising rejecting, if the received message is the response, the received message in response to determining that the received response is not a valid response to the previously transmitted challenge message.
8. At least one computer storage device storing computer-executable instructions that, when executed by a computing device, cause the computing device to perform actions comprising:
receiving a message from a source;
determining if the received message is a response to a previously transmitted challenge message;
if the received message is the response, then processing the received message; and
if the received message is not the response, then:
creating, by the computing device based on the received message, a challenge message comprising an unencrypted portion that can be accessed by the source and that includes a challenge directed to the source, and that further comprises an encrypted portion that cannot be accessed by the source and that includes a correct answer to the challenge and an identifier of the computing device, and
transmitting the created challenge message to the source.
9. The at least one computer storage device of claim 8, the actions further comprising decrypting, if the received message is the response, an encrypted portion of the response that matches an encrypted portion of the previously transmitted challenge message.
10. The at least one computer storage device of claim 9 where the determining that the response is a valid response comprises determining that a first identifier of the decrypted portion identifies the computing device.
11. The at least one computer storage device of claim 9 where the determining that the response is a valid response comprises determining that a second identifier of the decrypted portion identifies the previously transmitted challenge message.
12. The at least one computer storage device of claim 9 where the determining that the response is a valid response comprises determining that an answer indicated by the response matches a correct answer indicated by the decrypted portion.
13. The at least one computer storage device of claim 8 where the creating is further based on the source being an unknown source to the computing device.
14. The at least one computer storage device of claim 8, the actions further comprising rejecting, if the received message is the response, the message in response to determining that the received response is not a valid response to the previously transmitted challenge message.
15. A system comprising a computing device, a memory, and at least one program module that are together configured for performing actions comprising:
receiving a message from a source;
determining if the received message is a response to a previously transmitted challenge message;
if the received message is the response, then processing the received message; and
if the received message is not the response, then:
creating, by the computing device based on the received message, a challenge message comprising an unencrypted portion that can be accessed by the source and that includes a challenge directed to the source, and that further comprises an encrypted portion that cannot be accessed by the source and that includes a correct answer to the challenge and an identifier of the computing device, and
transmitting the created challenge message to the source.
16. The system of claim 15, the actions further comprising decrypting, if the received message is the response, an encrypted portion of the response that matches an encrypted portion of the previously transmitted challenge message.
17. The system of claim 16 where the determining that the response is a valid response comprises determining that a first identifier of the decrypted portion identifies the computing device.
18. The system of claim 16 where the determining that the response is valid comprises determining that a second identifier of the decrypted portion identifies the transmitted challenge message.
19. The system of claim 16 where the determining that the received response is a valid response comprises determining that an answer indicated by the response matches a correct answer indicated by the decrypted portion.
20. The system of claim 15 where the creating is further based on the source being an unknown source to the computing device.
US13/589,743 2006-01-31 2012-08-20 Stateless human detection for real-time messaging systems Active US8832437B2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US11/275,854 US8261071B2 (en) 2006-01-31 2006-01-31 Stateless human detection for real-time messaging systems
US13/589,743 US8832437B2 (en) 2006-01-31 2012-08-20 Stateless human detection for real-time messaging systems

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US13/589,743 US8832437B2 (en) 2006-01-31 2012-08-20 Stateless human detection for real-time messaging systems

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/275,854 Continuation US8261071B2 (en) 2006-01-31 2006-01-31 Stateless human detection for real-time messaging systems

Publications (2)

Publication Number Publication Date
US20120324535A1 US20120324535A1 (en) 2012-12-20
US8832437B2 true US8832437B2 (en) 2014-09-09

Family

ID=38323289

Family Applications (3)

Application Number Title Priority Date Filing Date
US11/275,854 Active 2028-11-26 US8261071B2 (en) 2006-01-31 2006-01-31 Stateless human detection for real-time messaging systems
US13/589,743 Active US8832437B2 (en) 2006-01-31 2012-08-20 Stateless human detection for real-time messaging systems
US13/589,885 Active US8826018B2 (en) 2006-01-31 2012-08-20 Stateless human detection for real-time messaging systems

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US11/275,854 Active 2028-11-26 US8261071B2 (en) 2006-01-31 2006-01-31 Stateless human detection for real-time messaging systems

Family Applications After (1)

Application Number Title Priority Date Filing Date
US13/589,885 Active US8826018B2 (en) 2006-01-31 2012-08-20 Stateless human detection for real-time messaging systems

Country Status (1)

Country Link
US (3) US8261071B2 (en)

Families Citing this family (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9037516B2 (en) 2006-03-17 2015-05-19 Fatdoor, Inc. Direct mailing in a geo-spatial environment
US8965409B2 (en) 2006-03-17 2015-02-24 Fatdoor, Inc. User-generated community publication in an online neighborhood social network
US9064288B2 (en) 2006-03-17 2015-06-23 Fatdoor, Inc. Government structures and neighborhood leads in a geo-spatial environment
US9071367B2 (en) 2006-03-17 2015-06-30 Fatdoor, Inc. Emergency including crime broadcast in a neighborhood social network
US9459622B2 (en) 2007-01-12 2016-10-04 Legalforce, Inc. Driverless vehicle commerce network and community
US9373149B2 (en) 2006-03-17 2016-06-21 Fatdoor, Inc. Autonomous neighborhood vehicle commerce network and community
US9070101B2 (en) 2007-01-12 2015-06-30 Fatdoor, Inc. Peer-to-peer neighborhood delivery multi-copter and method
US9002754B2 (en) 2006-03-17 2015-04-07 Fatdoor, Inc. Campaign in a geo-spatial environment
US8036902B1 (en) * 2006-06-21 2011-10-11 Tellme Networks, Inc. Audio human verification
US20080189292A1 (en) * 2007-02-02 2008-08-07 Jed Stremel System and method for automatic population of a contact file with contact content and expression content
US8296373B2 (en) 2007-02-02 2012-10-23 Facebook, Inc. Automatically managing objectionable behavior in a web-based social network
US8549651B2 (en) * 2007-02-02 2013-10-01 Facebook, Inc. Determining a trust level in a social network environment
US9098545B2 (en) 2007-07-10 2015-08-04 Raj Abhyanker Hot news neighborhood banter in a geo-spatial social network
US8495727B2 (en) * 2007-08-07 2013-07-23 Microsoft Corporation Spam reduction in real time communications by human interaction proof
US20090119475A1 (en) * 2007-11-01 2009-05-07 Microsoft Corporation Time based priority modulus for security challenges
US7512978B1 (en) * 2008-02-24 2009-03-31 International Business Machines Corporation Human-read-only configured e-mail
US20090235178A1 (en) * 2008-03-12 2009-09-17 International Business Machines Corporation Method, system, and computer program for performing verification of a user
US20090248799A1 (en) * 2008-03-31 2009-10-01 Telefonaktiebolaget Lm Ericsson (Publ) Method and server for user identifier update
US20090292924A1 (en) * 2008-05-23 2009-11-26 Johnson Erik J Mechanism for detecting human presence using authenticated input activity
US8132255B2 (en) * 2008-06-16 2012-03-06 Intel Corporation Generating a challenge response image including a recognizable image
WO2010134862A1 (en) * 2009-05-20 2010-11-25 Telefonaktiebolaget L M Ericsson (Publ) Challenging a first terminal intending to communicate with a second terminal
US8365260B2 (en) * 2009-09-25 2013-01-29 International Business Machines Corporation Multi-variable challenge and response for content security
US8966254B2 (en) * 2010-10-11 2015-02-24 International Business Machines Corporation Keyless challenge and response system
WO2013022839A1 (en) * 2011-08-05 2013-02-14 M-Qube, Inc. Method and system for verification of human presence at a mobile device
US20130263230A1 (en) * 2012-03-30 2013-10-03 Anchorfree Inc. Method and system for statistical access control with data aggregation
CN103684981B (en) * 2012-09-21 2017-12-01 腾讯科技(深圳)有限公司 Instant interactive communication method, system and server
US9465927B2 (en) * 2012-10-02 2016-10-11 Disney Enterprises, Inc. Validating input by detecting and recognizing human presence
KR101764197B1 (en) 2013-06-27 2017-08-02 인텔 코포레이션 Continuous multi-factor authentication
US9439367B2 (en) 2014-02-07 2016-09-13 Arthi Abhyanker Network enabled gardening with a remotely controllable positioning extension
US9457901B2 (en) 2014-04-22 2016-10-04 Fatdoor, Inc. Quadcopter with a printable payload extension system and method
US9004396B1 (en) 2014-04-24 2015-04-14 Fatdoor, Inc. Skyteboard quadcopter and method
US9022324B1 (en) 2014-05-05 2015-05-05 Fatdoor, Inc. Coordination of aerial vehicles through a central server
US9441981B2 (en) 2014-06-20 2016-09-13 Fatdoor, Inc. Variable bus stops across a bus route in a regional transportation network
US9971985B2 (en) 2014-06-20 2018-05-15 Raj Abhyanker Train based community
US9451020B2 (en) 2014-07-18 2016-09-20 Legalforce, Inc. Distributed communication of independent autonomous vehicles to provide redundancy and performance
US9825928B2 (en) * 2014-10-22 2017-11-21 Radware, Ltd. Techniques for optimizing authentication challenges for detection of malicious attacks
US9460288B2 (en) 2014-12-08 2016-10-04 Shape Security, Inc. Secure app update server and secure application programming interface (“API”) server
US10073964B2 (en) 2015-09-25 2018-09-11 Intel Corporation Secure authentication protocol systems and methods
CN105871793A (en) * 2015-11-06 2016-08-17 乐视移动智能信息技术(北京)有限公司 Resource sharing method and device
US10345818B2 (en) 2017-05-12 2019-07-09 Autonomy Squared Llc Robot transport method with transportation container

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5636280A (en) * 1994-10-31 1997-06-03 Kelly; Tadhg Dual key reflexive encryption security system
US6377691B1 (en) * 1996-12-09 2002-04-23 Microsoft Corporation Challenge-response authentication and key exchange for a connectionless security protocol
US6546416B1 (en) 1998-12-09 2003-04-08 Infoseek Corporation Method and system for selectively blocking delivery of bulk electronic mail
US20030093480A1 (en) 2001-11-15 2003-05-15 International Business Machines Corporation Accessing information using an instant messaging system
US20040139152A1 (en) 2003-01-10 2004-07-15 Kaler Christopher G. Performing generic challenges in a distributed system
US7072865B2 (en) 2000-06-30 2006-07-04 Kabushiki Kaisha Toshiba Broadcast receiving method and apparatus and information distributing method and apparatus
US7100054B2 (en) * 2001-08-09 2006-08-29 American Power Conversion Computer network security system
US20070033102A1 (en) * 2005-03-29 2007-02-08 Microsoft Corporation Securely providing advertising subsidized computer usage
US7373509B2 (en) * 2003-12-31 2008-05-13 Intel Corporation Multi-authentication for a computing device connecting to a network


Non-Patent Citations (9)

* Cited by examiner, † Cited by third party
Title
"Captcha", Computer Desktop Encyclopedia, Carnegie Mellon School of Computer Science, 2003.
Aura, Tuomas, and Pekka Nikander. "Stateless Connections", Lecture Notes in Computer Science 1334 (1997): 87-97. Print.
Giang, "Linked List", Programming and Data Structures, Aug. 4, 2002. <http://isg.cs.tcd.ie/giangtlLinked—lists.pdf>.
Graham, Paul. "A Plan for Spam." Paul Graham, Aug. 2002. <http://www.paulgraham.com/spam.html>.
Liu, Z., Lin, W., Li, N., & Lee, D. (Nov. 2005). "Detecting and filtering instant messaging spam-a global and personalized approach." In Secure Network Protocols, 2005 (NPSec), 1st IEEE ICNP Workshop on (pp. 19-24). IEEE. *
Martz, David. "Spam filtering techniques", Developer Works, Sep. 1, 2002. <http://www.ibm.com/developerworks/linux/library/lspamf.html>.
Menezes, Alfred J., Paul C. Van Oorschot, and Scott A. Vanstone. "Handbook of Applied Cryptography", CRC Press, 1996. Print.
Ulanoff, "Will Challenge/Response Save Us from Spam?" Jul. 28, 2003.
Wagner, "How Spammers Will Beat Challenge-Response Systems, and Other Conversations About Spam (Wagner's Weblog)", InformationWeek, Jun. 13, 2003.

Also Published As

Publication number Publication date
US20120324224A1 (en) 2012-12-20
US20120324535A1 (en) 2012-12-20
US8261071B2 (en) 2012-09-04
US20070179905A1 (en) 2007-08-02
US8826018B2 (en) 2014-09-02


Legal Events

Date Code Title Description
STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034544/0541

Effective date: 20141014

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551)

Year of fee payment: 4