US20130275282A1 - Anonymous billing - Google Patents


Info

Publication number
US20130275282A1
Authority
US
Grant status
Application
Prior art keywords
user
provider
service
relying party
billing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13652478
Inventor
Ronald John Kamiel Euphrasia Bjones
Kim Cameron
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Corp

Links

Images

Classifications

    • H04L 63/0884 Network architectures or network communication protocols for network security for supporting authentication of entities communicating through a packet data network by delegation of authentication, e.g. a proxy authenticates an entity to be authenticated on behalf of this entity vis-à-vis an authentication entity
    • G06F 21/6263 Protecting personal data, e.g. for financial or medical purposes, during internet communication, e.g. revealing personal data from cookies
    • G06Q 30/04 Billing or invoicing, e.g. tax processing in connection with a sale
    • H04L 63/0407 Network architectures or network communication protocols for network security for providing a confidential data exchange among entities communicating through data packet networks wherein the identity of one or more communicating identities is hidden
    • H04L 63/0807 Network architectures or network communication protocols for network security for supporting authentication of entities communicating through a packet data network using tickets, e.g. Kerberos
    • H04L 63/0281 Proxies (network security for separating internal from external traffic, e.g. firewalls)
    • H04L 63/08 Network architectures or network communication protocols for network security for supporting authentication of entities communicating through a packet data network
    • H04L 63/10 Network architectures or network communication protocols for network security for controlling access to network resources

Abstract

Aspects of the subject matter described herein relate to billing for transactions involving a claims provider. In aspects, in conjunction with presenting a claim to a relying party, billing information is provided to a billing service. The billing information may include information to identify a claims provider that provided the claim and information that identifies the relying party. The information does not include data that can be used to determine the natural identity of a user that presented the claim. In response, a count is updated that can be used for billing. The count is not usable to determine the natural identities of users that presented claims to the relying party.

Description

    CROSS REFERENCE TO RELATED APPLICATION
  • This application claims the benefit of U.S. Provisional Application No. 61/625,641, filed Apr. 17, 2012, entitled IDENTITY, which application is incorporated herein by reference in its entirety.
  • BACKGROUND
  • For many individuals, there is a great concern that their activities with entities on the Web may be tracked and linked to them. With sufficient identifying information, a criminal entity may be able to fake an identity and use it in harmful ways. Companies have tried to address this issue by developing various secure systems. Unfortunately, such systems are often too cumbersome or non-intuitive for users. Furthermore, such systems may allow a company to track activities of individuals on the Web. This leads to mistrust and poor adoption of such systems.
  • The subject matter claimed herein is not limited to embodiments that solve any disadvantages or that operate only in environments such as those described above. Rather, this background is only provided to illustrate one exemplary technology area where some embodiments described herein may be practiced.
  • SUMMARY
  • Briefly, aspects of the subject matter described herein relate to billing for transactions involving a claims provider. In aspects, in conjunction with presenting a claim to a relying party, billing information is provided to a billing service. The billing information may include information to identify a claims provider that provided the claim and information that identifies the relying party. The information does not include data that can be used to determine the natural identity of a user that presented the claim. In response, a count is updated that can be used for billing. The count is not usable to determine the natural identities of users that presented claims to the relying party.
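The counting mechanism the Summary describes can be sketched as follows. This is an illustrative sketch only, not code from the patent; the class and method names (`BillingService`, `record_presentation`) and the identifier strings are invented for the example.

```python
from collections import Counter

class BillingService:
    """Tallies claim presentations per (claims provider, relying party) pair.

    Each billing record deliberately carries only the two service
    identifiers and nothing about the user, so the resulting counts
    cannot be traced back to any individual who presented a claim.
    """

    def __init__(self) -> None:
        self._counts: Counter = Counter()

    def record_presentation(self, claims_provider_id: str,
                            relying_party_id: str) -> None:
        # Only the provider and relying-party identifiers are stored.
        self._counts[(claims_provider_id, relying_party_id)] += 1

    def transactions(self, claims_provider_id: str,
                     relying_party_id: str) -> int:
        # The count that billing is based on.
        return self._counts[(claims_provider_id, relying_party_id)]

billing = BillingService()
billing.record_presentation("provider-A", "shop.example")
billing.record_presentation("provider-A", "shop.example")
print(billing.transactions("provider-A", "shop.example"))  # 2
```

The relying party can be invoiced from these per-pair counts even though no record exists of which users generated them.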
  • This Summary is provided to briefly identify some aspects of the subject matter that is further described below in the Detailed Description. This Summary is not intended to identify key or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
  • The phrase “subject matter described herein” refers to subject matter described in the Detailed Description unless the context clearly indicates otherwise. The term “aspects” should be read as “at least one aspect.” Identifying aspects of the subject matter described in the Detailed Description is not intended to identify key or essential features of the claimed subject matter.
  • The aspects described above and other aspects of the subject matter described herein are illustrated by way of example and not limitation in the accompanying figures, in which like reference numerals indicate similar elements and in which:
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram representing an exemplary general-purpose computing environment into which aspects of the subject matter described herein may be incorporated;
  • FIGS. 2-6 are block diagrams that represent exemplary environments in which aspects of the subject matter described herein may operate;
  • FIGS. 7-8 are timing diagrams in accordance with aspects of the subject matter described herein;
  • FIG. 9 is a flow diagram that generally represents exemplary actions that may occur in collecting and using billing information in accordance with aspects of the subject matter described herein; and
  • FIG. 10 is a flow diagram that generally represents exemplary actions that may occur in providing billing information to a billing service in accordance with aspects of the subject matter described herein.
  • DETAILED DESCRIPTION
  • Definitions
  • The phrase “subject matter described herein” refers to subject matter described in the Detailed Description unless the context clearly indicates otherwise. The term “aspects” should be read as “at least one aspect.” Identifying aspects of the subject matter described in the Detailed Description is not intended to identify key or essential features of the claimed subject matter.
  • As used herein, the term “includes” and its variants are to be read as open-ended terms that mean “includes, but is not limited to.” The term “or” is to be read as “and/or” unless the context clearly dictates otherwise. The term “based on” is to be read as “based at least in part on.” The terms “one embodiment” and “an embodiment” are to be read as “at least one embodiment.” The term “another embodiment” is to be read as “at least one other embodiment.”
  • As used herein, terms such as “a,” “an,” and “the” are inclusive of one or more of the indicated item or action. In particular, in the claims a reference to an item generally means at least one such item is present and a reference to an action means at least one instance of the action is performed.
  • Sometimes herein the terms “first”, “second”, “third” and so forth may be used. Without additional context, the use of these terms in the claims is not intended to imply an ordering but is rather used for identification purposes. For example, the phrases “first version” and “second version” do not necessarily mean that the first version is the very first version or was created before the second version or even that the first version is requested or operated on before the second version. Rather, these phrases are used to identify different versions.
  • Headings are for convenience only; information on a given topic may be found outside the section whose heading indicates that topic.
  • Other definitions, explicit and implicit, may be included below.
  • Exemplary Operating Environment
  • FIG. 1 illustrates an example of a suitable computing system environment 100 on which aspects of the subject matter described herein may be implemented. The computing system environment 100 is only one example of a suitable computing environment and is not intended to suggest any limitation as to the scope of use or functionality of aspects of the subject matter described herein. Neither should the computing environment 100 be interpreted as having any dependency or requirement relating to any one or combination of components illustrated in the exemplary operating environment 100.
  • Aspects of the subject matter described herein are operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well-known computing systems, environments, or configurations that may be suitable for use with aspects of the subject matter described herein include personal computers, server computers (whether on bare metal or as virtual machines), hand-held or laptop devices, multiprocessor systems, microcontroller-based systems, set-top boxes, programmable and non-programmable consumer electronics, network PCs, minicomputers, mainframe computers, personal digital assistants (PDAs), gaming devices, printers, appliances including set-top, media center, or other appliances, automobile-embedded or attached computing devices, other mobile devices, phone devices including cell phones, wireless phones, and wired phones, distributed computing environments that include any of the above systems or devices, and the like.
  • Aspects of the subject matter described herein may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, and so forth, which perform particular tasks or implement particular abstract data types. Aspects of the subject matter described herein may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
  • Alternatively, or in addition, the functionality described herein may be performed, at least in part, by one or more hardware logic components. For example, and without limitation, illustrative types of hardware logic components that can be used include Field-Programmable Gate Arrays (FPGAs), Application-Specific Integrated Circuits (ASICs), Application-Specific Standard Products (ASSPs), System-on-a-Chip systems (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
  • With reference to FIG. 1, an exemplary system for implementing aspects of the subject matter described herein includes a general-purpose computing device in the form of a computer 110. A computer may include any electronic device that is capable of executing an instruction. Components of the computer 110 may include a processing unit 120, a system memory 130, and a system bus 121 that couples various system components including the system memory to the processing unit 120. The system bus 121 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, Peripheral Component Interconnect (PCI) bus also known as Mezzanine bus, Peripheral Component Interconnect Extended (PCI-X) bus, Advanced Graphics Port (AGP), and PCI express (PCIe).
  • The processing unit 120 may be connected to a hardware security device 122. The security device 122 may store and be able to generate cryptographic keys that may be used to secure various aspects of the computer 110. In one embodiment, the security device 122 may comprise a Trusted Platform Module (TPM) chip, TPM Security Device, or the like.
  • The computer 110 typically includes a variety of computer-readable media. Computer-readable media can be any available media that can be accessed by the computer 110 and includes both volatile and nonvolatile media, and removable and non-removable media. By way of example, and not limitation, computer-readable media may comprise computer storage media. Computer-readable media does not include communication media.
  • Computer storage media includes both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data. Computer storage media includes RAM, ROM, EEPROM, solid state storage, flash memory or other memory technology, CD-ROM, digital versatile discs (DVDs) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the computer 110. Computer storage media does not include communication media.
  • The system memory 130 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 131 and random access memory (RAM) 132. A basic input/output system 133 (BIOS), containing the basic routines that help to transfer information between elements within computer 110, such as during start-up, is typically stored in ROM 131. RAM 132 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 120. By way of example, and not limitation, FIG. 1 illustrates operating system 134, application programs 135, other program modules 136, and program data 137.
  • The computer 110 may also include other removable/non-removable, volatile/nonvolatile computer storage media. By way of example only, FIG. 1 illustrates a hard disk drive 141 that reads from or writes to non-removable, nonvolatile magnetic media, a magnetic disk drive 151 that reads from or writes to a removable, nonvolatile magnetic disk 152, and an optical disc drive 155 that reads from or writes to a removable, nonvolatile optical disc 156 such as a CD ROM, DVD, or other optical media. Other removable/non-removable, volatile/nonvolatile computer storage media that can be used in the exemplary operating environment include magnetic tape cassettes, flash memory cards and other solid state storage devices, digital versatile discs, other optical discs, digital video tape, solid state RAM, solid state ROM, and the like. The hard disk drive 141 may be connected to the system bus 121 through the interface 140, and magnetic disk drive 151 and optical disc drive 155 may be connected to the system bus 121 by an interface for removable nonvolatile memory such as the interface 150.
  • The drives and their associated computer storage media, discussed above and illustrated in FIG. 1, provide storage of computer-readable instructions, data structures, program modules, and other data for the computer 110. In FIG. 1, for example, hard disk drive 141 is illustrated as storing operating system 144, application programs 145, other program modules 146, and program data 147. Note that these components can either be the same as or different from operating system 134, application programs 135, other program modules 136, and program data 137. Operating system 144, application programs 145, other program modules 146, and program data 147 are given different numbers herein to illustrate that, at a minimum, they are different copies.
  • A user may enter commands and information into the computer 110 through input devices such as a keyboard 162 and pointing device 161, commonly referred to as a mouse, trackball, or touch pad. Other input devices (not shown) may include a microphone (e.g., for inputting voice or other audio), joystick, game pad, satellite dish, scanner, a touch-sensitive screen, a writing tablet, a camera (e.g., for inputting gestures or other visual input), or the like. These and other input devices are often connected to the processing unit 120 through a user input interface 160 that is coupled to the system bus, but may be connected by other interface and bus structures, such as a parallel port, game port or a universal serial bus (USB).
  • Through the use of one or more of the above-identified input devices a Natural User Interface (NUI) may be established. A NUI may rely on speech recognition, touch and stylus recognition, gesture recognition both on screen and adjacent to the screen, air gestures, head and eye tracking, voice and speech, vision, touch, gestures, machine intelligence, and the like. Some exemplary NUI technologies that may be employed to interact with a user include touch-sensitive displays, voice and speech recognition, intention and goal understanding, motion gesture detection using depth cameras (such as stereoscopic camera systems, infrared camera systems, RGB camera systems, and combinations thereof), motion gesture detection using accelerometers/gyroscopes, facial recognition, 3D displays, head, eye, and gaze tracking, immersive augmented reality and virtual reality systems, as well as technologies for sensing brain activity using electric field sensing electrodes (EEG and related methods).
  • A monitor 191 or other type of display device is also connected to the system bus 121 via an interface, such as a video interface 190. In addition to the monitor, computers may also include other peripheral output devices such as speakers 197 and printer 196, which may be connected through an output peripheral interface 195.
  • The computer 110 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 180. The remote computer 180 may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 110, although only a memory storage device 181 has been illustrated in FIG. 1. The logical connections depicted in FIG. 1 include a local area network (LAN) 171 and a wide area network (WAN) 173, but may also include phone networks, near field networks, and other networks. Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets, and the Internet.
  • When used in a LAN networking environment, the computer 110 is connected to the LAN 171 through a network interface or adapter 170. When used in a WAN networking environment, the computer 110 may include a modem 172 or other means for establishing communications over the WAN 173, such as the Internet. The modem 172, which may be internal or external, may be connected to the system bus 121 via the user input interface 160 or other appropriate mechanism. In a networked environment, program modules depicted relative to the computer 110, or portions thereof, may be stored in the remote memory storage device. By way of example, and not limitation, FIG. 1 illustrates remote application programs 185 as residing on memory device 181. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers may be used.
  • Anonymous Billing
  • As mentioned previously, privacy on the Web is a sensitive issue. FIGS. 2-6 are block diagrams that represent exemplary environments in which aspects of the subject matter described herein may operate. The entities illustrated in FIGS. 2-6 are exemplary and are not meant to be all-inclusive of entities that may be needed or included. In other embodiments, the entities and/or functions described in conjunction with FIGS. 2-6 may be included in other entities (shown or not shown) or placed in sub entities without departing from the spirit or scope of aspects of the subject matter described herein. In some embodiments, the entities and/or services described in conjunction with FIGS. 2-6 may be distributed across multiple devices.
  • One or more of the entities illustrated in FIGS. 2-6 may be implemented by one or more computing devices. Computing devices may include one or more personal computers, server computers, hand-held or laptop devices, multiprocessor systems, microcontroller-based systems, set-top boxes, programmable and non-programmable consumer electronics, network PCs, minicomputers, mainframe computers, cell phones, personal digital assistants (PDAs), gaming devices, printers, appliances including set-top, media center, or other appliances, automobile-embedded or attached computing devices, other mobile devices, distributed computing environments that include any of the above systems or devices, and the like. An exemplary device that may be configured to act as one or more of the entities of the system 205 comprises the computer 110 of FIG. 1.
  • Where a line connects one entity to another or where two entities are found in the same figure, it is to be understood that the two entities may be connected (e.g., logically, physically, virtually, or otherwise) via any type of network including a direct connection, a local network, a non-local network, the Internet, some combination of the above, and the like. For example, a line may represent one or more local area networks, wide area networks, direct connections, virtual connections, private networks, virtual private networks, some combination of the above, and the like.
  • One or more of the entities illustrated in FIGS. 2-6 may be implemented in a virtual environment. A virtual environment is an environment that is simulated or emulated by a computer. The virtual environment may simulate or emulate a physical machine, operating system, set of one or more interfaces, portions of the above, combinations of the above, or the like. When a machine is simulated or emulated, the machine is sometimes called a virtual machine. A virtual machine is a machine that, to software executing on the virtual machine, appears to be a physical machine. The software may save files in a virtual storage device such as virtual hard drive, virtual floppy disk, and the like, may read files from a virtual optical device, may communicate via a virtual network adapter, and so forth.
  • More than one virtual machine may be hosted on a single computer. That is, two or more virtual machines may execute on a single physical computer. To software executing in each virtual environment, the virtual environment appears to have its own resources (e.g., hardware) even though the virtual machines hosted on a single computer may physically share one or more physical devices with each other and with the hosting operating system.
  • Turning to FIG. 2, the system 205 may include a claims provider 210, a user agent service 211, a user device 212, a relying party 213, and other entities (not shown). As used herein, the term entity is to be read to include all or a portion of one or more devices, a service, a collection of one or more software modules or portions thereof, some combination of one or more software modules or portions thereof and one or more devices or portions thereof, and the like.
  • A service may be implemented using one or more processes, threads, components, libraries, and the like that perform a designated task. A service may be implemented in hardware, software, or a combination of hardware and software. A service may be distributed over multiple devices or may be implemented on a single device.
  • A user may seek to access goods, services, data, resources, or other information provided by the relying party 213. The user may be a natural person or persons, a computer, a network, any other entity, or the like.
  • Sometimes herein, the term “services” is used to indicate something that the relying party may provide. A service may include data, resources, or other information that may be provided by the relying party 213.
  • Furthermore, sometimes the term “user” is used to reference the entity that is seeking to access the services of the relying party 213. It is to be understood, however, that the term user may include a natural person or persons, one or more computers, networks, other entities, combinations of two or more of the above, or the like.
  • Before providing access to a service, the relying party 213 may require one or more verified claims regarding the user seeking access to the service. Verified claims may be issued by one or more claims providers that the relying party 213 trusts. A claims provider may authenticate a user before providing a claim. A claims provider may “partially” sign issued claims to provide evidence that the claims provider issued the claims.
  • The term “partially” sign is used to indicate that a claims provider may provide data that allows the user device 212 to complete the signature of the claims. The user device 212 completes the signature by using a function or data passed to the user device 212 by the claims provider. This is done, in part, so that the claims provider cannot later collude with the relying party 213 or any other entity to identify the user for whom the claims are provided. In addition, the function or data passed to the user device 212 may be tied to claims such that it cannot be used to sign other claims.
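The patent does not specify the cryptography behind "partial" signing, but a classic RSA blind signature illustrates the general idea: the claims provider contributes its signing key to a signature it never sees in final form, so it cannot later link the completed signature back to the user. The sketch below is an analogy only, not the patent's actual scheme, and its toy parameters are nowhere near secure.

```python
# Toy RSA blind-signature sketch (NOT the patent's scheme, NOT secure
# parameters -- a classroom illustration of unlinkable signing only).
import math

# Signer's toy RSA key: modulus n = p*q, public exponent e, private d.
p, q = 61, 53
n = p * q                              # 3233
e = 17
d = pow(e, -1, (p - 1) * (q - 1))      # modular inverse of e

m = 42                                 # message (in practice, a hash of the claims)

# User blinds the message with a random factor r coprime to n.
r = 99
assert math.gcd(r, n) == 1
blinded = (m * pow(r, e, n)) % n

# Signer signs the blinded value without ever seeing m.
blind_sig = pow(blinded, d, n)

# User "completes" the signature by stripping the blinding factor:
# blind_sig = m^d * r^(e*d) = m^d * r (mod n), so dividing by r yields m^d.
sig = (blind_sig * pow(r, -1, n)) % n

# Anyone can verify sig against m with the signer's public key, but the
# signer cannot link sig to the blinded value it actually signed.
assert pow(sig, e, n) == m
print("valid:", pow(sig, e, n) == m)   # valid: True
```

The real scheme also ties the completion data to the specific claims, so it cannot be reused to sign anything else; that restriction is not captured by this simple analogy.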
  • In some examples, a relying party may be any resource, privilege, or service that requires one or more valid claims to enter, access, or use. For example, a relying party may include: a computer, a computer network, data, a database, a building, personnel, services, companies, organizations, physical locations, electronic devices, or any other type of resource.
  • In one embodiment, a claim is an assertion of the truth of something. For example, a claim may convey an identifier (e.g., student number, logon name, other identifier, or the like). A claim may assert that the user knows a given key (e.g., a response to a challenge, a password, or the like). Claims may convey personally identifying information such as name, address, date of birth, citizenship, and the like. A claim may assert that a user is part of a certain group. For example, a claim may indicate that a user is over 18 years old. As another example, a claim may indicate that a user is part of a group with specific permissions.
  • Some familiar examples of types of claims include: first name, last name, email address, street address, locality name or city, state or province, postal code, country, telephone number, social security number, date of birth, gender, personal identifier number, credit score, financial status, legal status, and the like. It will be understood, however, that the above types of claims are exemplary only and that those skilled in the art may utilize other claims without departing from the spirit or scope of aspects of the subject matter described herein.
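A claim as described above can be modeled as a small record. The field names below are illustrative only; the patent does not define a schema.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Claim:
    """One assertion about a user, vouched for by a claims provider.

    Hypothetical field names for illustration; not a schema from the patent.
    """
    type: str    # e.g. "age_over_18", "email_address", "postal_code"
    value: str   # the asserted value
    issuer: str  # the claims provider that vouches for the assertion

# A group-membership claim: asserts a property without revealing identity.
over_18 = Claim(type="age_over_18", value="true", issuer="gov-id-provider")
print(over_18.type, over_18.value)  # age_over_18 true
```

Note that a claim like `age_over_18` reveals only the asserted property, not the user's actual date of birth.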
  • Sometimes, a claims provider may be referred to as an identity provider in the sense that the claims provider may provide something that identifies a characteristic of the user. In some implementations, one or more claims providers may reside on a network or in the cloud. The cloud is a term that is often used as a metaphor for the Internet. It draws on the idea that computation, software, data access, storage, and other resources may be provided by entities connected to the Internet without requiring users to know the location or other details about the computing infrastructure that delivers those resources.
  • In some implementations, one or more claims providers may reside on the user device 212. For example, the user device 212 may host a health claims provider, a biometric claims provider, other claims providers, and the like. In some implementations, one or more claims providers may be part of a trust framework that a relying party trusts.
  • To help protect against deception, the claims provider 210 may obtain data from the user, the user device 212, and/or a physical device of the user. For example, the claims provider 210 may ask one or more challenge questions to which the user must respond, receive a PIN, password, or other user-known data from the user, obtain, with consent, biometric data (e.g., fingerprint, retina, DNA, or other biometric data), receive a code from a portable item (e.g., a USB key, smart card, or the like), obtain other credentials, a combination of two or more of the above, and the like.
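  • The challenge-question factor above can be sketched as a conventional challenge-response exchange, in which the user proves knowledge of a secret without transmitting it. This is an illustrative sketch only (the specification does not prescribe a particular scheme); an HMAC over a fresh nonce stands in for whatever proof the claims provider actually requires:

```python
# Hedged sketch of one authentication factor: challenge-response with a
# shared secret. All names are illustrative.
import hashlib
import hmac
import os

def issue_challenge() -> bytes:
    # Claims provider sends a fresh random nonce.
    return os.urandom(16)

def respond(secret: bytes, challenge: bytes) -> bytes:
    # User device computes a keyed digest over the challenge; the secret
    # itself never crosses the wire.
    return hmac.new(secret, challenge, hashlib.sha256).digest()

def verify(secret: bytes, challenge: bytes, response: bytes) -> bool:
    expected = hmac.new(secret, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

shared_secret = b"user-known-secret"   # e.g. derived from a PIN or password
challenge = issue_challenge()
response = respond(shared_secret, challenge)
```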
  • If the claims provider 210 is satisfied that the entity requesting a claim is the user, the claims provider 210 may provide the user with data that has been signed by the claims provider 210 in such a way as to make it difficult or infeasible to change or forge the data. This data may include or be linked to one or more claims. The user device 212 may present the data to the relying party 213 to provide one or more claims. If the data includes more claims than are required by the relying party 213, the user may present just the required claims while still presenting the signature or other data that indicates that the data has not been tampered with.
  • A user may not want the claims provider 210 to know the relying parties with which the user is interacting. The user may also not want the relying party to know any more information about the user than is necessary for interacting with the relying party. To avoid these undesirable events, privacy boundaries may be erected. A privacy boundary ensures that certain data is not transmitted across the boundary. For example, a privacy boundary may be erected between the user and the relying party 213 and another privacy boundary may be erected between the user and the claims provider 210. While the user device 212 may have access to all data included in both boundaries, data inside one privacy boundary may not be allowed to pass to an entity outside the privacy boundary without user consent.
  • To prevent the claims provider 210, the relying party 213, or both from tracking the interactions of the user or otherwise gathering data about the user, certain actions may be taken. For example, in one implementation, when a user device 212 seeks to access a service or resource of a relying party, the following actions may occur:
  • 1. The user device 212 may contact the relying party 213. For example, if the user device 212 is hosting a Web browser, a user may type an address of a relying party in an address bar of the Web browser. In response, the Web browser may send a request to the relying party 213.
  • 2. In response to the request, the relying party 213 may redirect the user device 212 and provide a document 221 (e.g., an HTML, XML, or other document). The document 221 may include a reference 226 to the user agent service 211. The document 221 may also include a set of claims required by the relying party 213 in order for the relying party 213 to provide the requested information. The document 221 may include a list of claims providers 210 and/or a trust framework that the relying party 213 trusts.
  • 3. In one embodiment, the user device 212 may then use the document 221 to request code from the user agent service 211. Sometimes the user agent service 211 may be referred to as a policy service. The code that is downloaded from the user agent service 211 is sometimes referred to as the downloaded user agent 215.
  • In another embodiment, the code is not requested from the user agent service 211. Instead, the code is already on the user device 212 having been previously downloaded. In this embodiment, the relying party 213 may redirect the user to the user agent executing on the user device 212. The relying party 213 may also provide the document 221 to the user agent already on the user device 212.
  • In yet another embodiment, the user agent may execute as a service in the cloud. In this embodiment, the document from the relying party 213 may be provided to the user agent which may then drive obtaining claims to satisfy the relying party 213. In this embodiment, the downloaded user agent 215 refers to code that runs in the cloud. The code may have been downloaded to the cloud by the user, the user's company, or another entity that the user trusts.
  • 4. In response to the request, the user agent service 211 may provide code to the user device 212. The code may implement a protocol the user device 212 may follow to obtain signed claims from the claims provider 210 and provide these claims to the relying party 213 without revealing data that may be used to track the user device 212. The code may be executed on the user device 212 and is sometimes referred to herein as the downloaded user agent 215. In another embodiment, the code may be executed in the cloud and the code is not downloaded to the user device 212.
  • 5. The downloaded user agent 215 may evaluate the document 221 to determine the claims required by the relying party 213. The downloaded user agent 215 may also evaluate the document 221 to determine what claims providers and/or trust framework the relying party 213 trusts.
  • 6. If the claims are already stored on the user device 212 (e.g., in a cache) or on a separate device (e.g., a smart card) and the claims are such that they may be used to respond to the relying party 213, the stored claims may be presented to the relying party. Note, however, that even though the claims may be stored on the user device or an attached storage device, they may be restricted such that they are not allowed to be provided to the relying party 213. For example, the claims may be restricted by time, by number of disclosures to relying parties, to certain relying parties only, or the like. Claims may be stored in the form of tokens where a token may include one or more claims and an electronic signature of a claims provider.
  • 7. In one embodiment, if any claims required by the relying party 213 are missing from the user device 212, the downloaded user agent 215 may redirect the user to a claims provider to obtain a new set of claims from the claims provider 210. In another embodiment, for any claims that are not already stored on the user device 212 or that are stored but are not usable to respond to the relying party 213, the downloaded user agent 215 may then interact with the claims provider 210 to obtain signed claims that may be presented to a token provider to obtain a token to provide to the relying party 213 to obtain the service. Obtaining the signed data may involve communicating with a protocol gateway and providing data that may be used to authenticate the user.
  • The protocol gateway may communicate with an access control service that is federated and can provide authentication across multiple companies. The access control service may communicate with one or more identity providers to obtain claims requested by the user device 212. After receiving claims, the access control service may return the claims to the protocol gateway or some other entity which may sign and return the claims to the user device 212.
  • Signing the data may be done in such a way that the claims provider 210 does not see the key by which the data is signed. This may be done as indicated previously by providing the user device 212 with a function or other data that allows the user device 212 to complete the signing of the claims with a key of the user device 212. This may be done, for example, to avoid allowing the claims provider 210 to track the interactions of the user either by itself or even working together with the relying party 213. The claims provider 210 may provide the claims to the downloaded user agent 215 via the document 220. The document 220 may also include a reference 225 to the user agent service 211. In browsers, including this link allows the downloaded user agent 215 to store and maintain state obtained from the relying party and the claims provider without invoking security problems (e.g., cross-site scripting).
  • 8. After obtaining signed claims, the downloaded user agent 215 may provide any claims required by the relying party 213 to the relying party 213 together with proof of signature. In one implementation, the downloaded user agent 215 may first communicate with a token issuer to create a token that may be provided to the relying party 213. This token may be created such that the token issuer is not aware of the relying party 213 to which the token will be presented. This token may also be partially signed by the token issuer with a completed signature generated by the user device 212.
  • 9. Upon receipt of the signed claims, the relying party 213 may verify that the claims are validly signed and that the claims are sufficient for the requested service. In one implementation, the relying party 213 may validate the claims itself. In this implementation, the relying party may also have the means to decrypt the token. Alternatively, the relying party may consult a validation service to verify that the claims are validly signed. For privacy, one or more of the claims in the token may be obscured from the relying party 213 and the validation service.
  • 10. If the claims are validly signed and sufficient for the requested service, the relying party 213 may provide the requested service to the user device 212.
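  • The ten steps above can be condensed into a toy end-to-end simulation. A real deployment would use asymmetric or blinded signatures and the redirect and token machinery described; in this sketch an HMAC stands in for the claims provider's signature, and every name is hypothetical:

```python
# Toy flow: provider signs claims (step 7), relying party verifies the
# signature and the claim set before granting service (steps 9-10).
import hashlib
import hmac

PROVIDER_KEY = b"claims-provider-signing-key"   # known to provider and verifier only

def provider_issue(claims: dict) -> dict:
    # Step 7: claims provider signs the claims it vouches for.
    payload = "&".join(f"{k}={v}" for k, v in sorted(claims.items()))
    sig = hmac.new(PROVIDER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"claims": claims, "signature": sig}

def relying_party_verify(token: dict, required: set) -> bool:
    # Steps 9-10: verify the signature and that required claims are present.
    payload = "&".join(f"{k}={v}" for k, v in sorted(token["claims"].items()))
    expected = hmac.new(PROVIDER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(token["signature"], expected) \
        and required <= set(token["claims"])

# Steps 1-6: the user agent learns the relying party requires {"over_18"}.
required_claims = {"over_18"}
token = provider_issue({"over_18": "true"})
service_granted = relying_party_verify(token, required_claims)
```

  • Note that altering a claim value after issuance invalidates the signature, so the relying party rejects tampered tokens.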
  • Based on the teachings herein, those skilled in the art may recognize other mechanisms or procedures for preventing tracking of the interactions of the user or otherwise gathering data about the user. These other mechanisms or procedures may also be used without departing from the spirit or scope of aspects of the subject matter described herein.
  • For implementation with Web browsers, the claims provider 210 and the relying party 213 may provide Web documents (e.g., the documents 220 and 221) to a browser of the user device 212. The documents may include a reference to the same user agent service 211 and may also include other data needed by the user device 212. Having the reference to the same user agent service 211 may allow the Web browser of the user device 212 to maintain the state data between the claims provider 210 and the relying party 213 since the state is held in the same browser session.
  • When the claims provider 210 sends a document to the user device 212, the document may include a reference to the user agent service 211 and claims. The user device 212 may use the reference to send a request for code to the user agent service 211. If the code has already been downloaded to the user device 212 and cached thereon, the user device 212 may detect this and forgo downloading the code again.
  • In requesting the code from the user agent service 211, in one embodiment, the user device 212 may avoid sending the claims or other user-identifying information to the user agent service 211. In this manner, the user device 212 may avoid disclosing identifying information to the user agent service 211 that would allow the user agent service 211 to track the activity of the user device 212.
  • The downloaded user agent 215 obtained from the user agent service 211 and executed on the user device 212 may access the other data of the document as needed to obtain claims and a partial signature from the claims provider 210. The downloaded user agent 215 may also provide the obtained claims or a subset thereof as appropriate to the relying party 213. In obtaining the claims and signature from the claims provider 210 and providing claims and a signature to the relying party 213, the downloaded user agent 215 may store state in a browser specific memory of the user device 212. The state is obtained from interactions between the user device 212 and the relying party 213 and the user device 212 and the claims provider 210. This state may be stored for and in the context of a Web browser session and deleted in conjunction with the Web browser closing.
  • Turning to FIG. 3, the system 205 may include a claims provider 210, a user agent service 211, a user device 212, a relying party 213, and other entities (not shown). The relying party 213 may employ a validating service 305 to validate claims provided by the user device 212. The validating service 305 may include validating methods to verify that the claims have not been tampered with subsequent to their being signed. This may involve digital signature verification techniques as is known to those skilled in the art.
  • In one embodiment, the validating service 305 may be able to decrypt the claims provided by the user device 212. In another embodiment, the validating service 305 may not be able to decrypt the claims provided by the user device 212. With various signature validation techniques, validation may be performed in either embodiment.
  • Being able to call on the validating service 305 to validate the claims of the user frees the relying party 213 from the task of implementing validation code. In some cases, however, the relying party 213 may wish to implement its own validation code and place it on a component of the relying party 213. Having the validation code as a part of the relying party 213 is within the spirit and scope of aspects of the subject matter described herein.
  • Having the validating service 305 does not compromise the privacy boundary established around the claims provider 210. The validating service 305 receives at most the claims that the relying party 213 received. As such, the validating service 305 does not obtain any more information than the relying party 213 obtained from the user. If the information received by the relying party 213 is not sufficient to allow the relying party 213 to determine additional information about the user, it is likewise insufficient for the validating service 305 to determine that information.
  • The validating service 305 may be connected to the billing service 310. The billing service 310 may charge relying parties who rely on a claims provider to authenticate their users. Billing based on tracking which individual users consume specific services may be inappropriate and constitute a privacy concern for the relying party and/or the individual user. The billing service 310 may calculate a total usage of claims from claims providers by relying parties without tracking which individual users employ which relying parties. Indeed, with the limited information provided by the user device 212 across the privacy boundaries and subsequently to the billing service 310, the billing service 310 may not have sufficient information to identify a user of the user device 212. This enables a viable business model for claims providers while mitigating privacy concerns.
  • In one example, a relying party may remove or obscure claims from a token before sending the claims to the validation service to preserve privacy of the user and/or the relying party. With various encryption techniques, removing or obscuring the claims may be performed in such a way that it does not break the signature of the token.
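  • One way to remove claim values without breaking the token's signature is to have the signature cover per-claim digests rather than the values themselves; a withheld claim then contributes only its digest to verification. The specification does not name its encryption technique, so the following is a toy sketch of that general idea, with all names hypothetical and an HMAC standing in for the issuer's signature:

```python
# Toy selective disclosure: the signature covers salted digests of the
# claims, so any claim value may later be obscured without invalidating it.
import hashlib
import hmac

ISSUER_KEY = b"token-issuer-key"

def digest(claim_type: str, value: str, salt: str) -> str:
    return hashlib.sha256(f"{claim_type}|{value}|{salt}".encode()).hexdigest()

def issue(claims: dict, salt: str) -> dict:
    digests = {t: digest(t, v, salt) for t, v in claims.items()}
    payload = "&".join(f"{t}={d}" for t, d in sorted(digests.items()))
    sig = hmac.new(ISSUER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"claims": claims, "digests": digests, "salt": salt, "signature": sig}

def obscure(token: dict, hidden: set) -> dict:
    # Relying party drops the values of hidden claims before forwarding.
    visible = {t: v for t, v in token["claims"].items() if t not in hidden}
    return {**token, "claims": visible}

def validate(token: dict) -> bool:
    # Check visible values against their digests, then the signature
    # over the full digest set.
    for t, v in token["claims"].items():
        if token["digests"].get(t) != digest(t, v, token["salt"]):
            return False
    payload = "&".join(f"{t}={d}" for t, d in sorted(token["digests"].items()))
    expected = hmac.new(ISSUER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(token["signature"], expected)

full = issue({"over_18": "true", "email": "user@example.org"}, salt="s1")
redacted = obscure(full, hidden={"email"})
```

  • The redacted token validates exactly as the full token does, yet the validation service never sees the obscured claim's value.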
  • Because the billing service 310 is also outside of the privacy boundary that includes the claims provider 210, the billing service 310 may not determine any more information about the user than the relying party 213 and the validating service 305. The billing service 310 may be informed of the claims provider that provided the claims. This may be determined, for example, via an identifier and/or signature conveyed in conjunction with the claims. The billing service 310 may also be informed of the relying party 213. This may be done, for example, explicitly via the validating service 305 or the relying party 213, via a token that encapsulates the claims, via an address (e.g., URL, domain name, or the like) associated with the relying party 213, or in other ways. Using this information, the billing service may maintain a count associated with each claims provider and each service provider and may bill service providers and/or claims providers based on the count.
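  • The counting described above can be sketched as a counter keyed only by (claims provider, relying party); the class and method names below are hypothetical, and the essential point is that no user identifier appears anywhere in the interface:

```python
# Hypothetical usage counter: sufficient to bill relying parties and/or
# claims providers, with no user-identifying data stored.
from collections import Counter

class BillingService:
    def __init__(self) -> None:
        self.usage: Counter = Counter()

    def record(self, claims_provider: str, relying_party: str) -> None:
        # Called once per validated claim presentation; note the absence
        # of any user-identifying argument.
        self.usage[(claims_provider, relying_party)] += 1

    def total_for_provider(self, claims_provider: str) -> int:
        return sum(n for (cp, _), n in self.usage.items() if cp == claims_provider)

billing = BillingService()
billing.record("ProviderA", "rp1.example.org")
billing.record("ProviderA", "rp2.example.org")
billing.record("ProviderB", "rp1.example.org")
```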
  • Turning to FIG. 4, the system 405 may include the claims provider 210, the user agent service 211, the relying party 213, the user device 212, and other entities (not shown). The claims provider 210, the user agent service 211, the relying party 213, and other entities may be placed in the cloud 410. In alternative examples, one or more of the claims provider 210, the user agent service 211, and the relying party 213 may be placed in the cloud 410 and one or more of them may be located outside the cloud 410.
  • Even when one or more of the entities illustrated in FIGS. 2 and 3 are placed into the cloud and even if all the entities (except the user device 212) are under the control of a single entity, privacy may be maintained via the techniques mentioned previously. Based on the teachings above, through interaction with the user device 212 or even colluding with the relying party 213, without additional information, the claims provider 210 may not be able to determine the relying party 213 for which the user is seeking signed claims. Furthermore, without additional information, the relying party 213 may not be able to determine additional information usable to determine the natural identity of the user. At the same time, however, an entity that owns the service providers can still control access to the services as desired.
  • A relying party may be willing to accept signed claims from any of a number of claims providers. In one embodiment, an array of claims providers may be presented to the user. In another embodiment, claims providers for users may be registered in directories, but this may be intrusive if the user participates or may present privacy problems if the user does not.
  • As yet another embodiment, profile data stored on a user device may be used to predict claims providers with which a user is likely familiar. FIG. 5 is a block diagram representing an exemplary arrangement of components of a user device in which aspects of the subject matter described herein may operate. The components illustrated in FIG. 5 are exemplary and are not meant to be all-inclusive of components that may be needed or included. In other embodiments, the components and/or functions described in conjunction with FIG. 5 may be included in other components (shown or not shown) or placed in subcomponents without departing from the spirit or scope of aspects of the subject matter described herein. In some embodiments, the components and/or functions described in conjunction with FIG. 5 may be distributed across multiple devices. For example, profile data may be on a removable storage device or on the user device.
  • Turning to FIG. 5, the user device 505 may include identity components 510, a store 520, a communications mechanism 525, and other components (not shown). The user device 505 may comprise one or more computing devices. Such devices may include, for example, personal computers, server computers, hand-held or laptop devices, multiprocessor systems, microcontroller-based systems, set-top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, cell phones, personal digital assistants (PDAs), gaming devices, printers, appliances including set-top, media center, or other appliances, automobile-embedded or attached computing devices, other mobile devices, distributed computing environments that include any of the above systems or devices, and the like.
  • Where the user device 505 comprises a single device, an exemplary device that may be configured to act as the user device 505 comprises the computer 110 of FIG. 1. Where the user device 505 comprises multiple devices, each of the multiple devices may comprise a similarly or differently configured computer 110 of FIG. 1.
  • The identity components 510 may include a browser 515, a user agent 516, a profile manager 517, and other components (not shown). As used herein, the term component is to be read to include all or a portion of a device, a collection of one or more software modules or portions thereof, some combination of one or more software modules or portions thereof and one or more devices or portions thereof, and the like.
  • The communications mechanism 525 allows the user device 505 to communicate with other entities. For example, the communications mechanism 525 may allow the user device 505 to communicate with other entities described in conjunction with FIGS. 2-4. The communications mechanism 525 may be a network interface or adapter 170, modem 172, telephone network interface, or any other mechanism for establishing communications as described in conjunction with FIG. 1.
  • The store 520 is any storage media capable of providing access to profile data. Access as used herein may include reading data, writing data, deleting data, updating data, a combination including two or more of the above, and the like. The store may include volatile memory (e.g., RAM, an in-memory cache, or the like) and non-volatile memory (e.g., a persistent storage).
  • The term data is to be read broadly to include anything that may be represented by one or more computer storage elements. Logically, data may be represented as a series of 1's and 0's in volatile or non-volatile memory. In computers that have a non-binary storage medium, data may be represented according to the capabilities of the storage medium. Data may be organized into different types of data structures including simple data types such as numbers, letters, and the like, hierarchical, linked, or other related data types, data structures that include multiple other data structures or simple data types, and the like. Some examples of data include information, program code, program state, program data, other data, and the like.
  • The store 520 may comprise hard disk storage, other non-volatile storage, volatile memory such as RAM, other storage, some combination of the above, and the like and may be distributed across multiple devices. The store 520 may be external, internal, or include components that are both internal and external to the user device 505.
  • Profile data is data that may be used to determine a set of candidate claims providers with which a user may have a relationship. Profile data may include data from various sources. For example, profile data may include a history of pages browsed by the browser 515, a history of previous claims provider selections, cookie data, files, cache, or other data on the user device 505, and external or local services. Such profile data may give an indication of claims providers with which a user has established relationships. The profile data may be mined in a privacy-preserving way to discover these potential claims providers.
  • When a user needs to select claims provider(s) to respond to a relying party, the user may be presented with a list of claims providers mined from the user's profile data. The profile manager 517 may mine the profile data stored on the store 520 and may match the identified claims providers with the claims providers acceptable to a relying party. Identified claims providers may be stored on the store 520 for subsequent use.
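  • The matching performed by the profile manager 517 amounts to intersecting providers mined from local history with the providers a relying party accepts. A minimal sketch, with invented hostnames and provider names, and with the mining performed entirely on the user device:

```python
# Hedged sketch of profile mining: map visited hosts to claims providers
# the user likely has a relationship with, then intersect with the
# relying party's acceptable set. Nothing here leaves the user device.
def candidate_providers(browse_history: list, known_providers: dict) -> set:
    return {known_providers[host] for host in browse_history
            if host in known_providers}

def match(browse_history: list, known_providers: dict,
          acceptable_to_rp: set) -> set:
    return candidate_providers(browse_history, known_providers) & acceptable_to_rp

known = {"idp-a.example.org": "ProviderA", "idp-b.example.org": "ProviderB"}
history = ["news.example.com", "idp-a.example.org", "shop.example.net"]
shortlist = match(history, known, acceptable_to_rp={"ProviderA", "ProviderC"})
```

  • The shortlist is what gets presented first; the full acceptable list remains reachable via the user interface element described below.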
  • In addition to the claims providers found in the profile data, the identity components 510 may also present the user with a link or other user interface element that allows the user to view all the claims providers acceptable to the relying party.
  • FIG. 6 is a block diagram that represents an exemplary environment in which claims from multiple claims providers are obtained to satisfy a relying party in accordance with aspects of the subject matter described herein. In some cases it may be desirable to obtain claims from multiple claims providers. For example, some relying parties may require added claims for proof. For example, being able to prove that the user has a user name and logon password to one service may not be sufficient. To satisfy a relying party, the user may also need to prove, for example, that the user knows an address, telephone number, or government issued identifier and that the user possesses some physical item. One claims provider may not have sufficient data to provide claims for all that the relying party 213 requires.
  • To address this issue, the relying party 213 may specify the claims that are needed and allow the user device 212 to satisfy these claims by using multiple claims providers 210. The relying party 213 may indicate the claims required in a single interaction or may request certain claims for certain services and may later request additional claims when the user seeks to access other services (e.g., those that are more restrictive, sensitive, costly, or the like).
  • The downloaded user agent on the user device 212 may obtain claims from multiple claims providers (e.g., the claims providers 610-612). The user device 212 may send these claims to the relying party 213 without combining them all into a single token. The relying party 213 may send the claims to the validating service 615 which may validate the claims in whatever combinations they are sent, if any, and return an overall validation if all the claims are validated. For example, if the claims provider 610 provides 2 claims, the claims provider 611 provides 1 claim, and the claims provider 612 provides 3 claims, the validating service 615 may validate the 2 claims together, the 1 claim by itself, and the 3 claims together and return a message that the claims are all valid. The validating service 615 may also indicate which sets of claims are valid and which are not.
  • In one implementation, each set of claims may be provided in a separate token where each token has its own separate signature. In this implementation, 3 tokens may be provided to the validating service 615, where the one token includes 2 claims, one token includes 1 claim, and one token includes 3 claims. The validating service 615 may validate the claims in each token and return multiple messages that indicate which tokens and claims are valid and/or a single message that the claims are all valid after all claims have been validated. In one embodiment, the tokens may be received by the validating service 615 in any order and they may be received at different times.
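  • The per-token validation in this implementation can be sketched as follows, using the 2/1/3 claim grouping from the example above. An HMAC again stands in for each provider's signature, and the provider identifiers are invented:

```python
# Toy validation of several independently signed tokens, reporting both
# per-token and overall results. Tokens may arrive in any order.
import hashlib
import hmac

KEYS = {"p610": b"key-610", "p611": b"key-611", "p612": b"key-612"}

def sign_token(provider: str, claims: tuple) -> dict:
    sig = hmac.new(KEYS[provider], repr(claims).encode(),
                   hashlib.sha256).hexdigest()
    return {"provider": provider, "claims": claims, "signature": sig}

def validate_all(tokens: list) -> tuple:
    results = {}
    for t in tokens:
        expected = hmac.new(KEYS[t["provider"]], repr(t["claims"]).encode(),
                            hashlib.sha256).hexdigest()
        results[t["provider"]] = hmac.compare_digest(t["signature"], expected)
    return results, all(results.values())

tokens = [
    sign_token("p610", ("name", "address")),           # 2 claims
    sign_token("p611", ("over_18",)),                  # 1 claim
    sign_token("p612", ("phone", "email", "otp_ok")),  # 3 claims
]
per_token, overall = validate_all(tokens)
```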
  • In one implementation, the claims from the separate claims providers may be combined into one token that is signed or partially signed by a token issuer. The token with the combined claims may then be sent to the relying party.
  • In one implementation, a claims provider may demonstrate that the user has possession of a physical device such as a telephone or computer or has knowledge of specific data such as a One Time Password (OTP) sent to a cell phone. When a specific service is requested, the relying party may require the user to supplement claims provided by one claims provider with claims about possession of a physical device or knowledge made by another independent claims provider.
  • FIGS. 7-8 are timing diagrams in accordance with aspects of the subject matter described herein. Illustrated across the top of the timing diagrams are entities and privacy boundaries that are involved with the actions indicated by the timing diagrams. In particular, there are illustrated a claims provider 705, an access control service 710, a protocol gateway 715, a token issuer 720, a privacy boundary 725, a user device 730, a privacy boundary 732, a user agent service 735, a validation service 740, and a relying party 745.
  • At 0, a user may attempt to access a service of a relying party 745. For example, using a browser hosted by the user device 730, a user may send a request such as http://domainname/servicename/logon.
  • At 1 a, the relying party 745 may redirect the user with a message. The message may include a redirection document that includes a reference to the user agent service 735.
  • At 1 b, the user device 730 may download the user agent from the user agent service 735. User agents may be written in a variety of languages, frameworks, and platforms. Some exemplary user agents may be written in MICROSOFT® SILVERLIGHT®, ADOBE® FLASH®, HTML5, JAVASCRIPT®, and the like. Other exemplary user agents may include code that may execute outside a browser. For example, a user agent may execute as client code on the user device 730, as a phone application, as a service in the cloud, or the like.
  • At 2 a, actions occur on the user device 730. For example, the downloaded user agent may determine if a token is already available on the user device suitable for providing claims to the relying party 745. If so, that token may be provided to the relying party 745. Otherwise, an identity provider may be selected by the user.
  • Context information may be passed with messages. For example, context information is an optional parameter that may be passed in an HTTP get and post. For example, context information may be passed with the redirect message from the relying party. The context information may include information about one or more of which, how, or where the attributes for the entity identified in the digital identity representation will be used. Context may indicate language and/or region or may be a role for which the user will be authenticated.
  • The context and document obtained from the relying party 745 (or data derived therefrom) may be stored in a memory space reserved and isolated for the downloaded user agent. The user may be shown a consent user interface element and allowed to select the identity provider from which to obtain the claims.
  • At 3 a, the process of obtaining the claims may be initiated with a message to the protocol gateway 715. For example, the downloaded user agent may send a document which includes the requested claims and context but from which the relying party information has been removed.
  • At 3 b, in response, the protocol gateway 715 may send a cookie to the user device 730. For example, one form of a cookie may be write cookie(context, stripped policy, identity provider, user agent). A stripped policy is a policy that has been stripped of information identifying the relying party 745.
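  • The stripping itself can be sketched as removing the relying-party-identifying fields from a policy document before anything crosses toward the claims-provider side of the privacy boundary. The field names below are hypothetical; the specification does not define a policy schema:

```python
# Hypothetical "stripped policy": fields that identify the relying party
# are removed before the policy is sent toward the claims provider.
RELYING_PARTY_FIELDS = {"relying_party", "reply_address", "rp_domain"}

def strip_policy(policy: dict) -> dict:
    return {k: v for k, v in policy.items() if k not in RELYING_PARTY_FIELDS}

policy = {
    "required_claims": ["over_18"],
    "trusted_providers": ["ProviderA"],
    "relying_party": "rp.example.org",
    "reply_address": "https://rp.example.org/return",
}
stripped = strip_policy(policy)
```

  • What remains is enough for the claims provider to know which claims to issue, but not for whom the claims are destined.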
  • At 3 c, the protocol gateway 715 may send a message to the access control service 710.
  • At 3 d, the access control service 710 may send a message to the claims provider 705.
  • At 3, the user may authenticate with the claims provider 705. For example, the user may enter a username and password on a user interface on the user device 730.
  • At 3 e, the claims provider 705 may return claims in the form of a response document.
  • At 3 f, the access control service 710 may return claims to the protocol gateway 715 in the form of a response document.
  • Turning to FIG. 8, at 3 g, the protocol gateway 715 may read the cookie previously written to the user device 730. The read may take the form of read cookie(context, stripped policy, identity provider, user agent).
  • Either 3 h 1 or 3 h 2 may be used to return signed claims. At 3 h 1, signed claims may be posted back to the user agent. At 3 h 2, signed claims may be loaded by the user agent using a command.
  • At 4 a, the reference (URL) obtained in 3 h 1 or 3 h 2 may be used to download the user agent from the user agent service 735.
  • At 4 b, tokens may be generated by interacting with the token issuer 720. One form of interaction may include a command such as Generate Tokens(context, Signed Claims, stripped URL).
  • At 2 b, actions are taken on the user device 730. Some exemplary actions include generating and storing tokens on memory space reserved and isolated to the downloaded user agent and finding a policy with a reply address in the memory space set up previously at 2 a. For example, the downloaded user agent may see if appropriate tokens are available for responding to the relying party 745. If not, the actions may continue at 2 a of FIG. 7. Otherwise, presentation proof may be generated for the relying party 745.
  • At 5 a, the presentation proof is provided to the relying party 745. For example, presentation proof may be provided by sending a message that includes a signed token.
  • At 6 a, the relying party 745 may send a message to the validation service 740 to validate the presentation proof.
  • At 6 b, the validation service 740 may respond with a message that indicates whether the proof is valid. For example, the response may indicate whether the claims presented in the proof were valid or not.
  • Other actions, not indicated, may also be performed without departing from the spirit or scope of aspects of the subject matter described herein.
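The cookie write at 3 b and read at 3 g can be sketched in code. This is a minimal sketch for exposition only: the class, function, and field names (ProtocolGateway, strip_policy, the cookie's dictionary shape) are assumptions, not the patented implementation.

```python
# Illustrative sketch of the cookie steps at 3 b and 3 g; names and data
# shapes here are assumptions for exposition, not the patented design.

def strip_policy(policy):
    """Return a policy stripped of information identifying the relying party 745."""
    return {k: v for k, v in policy.items()
            if k not in ("relying_party", "reply_address")}

class ProtocolGateway:
    def __init__(self):
        self.cookies = {}  # one cookie per user-device session

    def write_cookie(self, session, context, policy, identity_provider):
        # 3 b: write cookie(context, stripped policy, identity provider, ...)
        self.cookies[session] = {
            "context": context,
            "stripped_policy": strip_policy(policy),
            "identity_provider": identity_provider,
        }

    def read_cookie(self, session):
        # 3 g: read the cookie previously written to the user device 730
        return self.cookies[session]

gateway = ProtocolGateway()
policy = {
    "claims_required": ["over_18"],
    "relying_party": "https://rp.example",
    "reply_address": "https://rp.example/reply",
}
gateway.write_cookie("session-1", "ctx-42", policy, "https://idp.example")
cookie = gateway.read_cookie("session-1")
assert "relying_party" not in cookie["stripped_policy"]
```

The stripped policy is the key point: what the gateway stores on the user device carries no information identifying the relying party 745.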
  • FIGS. 9-10 are flow diagrams that generally represent exemplary actions that may occur in accordance with aspects of the subject matter described herein. For simplicity of explanation, the methodology described in conjunction with FIGS. 9-10 is depicted and described as a series of acts. It is to be understood and appreciated that aspects of the subject matter described herein are not limited by the acts illustrated and/or by the order of acts. In one embodiment, the acts occur in an order as described below. In other embodiments, however, the acts may occur in parallel, in another order, and/or with other acts not presented and described herein. Furthermore, not all illustrated acts may be required to implement the methodology in accordance with aspects of the subject matter described herein. In addition, those skilled in the art will understand and appreciate that the methodology could alternatively be represented as a series of interrelated states via a state diagram or as events.
  • FIG. 9 is a flow diagram that generally represents exemplary actions that may occur in collecting and using billing information in accordance with aspects of the subject matter described herein. At block 905, the actions begin.
  • At block 910, a message is received that identifies a claims provider that provided a claim but that does not identify a user using the claim to access a service. For example, referring to FIG. 3, the billing service 310 may receive a message that indicates that the claims provider 210 provided a claim that has been presented to the relying party 213. The message does not identify a user using the claim to access the service provided by the relying party. For example, the message may not be used by itself or in conjunction with other data maintained by the claims provider 210 and/or the relying party 213 to identify a natural identity of a user who presented the claim.
  • A natural identity is information that may be used to identify a user in other contexts. For example, a social security number and a birth date may be sufficient to identify an individual in another unrelated transaction while a temporary number assigned to the individual may not be sufficient.
  • In one implementation, the received message may include the claim itself. The claim may include an identifier of the user where the identifier has been obscured (e.g., by encryption). In another implementation, the received message may include a token that includes the claim. Natural identity information may be removed from the token, obscured, not placed in the token in the first place, or the like.
  • In one implementation, the received message may include data that identifies the claims provider that provided the claim but that does not include the claim. In another implementation, the received message may include the claim and data that identifies the claims provider, but the received message does not include an identifier of the user.
  • At block 915, an indication is obtained of a relying party that relied on the claim in conjunction with providing access to a service. For example, referring to FIG. 3, the billing service 310 may obtain an identifier of the relying party 213.
  • In one implementation, the relying party may be determined by a lookup of a network address of an entity (e.g., the relying party) that provided the claim to a validation service. In another implementation, the relying party may be determined by authenticating the entity (e.g., the relying party) that sent the claim to the validation service. In another implementation, a component of the relying party may inform the billing service and provide an identifier of the relying party and an identifier of the claims provider each time a claim from the claims provider is relied on by the relying party.
  • At block 920, a count is updated that indicates a number of claims provided by the claims provider that the relying party has relied on. The count does not include information to identify individual users who have provided claims to the relying party. For example, referring to FIG. 3, the billing service 310 may update a variable to indicate how many times a relying party has relied on claims provided by the claims provider.
  • At block 925, other actions, if any, may be performed. For example, the count described above may be provided to a billing component for billing the relying party for claims provided by the claims provider and relied on by the relying party. This is represented by block 930.
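The actions at blocks 910 through 930 amount to anonymous tallying, which can be sketched as follows; the BillingService class, its method names, and the message shape are illustrative assumptions, not the patented implementation.

```python
# Sketch of the FIG. 9 billing actions (blocks 910-930); names and data
# shapes are illustrative assumptions.
from collections import Counter

class BillingService:
    def __init__(self):
        # (relying party, claims provider) -> number of claims relied on;
        # no per-user information is kept (block 920).
        self.counts = Counter()

    def record_claim_use(self, relying_party_id, claims_provider_id):
        # Blocks 910-920: the incoming message names the claims provider
        # and the relying party, but never the user.
        self.counts[(relying_party_id, claims_provider_id)] += 1

    def count_for_billing(self, relying_party_id, claims_provider_id):
        # Block 930: the count handed to a billing component.
        return self.counts[(relying_party_id, claims_provider_id)]

billing = BillingService()
billing.record_claim_use("rp-1", "cp-A")
billing.record_claim_use("rp-1", "cp-A")
billing.record_claim_use("rp-1", "cp-B")
```

Because the tally is keyed only by relying party and claims provider, the billing service can invoice accurately while holding no record that identifies individual users.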
  • FIG. 10 is a flow diagram that generally represents exemplary actions that may occur in providing billing information to a billing service in accordance with aspects of the subject matter described herein. At block 1005, the actions begin.
  • At block 1010, a claim is received that identifies a claims provider but that does not include identity information usable to identify a natural identity of a user using the claim to access a service.
  • At block 1015, billing information is provided to a billing service. The billing information includes a relying party identifier and a claims provider identifier. The relying party identifier identifies a relying party that relied on the claim to provide access to a service. The claims provider identifier identifies a claims provider that provided the claim. The billing information is not usable, either by itself or in combination with other information available to the claims provider and/or the relying party, to determine a natural identity of a user using the claim to obtain access to the service of the relying party.
  • For example, referring to FIG. 3, in one implementation, the validating service 305 may provide billing information to the billing service 310. In another implementation, the relying party 213 or another entity may provide billing information to the billing service 310.
  • Providing the billing information may include providing the claim itself where the identity information has been obscured by encryption, removed from the claim by a privacy boundary, for example, or the like. Providing the billing information may include sending a data structure that indicates the relying party and the claims provider together with other information, if any, that is desired or appropriate.
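The billing record of block 1015 can be sketched as follows. The field names are illustrative assumptions, and a one-way hash stands in for the encryption or removal at a privacy boundary that the text describes; it is a sketch, not the patented mechanism.

```python
# Sketch of block 1015: build billing information that names the relying
# party and claims provider but carries no natural identity of the user.
# Field names and the hash-based obscuring are illustrative assumptions.
import hashlib

def make_billing_record(claim, relying_party_id):
    record = {
        "relying_party": relying_party_id,
        "claims_provider": claim["claims_provider"],
    }
    if "user_id" in claim:
        # Obscure the user identifier rather than forwarding it; a one-way
        # hash stands in here for the encryption or removal the text
        # describes at the privacy boundary.
        record["opaque_ref"] = hashlib.sha256(
            claim["user_id"].encode("utf-8")).hexdigest()[:16]
    return record

claim = {
    "claims_provider": "cp-A",
    "user_id": "alice@example.org",
    "assertion": "over_18",
}
record = make_billing_record(claim, "rp-1")
assert "user_id" not in record
```

Either the validating service or the relying party could build such a record before forwarding it to the billing service, matching the two implementations described above.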
  • Although some of the discussion above has referenced data passed via HTTP, the same techniques may also be applied with other technologies without departing from the spirit or scope of aspects of the subject matter described herein. For example, messages and data may be transferred between the different entities illustrated in FIGS. 7 and 8 through the use of the Simple Object Access Protocol (SOAP), Simple Mail Transfer Protocol (SMTP), Asynchronous JavaScript and XML (AJAX) techniques, Transmission Control Protocol (TCP)/Internet Protocol (IP) or other protocols to transfer data over networks, XML, a combination of two or more of the above, or the like.
  • Although some of the above discussion has referenced Web browsers, the same techniques may also be applied to environments without Web browsers without departing from the spirit or scope of aspects of the subject matter described herein. For example, the functions of the code may be implemented by applications of mobile devices, software that is installed on computers, chips that implement one or more functions of the code, and the like. Furthermore, it is contemplated that Web browsers may be modified to incorporate some or all of the code downloaded from the user agent service 211 such that none or only some of the code may need to be downloaded from the user agent service 211.
  • As can be seen from the foregoing detailed description, aspects have been described related to identity technology. While aspects of the subject matter described herein are susceptible to various modifications and alternative constructions, certain illustrated embodiments thereof are shown in the drawings and have been described above in detail. It should be understood, however, that there is no intention to limit aspects of the claimed subject matter to the specific forms disclosed, but on the contrary, the intention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of various aspects of the subject matter described herein.

Claims (20)

    What is claimed is:
  1. A method implemented at least in part by a computer, the method comprising:
    receiving a message that identifies a claims provider that provided a claim but that does not identify a user using the claim to access a service;
    obtaining an indication of a relying party that relied on the claim in conjunction with providing access to the service;
    updating a count that indicates a number of claims provided by the claims provider that the relying party has relied on, the count not identifying the user; and
    providing the count to a billing component for billing the relying party for claims provided by the claims provider and relied on by the relying party.
  2. The method of claim 1, wherein receiving the message that identifies a claims provider that provided the claim comprises receiving the claim itself.
  3. The method of claim 2, wherein receiving the message that identifies a claims provider that provided the claim comprises receiving the claim, the claim including an identifier of the user, the identifier having been obscured.
  4. The method of claim 1, wherein receiving the message that identifies a claims provider that provided the claim comprises receiving a token that includes the claim.
  5. The method of claim 4, further comprising removing natural identity information from the token prior to the receiving the token that includes the claim.
  6. The method of claim 1, wherein receiving the message that identifies a claims provider that provided the claim comprises receiving data that identifies the claims provider that provided the claim but that does not include the claim.
  7. The method of claim 1, wherein receiving a message that identifies a claims provider that provided a claim but that does not identify a user using the claim includes receiving a message that includes an identifier of the claims provider but does not include an identifier of the user.
  8. The method of claim 1, wherein obtaining an indication of a relying party that relied on the claim in conjunction with providing access to the service comprises a lookup of a network address of an entity that provided the claim to a validating service.
  9. The method of claim 1, wherein receiving a message that identifies a claims provider that provided a claim comprises receiving a claim that indicates a characteristic of the user without revealing a natural identity of the user.
  10. In a computing environment, a system, comprising:
    a billing service hosted on one or more computers, the billing service configured to perform actions, including:
    obtaining data that identifies a claims provider that provided a claim but that does not identify a user using the claim to access a service;
    obtaining an indication of a relying party that relies on the claim in conjunction with providing access to the service;
    updating a count that indicates a number of claims provided by the claims provider that the relying party has relied on, the count not identifying the user; and
    using the count for billing the relying party for claims provided by the claims provider and relied on by the relying party.
  11. The system of claim 10, further comprising a validation service configured to verify that the claim is validly signed.
  12. The system of claim 11, wherein the validation service is hosted on one or more computers controlled by the relying party.
  13. The system of claim 11, wherein the validation service is hosted on one or more computers outside of control of the relying party.
  14. The system of claim 10, further comprising a user device configured to erect a privacy boundary to prevent transmission of natural identity information to the billing service.
  15. The system of claim 14, wherein the user device is further configured to encrypt natural identity information in the claim prior to sending the claim to the billing service.
  16. The system of claim 14, wherein the user device is further configured to remove natural identity information in the claim prior to sending the claim to the billing service.
  17. The system of claim 14, wherein the user device is further configured to download and execute user agent code in response to redirection data received from the relying party, the user agent code including instructions to erect the privacy boundary.
  18. The system of claim 10, wherein the one or more computers hosting the billing service provide the billing service as a service reachable to entities connected to the Internet.
  19. A computer storage medium having computer-executable instructions, which when executed perform actions, comprising:
    receiving a claim that identifies a claims provider but that does not include identity information usable to identify a natural identity of a user using the claim to access a service; and
    providing, to a billing service, billing information that includes a relying party identifier and a claims provider identifier, the relying party identifier identifying a relying party that relied on the claim to provide access to the service, the claims provider identifier identifying the claims provider, the billing service configured to update a count that indicates a number of claims provided by the claims provider that the relying party has relied on and to bill based on the count, the billing information not usable to determine a natural identity of the user.
  20. The computer storage medium of claim 19, wherein providing billing information comprises providing the claim, the claim having the identity information obscured via encryption of the identity information by a device under control of the user.
US13652478 2012-04-17 2012-10-16 Anonymous billing Abandoned US20130275282A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US201261625641 2012-04-17 2012-04-17
US13652478 US20130275282A1 (en) 2012-04-17 2012-10-16 Anonymous billing

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US13652478 US20130275282A1 (en) 2012-04-17 2012-10-16 Anonymous billing

Publications (1)

Publication Number Publication Date
US20130275282A1 (en) 2013-10-17

Family

ID=49325960

Family Applications (5)

Application Number Title Priority Date Filing Date
US13652478 Abandoned US20130275282A1 (en) 2012-04-17 2012-10-16 Anonymous billing
US13655436 Active US8973123B2 (en) 2012-04-17 2012-10-18 Multifactor authentication
US13682743 Active 2032-12-20 US8752158B2 (en) 2012-04-17 2012-11-21 Identity management with high privacy features
US13688210 Active US9571491B2 (en) 2012-04-17 2012-11-29 Discovery of familiar claims providers
US13705179 Active 2033-02-05 US8806652B2 (en) 2012-04-17 2012-12-05 Privacy from cloud operators

Family Applications After (4)

Application Number Title Priority Date Filing Date
US13655436 Active US8973123B2 (en) 2012-04-17 2012-10-18 Multifactor authentication
US13682743 Active 2032-12-20 US8752158B2 (en) 2012-04-17 2012-11-21 Identity management with high privacy features
US13688210 Active US9571491B2 (en) 2012-04-17 2012-11-29 Discovery of familiar claims providers
US13705179 Active 2033-02-05 US8806652B2 (en) 2012-04-17 2012-12-05 Privacy from cloud operators

Country Status (1)

Country Link
US (5) US20130275282A1 (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8806190B1 (en) 2010-04-19 2014-08-12 Amaani Munshi Method of transmission of encrypted documents from an email application
US8892697B2 (en) * 2012-07-24 2014-11-18 Dhana Systems Corp. System and digital token for personal identity verification
EP2932680A1 (en) * 2012-12-12 2015-10-21 Interdigital Patent Holdings, Inc. Independent identity management systems
EP2979426A1 (en) * 2013-03-27 2016-02-03 Interdigital Patent Holdings, Inc. Seamless authentication across multiple entities
US9154488B2 (en) * 2013-05-03 2015-10-06 Citrix Systems, Inc. Secured access to resources using a proxy
US10033737B2 (en) * 2013-10-10 2018-07-24 Harmon.Ie R&D Ltd. System and method for cross-cloud identity matching
US20160094543A1 (en) 2014-09-30 2016-03-31 Citrix Systems, Inc. Federated full domain logon
JP6293648B2 * 2014-12-02 2018-03-14 Toshiba Memory Corporation Memory device
US9998487B2 (en) 2016-04-25 2018-06-12 General Electric Company Domain level threat detection for industrial asset control system

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5815665A (en) * 1996-04-03 1998-09-29 Microsoft Corporation System and method for providing trusted brokering services over a distributed network
US20070204168A1 (en) * 2006-02-24 2007-08-30 Microsoft Corporation Identity providers in digital identity system
US20090204542A1 (en) * 2008-02-11 2009-08-13 Novell, Inc. Privately sharing relying party reputation with information card selectors
US7591424B2 (en) * 2006-03-30 2009-09-22 Microsoft Corporation Framework for adding billing payment types
US20090271856A1 (en) * 2008-04-24 2009-10-29 Novell, Inc. A Delaware Corporation Restricted use information cards
US20090282260A1 (en) * 2001-06-18 2009-11-12 Oliver Tattan Electronic data vault providing biometrically protected electronic signatures
US20100190469A1 (en) * 2009-01-29 2010-07-29 Qualcomm Incorporated Certified device-based accounting
US20110173105A1 (en) * 2010-01-08 2011-07-14 Nokia Corporation Utilizing AAA/HLR infrastructure for Web-SSO service charging
US8281149B2 (en) * 2009-06-23 2012-10-02 Google Inc. Privacy-preserving flexible anonymous-pseudonymous access
US8353016B1 (en) * 2008-02-29 2013-01-08 Adobe Systems Incorporated Secure portable store for security skins and authentication information
US20130103529A1 (en) * 2011-10-25 2013-04-25 Agile Equity Llc Facilitating formation of service contracts between consumers and service providers
US20130125197A1 (en) * 2008-02-29 2013-05-16 James D. Pravetz Relying Party Specifiable Format for Assertion Provider Token

Family Cites Families (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USH1944H1 (en) * 1998-03-24 2001-02-06 Lucent Technologies Inc. Firewall security method and apparatus
US7412422B2 (en) 2000-03-23 2008-08-12 Dekel Shiloh Method and system for securing user identities and creating virtual users to enhance privacy on a communication network
US7242921B2 (en) 2000-12-29 2007-07-10 Intel Corporation Anonymous electronic transactions
CN1575470A (en) 2001-10-23 2005-02-02 皇家飞利浦电子股份有限公司 Anonymous network-access method and client
US7610390B2 (en) * 2001-12-04 2009-10-27 Sun Microsystems, Inc. Distributed network identity
EP1582987A1 (en) * 2003-01-07 2005-10-05 Matsushita Electric Industrial Co., Ltd. Information delivering apparatus and information delivering method
US7146435B2 (en) 2003-11-07 2006-12-05 Hewlett-Packard Development Company, L.P. Distribution of hardware device installation and configuration software
WO2006042265A3 (en) * 2004-10-11 2007-02-01 Nextumi Inc System and method for facilitating network connectivity based on user characteristics
US7788729B2 (en) 2005-03-04 2010-08-31 Microsoft Corporation Method and system for integrating multiple identities, identity mechanisms and identity providers in a single user paradigm
US7536184B2 (en) 2005-09-29 2009-05-19 Sun Microsystems, Inc. Seamless mobility management with service detail records
US7523121B2 (en) 2006-01-03 2009-04-21 Siperian, Inc. Relationship data management
US20080028453A1 (en) 2006-03-30 2008-01-31 Thinh Nguyen Identity and access management framework
US7739744B2 (en) 2006-03-31 2010-06-15 Novell, Inc. Methods and systems for multifactor authentication
US8190883B2 (en) 2007-02-26 2012-05-29 Picup, Llc Network identity management system and method
US20080222714A1 (en) * 2007-03-09 2008-09-11 Mark Frederick Wahl System and method for authentication upon network attachment
US8370913B2 (en) 2007-03-16 2013-02-05 Apple Inc. Policy-based auditing of identity credential disclosure by a secure token service
US20080235513A1 (en) * 2007-03-19 2008-09-25 Microsoft Corporation Three Party Authentication
US20100132019A1 (en) * 2007-04-04 2010-05-27 Sxip Identity Corp. Redundant multifactor authentication in an identity management system
CA2691280C (en) * 2007-06-08 2016-07-26 Thermodynamic Design, Llc Real property information management, retention and transferal system and methods for using same
JP2010532941A (en) * 2007-06-21 2010-10-14 Thomson Licensing Apparatus and method for use in a mobile/handheld communications system
US8132239B2 (en) * 2007-06-22 2012-03-06 Informed Control Inc. System and method for validating requests in an identity metasystem
US20080320576A1 (en) * 2007-06-22 2008-12-25 Microsoft Corporation Unified online verification service
US20100043062A1 (en) * 2007-09-17 2010-02-18 Samuel Wayne Alexander Methods and Systems for Management of Image-Based Password Accounts
US20090132813A1 (en) * 2007-11-08 2009-05-21 Suridx, Inc. Apparatus and Methods for Providing Scalable, Dynamic, Individualized Credential Services Using Mobile Telephones
US20100254489A1 (en) * 2007-11-14 2010-10-07 Thomson Licensing Code enhanced staggercasting
US8353015B2 (en) * 2008-01-09 2013-01-08 Microsoft Corporation Trusted internet identity
US20100214976A1 (en) 2008-02-06 2010-08-26 Medio Systems, Inc. Operator cloud for mobile internet services
US20100318806A1 (en) 2008-02-08 2010-12-16 Dick Hardt Multi-factor authentication with recovery mechanisms
EP2107757A1 (en) 2008-03-31 2009-10-07 British Telecommunications Public Limited Company Identity management
US7979899B2 (en) 2008-06-02 2011-07-12 Microsoft Corporation Trusted device-specific authentication
US8392469B2 (en) * 2008-06-11 2013-03-05 Microsoft Corporation Model based distributed application management
US8074258B2 (en) * 2008-06-18 2011-12-06 Microsoft Corporation Obtaining digital identities or tokens through independent endpoint resolution
US8990896B2 (en) 2008-06-24 2015-03-24 Microsoft Technology Licensing, Llc Extensible mechanism for securing objects using claims
US8910257B2 (en) * 2008-07-07 2014-12-09 Microsoft Corporation Representing security identities using claims
US8666904B2 (en) * 2008-08-20 2014-03-04 Adobe Systems Incorporated System and method for trusted embedded user interface for secure payments
US20100077450A1 (en) * 2008-09-24 2010-03-25 Microsoft Corporation Providing simplified internet access
US20100100926A1 (en) * 2008-10-16 2010-04-22 Carl Binding Interactive selection of identity information satisfying policy constraints
US20100299738A1 (en) * 2009-05-19 2010-11-25 Microsoft Corporation Claims-based authorization at an identity provider
US8904519B2 (en) 2009-06-18 2014-12-02 Verisign, Inc. Shared registration system multi-factor authentication
KR20120091000A (en) * 2009-09-25 2012-08-17 구글 인코포레이티드 Controlling content distribution
KR101276201B1 (en) 2009-11-23 2013-06-18 한국전자통신연구원 Identity management server, system and method using the same
US9537650B2 (en) 2009-12-15 2017-01-03 Microsoft Technology Licensing, Llc Verifiable trust for data through wrapper composition
US9043891B2 (en) 2010-02-18 2015-05-26 Microsoft Technology Licensing, LLC Preserving privacy with digital identities
US8566917B2 (en) 2010-03-19 2013-10-22 Salesforce.Com, Inc. Efficient single sign-on and identity provider configuration and deployment in a database system
US8973099B2 (en) * 2010-06-15 2015-03-03 Microsoft Corporation Integrating account selectors with passive authentication protocols
US8505085B2 (en) * 2011-04-08 2013-08-06 Microsoft Corporation Flexible authentication for online services with unreliable identity providers
US8661519B2 (en) * 2011-06-03 2014-02-25 Microsoft Corporation Redirection using token and value
US9800540B2 (en) * 2012-03-27 2017-10-24 Comcast Cable Communications, Llc System and method for providing services


Also Published As

Publication number Publication date Type
US8973123B2 (en) 2015-03-03 grant
US20130276088A1 (en) 2013-10-17 application
US20130276131A1 (en) 2013-10-17 application
US20130276087A1 (en) 2013-10-17 application
US8806652B2 (en) 2014-08-12 grant
US8752158B2 (en) 2014-06-10 grant
US9571491B2 (en) 2017-02-14 grant
US20130275469A1 (en) 2013-10-17 application


Legal Events

Date Code Title Description
AS Assignment

Owner name: MICROSOFT CORPORATION, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BJONES, RONALD JOHN KAMIEL EUPHRASIA;CAMERON, KIM;SIGNING DATES FROM 20121011 TO 20121012;REEL/FRAME:029132/0352

AS Assignment

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034544/0541

Effective date: 20141014