EP2962262A2 - Methods and arrangements for smartphone payments and transactions - Google Patents

Methods and arrangements for smartphone payments and transactions

Info

Publication number
EP2962262A2
Authority
EP
European Patent Office
Prior art keywords
user
information
audio
portable device
image
Prior art date
Legal status
Ceased
Application number
EP14709848.7A
Other languages
German (de)
French (fr)
Other versions
EP2962262A4 (en)
Inventor
Tony F. Rodriguez
Bruce L. Davis
Tomas FILLER
Brian T. Macintosh
Ravi K. Sharma
Current Assignee
Digimarc Corp
Original Assignee
Digimarc Corp
Priority date
Filing date
Publication date
Priority claimed from US13/792,764 (US9965756B2)
Priority claimed from US13/873,117 (US9830588B2)
Priority claimed from US14/074,072 (US20140258110A1)
Priority claimed from US14/180,277 (US9311640B2)
Application filed by Digimarc Corp
Publication of EP2962262A2
Publication of EP2962262A4

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00 Payment architectures, schemes or protocols
    • G06Q20/30 Payment architectures, schemes or protocols characterised by the use of specific devices or networks
    • G06Q20/306 Payment architectures, schemes or protocols characterised by the use of specific devices or networks using TV related infrastructures
    • G06Q20/32 Payment architectures, schemes or protocols characterised by the use of specific devices or networks using wireless devices
    • G06Q20/321 Payment architectures, schemes or protocols characterised by the use of specific devices or networks using wireless devices using wearable devices
    • G06Q20/322 Aspects of commerce using mobile devices [M-devices]
    • G06Q20/3221 Access to banking information through M-devices
    • G06Q20/327 Short range or proximity payments by means of M-devices
    • G06Q20/3272 Short range or proximity payments by means of M-devices using an audio code
    • G06Q20/3274 Short range or proximity payments by means of M-devices using a pictured code, e.g. barcode or QR-code, being displayed on the M-device
    • G06Q20/3276 Short range or proximity payments by means of M-devices using a pictured code, e.g. barcode or QR-code, being read by the M-device
    • G06Q20/34 Payment architectures, schemes or protocols characterised by the use of specific devices or networks using cards, e.g. integrated circuit [IC] cards or magnetic cards
    • G06Q20/351 Virtual cards
    • G06Q20/36 Payment architectures, schemes or protocols characterised by the use of specific devices or networks using electronic wallets or electronic money safes
    • G06Q30/00 Commerce
    • G06Q30/06 Buying, selling or leasing transactions

Definitions

  • the present technology concerns, e.g., portable devices such as smartphones, and their use in making secure payments or facilitating transactions.
  • shoppers should be able to select from among plural different credit cards when making purchases, and not be tied to a single payment service. Having a variety of credit card payment options provides a variety of advantages.
  • some credit card providers offer promotions that make spending on one card more attractive than another (e.g., double-miles on your Alaska Airlines Visa card for gas and grocery purchases made during February). Other promotions sometimes include a lump-sum award of miles for new account holders after a threshold charge total has been reached (e.g., get 50,000 miles on your new CapitalOne Visa card after you've made $5,000 of purchases within the first five months).
  • a shopper may be working to accumulate purchases on one particular card in order to reach a desired reward level (e.g., reaching 50,000 miles to qualify for a Delta ticket to Europe).
  • the ability to easily select a desired card from among an assortment of cards is a feature lacking in many existing mobile payment systems.
  • a smartphone programmed with a virtual wallet provides a user interface to present a wallet of virtual credit cards from which a user can pick when making a purchase.
  • Data is conveyed optically from the phone to a cooperating system, such as a point of sale terminal or another smartphone.
  • the phone containing the virtual cards presents a graphical illustration of the selected card on the screen.
  • Hidden in this graphical illustration (i.e., steganographically encoded) is transaction data.
  • This transaction data may provide information about the selected card, and may also provide context data used to create a session key for security.
  • a virtual wallet may receive payments, credits and rewards, as well as initiate payments.
  • Figs. 1 and 2 show a fliptych user interface used in certain embodiments to allow a user to select a desired card from a virtual wallet.
  • Figs. 3A and 3B show alternative card selection user interfaces.
  • Fig. 4A shows artwork for a selected card, steganographically encoded with card and authentication information, displayed on a smartphone screen for optical sensing by a cooperating system.
  • Fig. 4B is similar to Fig. 4A, but uses overt machine readable encoding (i.e., a barcode) instead of steganographic encoding, to optically convey information to the cooperating system.
  • Fig. 5 illustrates a common type of credit card transaction processing.
  • Fig. 6 shows a block diagram of a system in which a user's mobile device optically communicates with a cooperating system.
  • Fig. 7 is a flow chart detailing acts of an illustrative method.
  • Figs. 8 and 9 show screenshots of a user interface for selecting and presenting two cards to a vendor.
  • Figs. 10A and 10B show screenshots of an alternative user interface for selecting and presenting multiple cards to a vendor.
  • Fig. 10C illustrates how a payment can be split between two payment cards, in accordance with one aspect of the present technology.
  • Fig. 11 shows a payment user interface that presents a tally of items for purchase together with payment card artwork, and also provides for user signature.
  • Figs. 12A-12D show how checkout tallies can be customized per user preference.
  • Figs. 13A-13C show how authentication can employ steganographically-conveyed context data, an anti-phishing mutual validation system, and signature collection - all for increased security.
  • Figs. 14 and 15 show an authentication arrangement using photographs earlier captured by the user and stored on the smartphone.
  • Fig. 16 is a diagram showing a payload coding and transmission scheme.
  • Figs. 17A and 17B are diagrams showing communication pathways.
  • One aspect of the present technology concerns payment technologies, including auctions to determine which financial vendor will facilitate a transaction.
  • a few particular embodiments are described below, from which various features and advantages will become apparent.
  • One particular method employs a user's portable device, such as a smartphone.
  • As is familiar, such devices include a variety of components, e.g., a touch screen display, a processor, a memory, various sensor modules, etc.
  • Also included is an electronic payment module comprising software instructions that cause the device to present a user interface (UI) on the display.
  • This electronic payment module (and/or a UI provided by such) is sometimes referred to herein as a "virtual wallet”.
  • One such user interface is shown in Fig. 1. The depicted user interface shows graphical representations of plural different wallet cards of the sort typically carried in a user's wallet, e.g., credit cards, shopping loyalty cards, frequent flier membership cards, etc.
  • the software enables the user to scroll through the collection of cards and select one or more for use in a payment transaction, using a fliptych arrangement.
  • fliptych is the generic name for the style of interface popularized by Apple under the name "Cover Flow."
  • When the user locates a desired card (a Visa card in Fig. 1), it is selected for use in the transaction by a user signal, such as a single-tap on the touch screen.
  • a double-tap causes the depicted card to virtually flip-over and reveal, on its back side, information about recent account usage and available credit.
  • Fig. 3A shows another form of UI - a scrollable display of thumbnails. This UI illustrates that representations of cards other than faithful card depictions can be employed. (Note the logo, rather than the card image, used to represent the MasterCard payment service.)
  • Still another alternative UI for card selection is that employed by Apple's Passbook software, shown in Fig. 3B.
  • the Passbook app is an organizer for passes such as movie tickets, plane and train boarding passes, gift cards, coupons, etc.
  • the device may perform a user security check - if required by the card issuer or by stored profile data configured by the user.
  • One security check is entry of a PIN or password, although there are many others.
  • the illustrative transaction method further involves generating context-based authentication data.
  • This authentication data serves to assure the cooperating system that the smartphone is legitimate and is not, e.g., a fraudulent "replay attack" of the system.
  • the smartphone displays corresponding artwork on its display, as shown in Fig. 4A.
  • This artwork visually indicates the selected payment service, thereby permitting the user to quickly check that the correct payment card has been selected.
  • the card number, a logo distinctive of the selected payment service (e.g., an American Express, Visa or MasterCard logo) and/or card issuer (e.g., US Bank, Bank of America) can be included in the artwork, for viewing by the user.
  • While the smartphone display shown in Fig. 4A indicates the selected payment service, it also includes the payment service account data (e.g., account number, owner name, country code, and card expiration date), as well as the context-based authentication data.
  • This information is not evident in the Fig. 4A artwork because it is hidden, using steganographic encoding (digital watermarking). However, such information can be decoded from the artwork by a corresponding (digital watermark) detector. Alternatively, such information can be conveyed otherwise, such as by other forms of machine-readable encoding (e.g., the barcode shown in Fig. 4B).
  • the user shows the artwork on the phone display to a sensor (e.g., a camera) of a cooperating system, such as a point of sale (POS) terminal, or a clerk's portable device, which captures one or more frames of imagery depicting the display.
  • the user holds the smartphone in front of a fixed camera, such as at a self-checkout terminal.
  • a POS terminal camera, or a smartphone camera is positioned (e.g., by a checkout clerk) so as to capture an image of the smartphone screen.
  • the user puts the smartphone, display facing up, on a conveyor of a grocery checkout, where it is imaged by the same camera(s) that are used to identify products for checkout. In all such arrangements, information is conveyed optically from the user device to the cooperating system.
  • the cooperating system decodes the account data and authentication data from the captured imagery.
  • the transaction is next security-checked by use of the authentication data.
  • Corresponding transaction information is then forwarded to the merchant's bank for processing. From this point on, the payment transaction may proceed in the conventional manner.
  • Fig. 5 illustrates a credit card approval process for a typical transaction.
  • Fig. 6 shows some of the hardware elements involved in this embodiment, namely a user's smartphone, and a cooperating system. These elements are depicted as having identical components (which may be the case, e.g., if the cooperating system is another smartphone). The dashed lines illustrate that the camera of the cooperating system captures imagery from the display of the user smartphone.
  • Fig. 7 summarizes a few aspects of the above-described embodiment in flow chart form.
  • the authentication data used in the detailed embodiment can be of various types, and can serve various roles, as detailed in the following discussion.
  • a security vulnerability of many systems is the so-called "replay attack.”
  • a perpetrator collects data from a valid transaction, and later re-uses it to fraudulently make a second transaction.
  • For example, imagery captured by a POS terminal (e.g., depicting the Fig. 4A virtual payment card of a user) might later be employed to mimic presentation of a valid payment card for any number of further transactions.
  • a simple case would be the perpetrator printing a captured image of the Fig. 4A screen display, and presenting the printed picture to a camera at a self-service checkout terminal to "pay" for merchandise.
  • the authentication data of the present system defeats this type of attack.
  • authentication data is of a character that naturally changes from transaction to transaction.
  • a simple example is time or date. If this information is encoded in the image, the cooperating system can check that the decoded information matches its own assessment of the time/date.
  • some smartphones now include barometric pressure sensors.
  • the barometric pressure currently sensed by the smartphone sensor can be among the data provided from the smartphone display to the cooperating system.
  • the cooperating system can check a barometric sensor of its own, and confirm that the received information matches within some margin of error, e.g., 1 millibar.
  • Temperature is another atmospheric parameter that can be used in this fashion.
  • Smartphones are now conventionally equipped with a tri-axis magnetometer (compass), a tri-axis accelerometer and/or a tri-axis gyroscope. Data from these sensors allow the smartphone to characterize its position and motion, which information can be encoded in the displayed artwork. The cooperating system can analyze its captured imagery of the smartphone to make its own assessment of these data.
  • a POS terminal may analyze camera data to determine that the shopper's phone is moving 1 foot per second (i.e., on a moving conveyor), and is in a pose with its screen facing straight up, with its top oriented towards a compass direction of 322 degrees. If the authentication data decoded from the artwork displayed on the phone screen does not match this pose/motion data observed by the POS terminal, then something is awry and the transaction is refused.
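  • By way of illustration, the following Python sketch shows how a cooperating terminal might compare context data decoded from the displayed artwork (e.g., barometric pressure, motion, compass pose) against its own measurements, within a margin of error. The field names and tolerance values are illustrative assumptions rather than parameters taken from the foregoing.

```python
# Illustrative sketch: comparing context data reported by the smartphone
# (decoded from its displayed artwork) against the terminal's own assessment.
# Field names and tolerances are assumptions for illustration only.

TOLERANCES = {
    "pressure_mbar": 1.0,   # barometric pressure, e.g., within 1 millibar
    "speed_fps": 0.2,       # observed motion, feet per second
    "heading_deg": 10.0,    # compass orientation of the phone's top edge
}

def angular_difference(a, b):
    """Smallest difference between two compass headings, in degrees."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def context_matches(reported: dict, observed: dict) -> bool:
    """Return True if every reported context value agrees with the
    terminal's own measurement within the allowed margin of error."""
    for key, tol in TOLERANCES.items():
        if key not in reported or key not in observed:
            return False
        if key == "heading_deg":
            delta = angular_difference(reported[key], observed[key])
        else:
            delta = abs(reported[key] - observed[key])
        if delta > tol:
            return False
    return True

# Example: phone on a conveyor, screen up, top oriented toward 322 degrees.
reported = {"pressure_mbar": 1013.2, "speed_fps": 1.0, "heading_deg": 322.0}
observed = {"pressure_mbar": 1013.6, "speed_fps": 1.1, "heading_deg": 324.5}
print("accept transaction" if context_matches(reported, observed) else "refuse")
```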
  • a sample of ambient audio can be sensed by the smartphone microphone and processed, e.g., to classify it by type, or to decode an ambient digital watermark, or to generate an audio fingerprint.
  • An exemplary audio fingerprint may be generated by sensing the audio over a one second interval and determining the audio power in nine linear or logarithmic bands spanning 300 - 3000 Hz (e.g., 300-387 Hz, 387-500 Hz, 500-646 Hz, 646-835 Hz, 835-1078 Hz, 1078-1392 Hz, 1392-1798 Hz, 1798-2323 Hz, and 2323-3000 Hz).
  • An eight bit fingerprint is derived from this series of data.
  • the first bit is a "1" if the first band (300-387 Hz) has more energy than the band next-above (387-500 Hz); else the first bit is a "0." And so forth, up through the eighth bit (which is a "1" if the eighth band (1798-2323 Hz) has more energy than the band next-above (2323-3000 Hz)).
  • the POS terminal can similarly sample the audio environment, and compute its own fingerprint information. This information is then compared with that communicated from the user's smartphone, and checked for correspondence. (The POS terminal can repeatedly compute an audio fingerprint for successive one second sample intervals, and check the received data against the last several computed fingerprints for a match within an error threshold, such as a Euclidean distance.)
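  • The band-energy fingerprint described above can be sketched in Python as follows, assuming a one-second mono sample array and using an FFT to estimate per-band power; the Hamming-distance check at the end is a simplified stand-in for the error-threshold comparison (e.g., Euclidean distance) mentioned above.

```python
# Sketch of the eight-bit audio fingerprint described above: power is measured
# in nine bands spanning 300-3000 Hz over a one second sample, and each bit
# records whether a band has more energy than the band next-above it.
import numpy as np

BAND_EDGES_HZ = [300, 387, 500, 646, 835, 1078, 1392, 1798, 2323, 3000]

def audio_fingerprint(samples: np.ndarray, sample_rate: int) -> int:
    """Return the 8-bit fingerprint of a one-second mono audio sample."""
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    band_power = [
        spectrum[(freqs >= lo) & (freqs < hi)].sum()
        for lo, hi in zip(BAND_EDGES_HZ[:-1], BAND_EDGES_HZ[1:])
    ]
    bits = 0
    for i in range(8):  # compare band i against the band next-above
        bits = (bits << 1) | (1 if band_power[i] > band_power[i + 1] else 0)
    return bits

def fingerprints_match(a: int, b: int, max_bit_errors: int = 1) -> bool:
    """Terminal-side check: allow a small Hamming distance between the
    fingerprint reported by the phone and the terminal's own fingerprint."""
    return bin(a ^ b).count("1") <= max_bit_errors

# Example with synthetic audio (one second at 8 kHz).
rate = 8000
t = np.arange(rate) / rate
audio = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 1500 * t)
print(f"fingerprint: {audio_fingerprint(audio, rate):08b}")
```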
  • the POS terminal may emit a short burst of tones - simultaneously or sequentially.
  • the smartphone microphone senses these tones, and information corresponding to them is reported back via the displayed artwork. The combination or sequence of tones may be varied from transaction to transaction. In this way the POS terminal can influence or dictate, e.g., a fingerprint value that should be reported back from the smartphone.
  • This is a form of challenge-response authentication.
  • the POS terminal issues a challenge (e.g., a particular combination or sequence of tones), and the smartphone must respond with a response that varies in accordance with the challenge.
  • the response from the smartphone is checked against that expected by the POS terminal.
  • the smartphone may be held to face towards the camera of a POS terminal.
  • a collection of colored LEDs may be positioned next to the camera of the POS terminal, and may be controlled by the POS processor to shine colored light towards the smartphone.
  • For one transaction the POS system may illuminate a blue LED; for a next transaction it may illuminate an orange LED.
  • the smartphone senses the color illumination from its camera (i.e., the smartphone camera on the front of the device, adjacent the display screen), and encodes this information in the artwork displayed on the phone screen.
  • the POS terminal checks the color information reported from the smartphone (via the encoded artwork) with information about the color of LED illuminated for the transaction, to check for correspondence.
  • LEDs are activated in a sequence to emit a series of colors that varies over time.
  • This time- varying information can be reported back via the displayed artwork - either over time (e.g., the artwork displayed by the smartphone changes (steganographically) in response to each change in LED color), or the smartphone can process the sequence of different colors into a single datum.
  • the POS terminal may be capable of emitting ten different colors of light, and it issues a sequence of three of these colors - each for 100 milliseconds, in a repeating pattern.
  • the smartphone senses the sequence, and then reports back a three digit decimal number - each digit representing one of the colors.
  • the POS checks the received number to confirm that the three digits correspond to the three colors of illumination being presented, and that they were sensed in the correct order.
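  • A minimal sketch of this color-sequence challenge-response follows. The ten color names, their digit assignment, and the verification logic are illustrative assumptions; the essential point is that the code reported back must name the emitted colors in the correct order.

```python
# Sketch of the color-sequence challenge-response described above: the terminal
# emits three of ten possible colors and later checks that the digits reported
# back (via the phone's encoded artwork) name the same colors in the same order.
# Color names and the digit assignment are illustrative assumptions.
import random

COLORS = ["red", "orange", "yellow", "green", "cyan",
          "blue", "violet", "magenta", "white", "amber"]  # ten colors -> digits 0-9

def issue_challenge(rng: random.Random) -> list:
    """Terminal side: pick a repeating pattern of three colors to emit."""
    return [rng.choice(COLORS) for _ in range(3)]

def phone_response(sensed_colors: list) -> str:
    """Phone side: map each sensed color to its digit, yielding a 3-digit code."""
    return "".join(str(COLORS.index(c)) for c in sensed_colors)

def verify(challenge: list, reported_code: str) -> bool:
    """Terminal side: confirm the reported digits match the emitted colors,
    in the correct order."""
    expected = "".join(str(COLORS.index(c)) for c in challenge)
    return reported_code == expected

rng = random.Random(42)
challenge = issue_challenge(rng)       # e.g., colors flashed for 100 ms each
code = phone_response(challenge)       # what an honest phone would report
print(challenge, code, verify(challenge, code))       # True

wrong_code = str((int(code) + 1) % 1000).zfill(3)     # guaranteed-different code
print(verify(challenge, wrong_code))                  # a replayed/incorrect code fails
```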
  • time-varying authentication data can be similarly sensed by the smartphone and reported back to the cooperating system as authentication data.
  • the authentication data can also be used to secure the payment card information against eavesdropping (e.g., a form of "man-in-the-middle" attack).
  • the information encoded in the displayed artwork desirably is encrypted using a key.
  • This key can be based on the authentication data.
  • the smartphone presenting the information can derive the key from its sensed context data (e.g., audio, imagery, pose, motion, environment, etc.), yielding a context-dependent session key.
  • the cooperating POS system makes a parallel assessment based on its sensed context data, from which it derives a matching session key.
  • the authentication data thus is used to create a (context-dependent) secure private channel through which information is conveyed between the smartphone and the POS system.
  • a simple one is an exclusive-OR operation, by which bits of the message are XORed with bits of the key.
  • the resulting encrypted data string is encoded in the artwork presented on the smartphone screen.
  • the POS system recovers this encrypted data from captured imagery of the phone, and applies the same key, in the same XOR operation, to recover the bits of the original message.
  • More sophisticated implementations employ encryption algorithms such as DES, SHA1, MD5, etc.
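  • The following sketch illustrates the simple XOR case with a context-derived session key. Serializing the shared context as sorted key=value pairs and expanding it into a keystream with SHA-256 are assumptions made for illustration; as noted above, more sophisticated algorithms can be substituted.

```python
# Sketch of the context-dependent session key described above: both sides hash
# their (matching) context data into a keystream and XOR the message with it.
# Serializing the context as sorted key=value pairs and using SHA-256 as the
# keystream source are illustrative assumptions.
import hashlib

def session_key(context: dict, length: int) -> bytes:
    """Derive `length` key bytes from shared context data."""
    seed = ";".join(f"{k}={context[k]}" for k in sorted(context)).encode()
    key = b""
    counter = 0
    while len(key) < length:
        key += hashlib.sha256(seed + counter.to_bytes(4, "big")).digest()
        counter += 1
    return key[:length]

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR each message byte with the corresponding key byte."""
    return bytes(d ^ k for d, k in zip(data, key))

# Phone side: encrypt card data with a key derived from sensed context.
context = {"date": "2014-02-14", "pressure_mbar": 1013, "heading_deg": 322}
message = b"PAN=4111111111111111;EXP=0218"       # illustrative payload
ciphertext = xor_bytes(message, session_key(context, len(message)))

# POS side: derives the same key from its own matching context assessment.
recovered = xor_bytes(ciphertext, session_key(context, len(message)))
assert recovered == message
print(recovered.decode())
```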
  • Additional security can be provided by use of digital signature technology, which may be used by the POS system to provide for authentication (and non-repudiation) of the information received from the smartphone (and vice-versa, if desired).
  • information identifying the phone or user is conveyed from the phone to the POS system (e.g., via the encoded artwork displayed on the phone screen).
  • This identifier can take various forms.
  • One is the IMEI (International Mobile Equipment Identity) - an identifier that uniquely identifies a phone.
  • the IMEI can be displayed on most phones by entering *#06# on the keypad.
  • Another is the IMSI (International Mobile Subscriber Identity).
  • Still other identifiers can be derived using known device fingerprinting techniques - based on parameter data collected from the phone, which in the aggregate distinguishes that phone from others. (All such arrangements may be regarded as a hardware ID.)
  • This identifier can be conveyed from the phone to the POS system in encrypted form, e.g., using context-based authentication data as described above.
  • Upon receipt of the identifier, the POS system consults a registry (e.g., a certificate authority) to obtain a public key (of a public-private cryptographic key pair) associated with that identifier.
  • The corresponding private key may be stored in the phone's memory and used by the phone to digitally sign information before it is conveyed.
  • Information that may be encrypted in this fashion includes the payment card data.
  • the POS system uses the public key that it obtained from the certificate authority to decrypt this information. Because the communicated information is signed with a key that allows for its decryption using the public key obtained from the certificate authority, the information is known by the POS system to have originated from the identified phone/user.
  • the public/private key pairs may be issued by a bank or other party involved in the transaction processing. The same party, or another, may operate the certificate authority.)
  • a secondary check can be made to determine if the card information provided is associated with the phone, creating a second layer of security for a would-be attacker to surmount (beyond registering a fraudulent phone within the system, they would also have to associate the copied card information for a replay attack with the fraudulent phone).
  • the context based authentication data can also be encrypted with the private key, and decoded with the corresponding public key obtained from the certificate authority.
  • context-based authentication data is encrypted with a key that is tied to the device (e.g., via an IMEI identifier through a certificate authority), then this authentication data is logically bound to both the context and the user device.
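  • A sketch of the sign-and-verify flow is given below, using Ed25519 from the third-party cryptography package as a stand-in signature scheme (no particular algorithm is specified above). The registry dictionary stands in for the certificate authority, and the hardware ID value is purely illustrative.

```python
# Sketch of the sign-and-verify flow described above, using Ed25519 from the
# third-party `cryptography` package as a stand-in signature scheme. The
# registry dict stands in for a certificate authority mapping a hardware ID
# (e.g., an IMEI) to the device's registered public key.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Enrollment: the device's key pair is issued and its public key registered.
device_private_key = Ed25519PrivateKey.generate()
registry = {"IMEI:352099001761481": device_private_key.public_key()}

# Phone side: sign the conveyed data (e.g., payment card info + context data).
hardware_id = "IMEI:352099001761481"                  # illustrative value
payload = b"card=VISA-1234;context=pose:322deg,1fps"  # illustrative payload
signature = device_private_key.sign(payload)

# POS side: look up the public key for the reported hardware ID and verify
# that the payload really originated from that device (non-repudiation).
def verify_from_device(hardware_id: str, payload: bytes, signature: bytes) -> bool:
    public_key = registry.get(hardware_id)
    if public_key is None:
        return False
    try:
        public_key.verify(signature, payload)
        return True
    except InvalidSignature:
        return False

print(verify_from_device(hardware_id, payload, signature))         # True
print(verify_from_device(hardware_id, payload + b"x", signature))  # False
```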
  • Identifying data can also be derived from physically unclonable functions (PUFs) of the device.
  • These may include but are not limited to shot-noise and temporal noise of the camera, properties of the image processing pipeline (compression artifacts, tonal curves influenced by Auto White Balance or other operations), etc.
  • properties of the display of the mobile device can be used for this same purpose, such as dead pixels or fluctuations of display brightness as a function of time or power.
  • Patent 7,370,190 provides additional information about physically unclonable functions, and their uses - technology with which the artisan is presumed to be familiar.
  • the system is more economical than all magnetic stripe and RFID systems because no physical cards or chips are required. (This is a particular savings when contrasted with chip card systems, due to the microprocessors and gold-plated interfaces typically used in such cards.) Nor is there any cost associated with distributing cards, confirming their safe receipt, and attending to their activation. Instead, credentials are distributed by electronically sending a file of data corresponding to a wallet card - encrypted and digitally signed by the issuing bank - to the phone, and using that file data to add the card to the smartphone wallet.
  • the installation and activation of the card can be tied to various unique aspects of the device and/or user characteristics, such as, for example, a hardware ID or a hash of user history or personal characteristics data.
  • a still further advantage is that the present technology is helpful in alleviating piriformis syndrome.
  • This syndrome involves inflammation of the sciatic nerve due to pressure in the gluteal/pelvic region.
  • a common cause of such pressure is presence of a large wallet in a person's rear pocket, which displaces customary pelvic alignment when sitting.
  • the wallet's volume is reduced, reducing attendant compression of the sciatic nerve. Elimination of the wallet requirement also improves security and convenience of payment processing for users.
  • the UI of the payment module on the user's smartphone permits selection of two or more cards from the virtual wallet.
  • One is a payment card, and the other may be a loyalty ("merchant") card.
  • Data corresponding to both cards may be optically conveyed to the cooperating system via the artwork presented on the display of the user's smartphone.
  • Fig. 8 shows one such user interface.
  • the user flips through the deck of virtual wallet cards to find a first desired card. Instead of the user tapping the card for selection, a sweeping gesture is used to move the virtual card above the deck (as shown by the Visa card in Fig. 8), while the rest of the virtual deck slides down to make room. The user then continues flipping through the deck to locate a second card, which is selected by tapping.
  • the phone screen presents artwork representing both the selected payment card, and the other (merchant) card, as shown in Fig. 9.
  • Fig. 10A shows another style of user interface permitting selection of multiple wallet cards.
  • thumbnails of different cards are organized by type along the right edge: payment cards, loyalty cards, gift and coupon cards, and cents-back cards.
  • the thumbnails presented on the right side of the UI are ordered so that the card(s) that are most likely to be used in a given context are the most conspicuous (e.g., not partially occluded by other cards).
  • the Safeway loyalty card would be most readily available.
  • the Visa card thumbnail would be positioned at a preferred location relative to the other payment card options. Forward chaining of inference can be used to predict which cards are most likely to be used in different situations.
  • the user slides thumbnails of selected cards towards the center of the screen where they expand and stack, as shown in Fig. 10B.
  • the user may assemble a recipe of cards including a credit card, a pair of coupon cards, a gift card, a loyalty card, and a cents-back card, while the grocery clerk is scanning items.
  • the deck is single-tapped (or in another embodiment double-tapped) to indicate that the user's selection is completed.
  • the displayed artwork is again encoded with information, as described earlier, for optical reading by a cooperating system.
  • the artwork can include a background pattern 102, and this background pattern can also be encoded (thereby expanding the payload size and/or increasing the encoding robustness).
  • a visual indicia can be presented on the screen indicating that the artwork has been steganographically-encoded, and is ready to present for payment. For example, after the user has tapped the stack, and the artwork has been encoded, dark or other distinctive borders can appear around the card depictions.
  • a user interface can also be employed to split charges between two payment cards. Both cards may be in the name of the same person, or cards from two persons may be used to split a charge. (One such example is a family in which a weekly allowance is issued to teens by deposits to a prepaid debit card. A parent may have such a debit card for a teen in their smartphone wallet, and may occasionally agree to split the costs of a purchase with the teen.)
  • the artwork presented in one such UI case includes a hybrid card - a graphic composed partly of artwork associated with one card, and partly of artwork associated with another card.
  • Also shown is a user interface feature 103 that can be touched by the user on the touch screen and slid right or left to apportion a charge between the two cards in a desired manner.
  • the illustrated UI shows the split detailed in percentage (30%/70%), but a split detailed in dollars could alternatively, or additionally, be displayed.
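  • The apportionment itself is simple arithmetic; the sketch below splits a charge per the slider percentage, working in cents so the two shares always sum exactly to the total. Card labels and amounts are illustrative.

```python
# Sketch of splitting a charge between two payment cards per the slider
# position (e.g., 30%/70%), working in cents so the two shares always sum
# exactly to the transaction total. Card labels are illustrative.
def split_charge(total_dollars: float, first_card_percent: int):
    """Return (first_share, second_share) in dollars for the given split."""
    total_cents = round(total_dollars * 100)
    first_cents = round(total_cents * first_card_percent / 100)
    second_cents = total_cents - first_cents   # remainder avoids rounding gaps
    return first_cents / 100, second_cents / 100

teen_share, parent_share = split_charge(97.23, 30)    # 30%/70% split
print(f"Prepaid debit card: ${teen_share:.2f}")       # $29.17
print(f"Parent's Visa card: ${parent_share:.2f}")     # $68.06
assert round(teen_share + parent_share, 2) == 97.23
```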
  • an electronic payment module or "virtual wallet” comprising software instructions and/or libraries that cause the device to present the user interface (UI) on the display.
  • the virtual wallet may also include, e.g., frequent flyer account information, reward program information, membership information, loyalty membership information, coupons, discount codes, rebates, etc.
  • the user may indicate through the UI that she is ready to check out and purchase the cart items. If the UI cooperates with a touchscreen interface the user may indicate by touching the screen, flipping through various screens, scrolling, checking boxes, selecting icons, etc.
  • an auction is launched to determine which financial vendor associated with her virtual wallet will facilitate the financial transaction.
  • a solicitation of offers is launched to gather offers from the financial vendors associated with her virtual wallet.
  • the virtual wallet can launch the solicitation or auction in a number of ways.
  • the virtual wallet can communicate with the various financial vendors associated with the user's different payment options.
  • Cart total and contents, store and user location(s), user credit history, etc. can be forwarded to the different financial institutions to consider as they bid to facilitate the user's transaction.
  • the cart's total is $97.23
  • American Express may, for example, decide to offer a discount to the user if she uses her American Express account. With the discount the transaction total may now only cost the user, e.g., $92.37.
  • American Express may decide to offer the discount in exchange for promotional or marketing opportunities, pushing targeted advertisements or providing other opportunities to the user during or after the transaction.
  • American Express may have a discount arrangement with the store from which the user is shopping, e.g., Target or Amazon.com, and/or a discount arrangement for certain of the cart items. A portion of the discount can be passed along to the user.
  • American Express may base a decision to bid - and the amount of any discount associated with such bid - on a number of factors, e.g., the user's credit history with their American Express account.
  • Another creditor, e.g., PayPal's BillMeLater, may decide based on the user's credit history that she is a solid risk. So BillMeLater low-balls the bid, offering a bargain-basement cost of $82.19 for the purchase, but couples the bid with the user's required acceptance to establish or increase a line of credit.
  • Another creditor may promise a discount + a certain number of reward or mileage points if the user selects them for the transaction. Still another may bid/offer an extended warranty if purchased with them.
  • the auction can be time-limited so bids must be submitted within a certain response time.
  • the user can be preapproved for certain deals or promotions based on her location, which will help reduce auction time.
  • the virtual wallet may determine that the phone is currently located in Wal-Mart or Target.
  • Location information can be determined from user input (e.g., entering into the virtual wallet - or selecting from a screen pull-down or flip-through - that the user is currently shopping in Wal-Mart), from GPS information (e.g., coupled with a search of GPS coordinates), from environmental information sensed by the user device upon entering the store (e.g., image recognition from recent camera pictures, analysis of digitally watermarked audio playing in a store, audio fingerprints of ambient audio, audio beacons like Apple's iBeacons, Wi-Fi network information, etc.), etc.
  • the virtual wallet can start to solicit bids from financial vendors associated with the virtual wallet or user as soon as the virtual wallet determines that the user is in a retail establishment, even though the user has not finished populating their cart and is not located at checkout. Incoming bids may then be based on all or some of the above factors, e.g., credit history, promotion opportunities, available discounts, etc., and less on the actual cart contents.
  • the virtual wallet can also start an auction or solicit offers when the first (or other) item is added to the cart.
  • the virtual wallet can also receive pre-authorization or firm bids from financial vendors. For example, Bank of America may decide that they are offering to the user a 3% discount for all in-store purchases at Wal-Mart made during the upcoming weekend. The virtual wallet stores this information and can present the offer if and when the user finds herself in Wal-Mart.
  • the pre-authorization may include or link to promotional opportunities to be displayed during or after purchase.
  • the user can select from the various bids to determine which financial vendor will facilitate her transaction. For example, a double tap on a graphic with the desired bid can initiate the transaction. The user can be prompted to confirm the transaction if desired.
  • the virtual wallet can be user-configured to present only those bids meeting certain criteria. For example, through a settings screen or user interface, the user may decide that she only wants to see and consider the top 2 or 3 bids with cash-only discounts; such a setting will result in the user interface only presenting such top bids. Or the user may be interested in mileage rewards, or credit opportunities; and these will be presented in the top bids. Or the user can decide NOT to be bothered with the decision and may select a "best-deal" mode where the virtual wallet selects a bid based on a plurality of factors including, e.g., deepest discount, best long term financing, and/or proximity to reward levels (e.g., the user needs only 5000 more mileage points to qualify for a trip to Hawaii).
  • Such factors may be weighted according to user preference and a top bid can be determined as one with the highest overall weighting.
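  • A sketch of such a "best-deal" ranking follows. The factor names, weights, normalization, and sample bids are illustrative assumptions; the point is simply that each bid receives a weighted score and the highest-scoring bid is selected.

```python
# Sketch of the "best-deal" ranking described above: each incoming bid is
# scored against user-weighted factors, and the bid with the highest overall
# weighting wins. Factor names, weights and the normalization are assumptions.
def score_bid(bid: dict, weights: dict, cart_total: float) -> float:
    discount_frac = (cart_total - bid["offered_total"]) / cart_total
    reward_frac = min(bid.get("reward_points", 0) / 50000.0, 1.0)  # vs. reward goal
    financing = 1.0 if bid.get("extended_financing") else 0.0
    return (weights["discount"] * discount_frac
            + weights["rewards"] * reward_frac
            + weights["financing"] * financing)

cart_total = 97.23
user_weights = {"discount": 0.6, "rewards": 0.3, "financing": 0.1}

bids = [
    {"vendor": "American Express", "offered_total": 92.37, "reward_points": 200},
    {"vendor": "BillMeLater", "offered_total": 82.19, "extended_financing": True},
    {"vendor": "Chase Visa", "offered_total": 97.23, "reward_points": 5000},
]

best = max(bids, key=lambda b: score_bid(b, user_weights, cart_total))
for b in sorted(bids, key=lambda b: score_bid(b, user_weights, cart_total), reverse=True):
    print(f"{b['vendor']:>16}: score {score_bid(b, user_weights, cart_total):.3f}")
print("selected:", best["vendor"])
```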
  • a virtual wallet may also be configured to track reward status.
  • a merchant may communicate with a virtual wallet (or a financial vendor represented in the virtual wallet) to issue a credit.
  • the refund may result in reward points being pulled from a rewards account. This information may be reflected in the virtual wallet.
  • the virtual wallet may also communicate with a broker or intermediary service.
  • the broker or intermediary service can aggregate information, vendor bids, pre-authorizations, promotions, advertising etc. and associate such with a user or user device.
  • the virtual wallet communicates with the broker, who communicates (and may itself generate) various bids and promotion opportunities back to the virtual wallet.
  • Auctions associated with the virtual wallet are not limited to retail checkout locations.
  • the virtual wallet can help find better deals on many other items and services.
  • a user can prompt the virtual wallet that they need gas. This may cause the virtual wallet to launch a search, auction and/or solicitation for the best possible deals.
  • the auction can consider the various cards and memberships that the user has in her wallet.
  • a user's wallet may include a Chevron rewards card and an American Express account. This information can be communicated to various financial vendors including Chevron and American Express (or their intermediaries). An incoming bid may be presented to the mobile device including additional gas points on the Chevron rewards card and/or a discount if the American Express card is used. If a local Chevron Station is running a promotion, such information can be communicated to the virtual wallet for presentation to the user as well.
  • the virtual wallet can be configured to communicate some or all details about a bid to a competing financial vendor - making the auction even more transparent to participating vendors.
  • a competing vendor may decide to alter their initial bid to sweeten the deal. For example, Shell may decide that they don't want to be outbid by Chevron, and they may send the virtual wallet a bid that is lower, includes more rewards, or otherwise try to seduce the user. Shell's response can be sent back to Chevron, or Chevron's intermediary, who may decide to sweeten their bid in response.
  • the auction can be geographically constricted, e.g., only gas stations within a pre-determined number of miles from a user are considered for an auction.
  • the virtual wallet can determine which stations meet this location criteria by cooperation with one of the many available software apps that determine such stations based on a user's location (e.g., Google Maps, GasBuddy, etc.). Once a station is chosen, the virtual wallet may launch mapping software on the mobile device, pass into the mapping software a winning station's address or GPS coordinates, so that the user can have step-by-step driving directions to the station.
  • the destination address, or the turn by turn instructions can simply be passed to the control system of a self-driving vehicle, which can drive itself to the gas station, and complete the transaction.
  • the virtual wallet may initiate an auction or solicitation based on other factors. For example, GPS coordinates may indicate that the user is located at or approaching a gas station. An auction may be launched based on such proximity information.
  • similarly, vehicle warnings may be communicated to the user's device (e.g., via a Bluetooth pairing between the car and mobile phone) and used by the virtual wallet to initiate an auction to provide the best deals to address the warning.
  • the virtual wallet need not completely reside on a user's smartphone.
  • components of such may be distributed to the cloud, or to other available devices for processing.
  • for example, a virtual wallet may hand off some or all of its processing to a car's onboard computer.
  • a wallet shell resides on the cell phone.
  • the shell includes, e.g., graphic drivers and user interfaces to allow device display, user input and communication with a remote location. Credit card information and other wallet contents are stored remotely, e.g., in the cloud.
  • a virtual wallet may cause a digital watermark detector (or fingerprint generator) to analyze background audio in a background collection mode.
  • a detector or generator may analyze audio accompanying radio, internet, TV or movies, to decode watermarks (or calculate fingerprints) without requiring human intervention.
  • the audio may include watermarks (or be processed to yield fingerprints) that link to information associated with advertising, store promotional, coupons, etc. (Instead of audio, the background collection mode may capture video or still imagery; such video or imagery may be processed to yield information.)
  • This information can be stored in the virtual wallet, e.g., according to store identifier, location, event, etc. In other embodiments, this information is stored in the cloud for access by the virtual wallet.
  • the virtual wallet can receive location or retail information, e.g., included in a signal emanating from an iBeacon, audio source or captured from imagery provided by the store (e.g., an in-store display, poster, etc.).
  • the virtual wallet may use received location or retail information to search through stored or previously encountered audio or video derived information.
  • the virtual wallet can prompt the user if discounts, coupons, promotions are found, and may apply any such discounts/coupons at checkout.
  • the virtual wallet may also access a store map or product in-store location to help the user navigate to those products for which the virtual wallet has discounts or coupons. These may correspond to previously encountered advertising which the wallet has collected or caused to be stored.
  • Some embodiments benefit from using a relatively large payload (e.g., 500-2,500 bits) during a virtual wallet transaction.
  • the payload can be carried in a digital watermark that is embedded in displayed imagery or video, encoded in hearing range audio, or transmitted using a high frequency audio channel.
  • the payload may correspond with credit card or financial information (e.g., ISO/IEC 7813 information like track 1 and track 2 information), account information, loyalty information, etc.
  • Payload information may be stored or generated locally on a smartphone, or the smartphone may query a remotely-located repository to obtain such.
  • the remotely located repository provides a 1-time token which can be used for a single (sometimes application specific) transaction.
  • a token replaces or is a proxy for a credit card or account number, and is conveyed as payload information.
  • the 1-time token can be cryptographically associated with a user account or user payment.
  • a user presents their portable device to a point of sale station which includes an optical reader or digital camera.
  • the point of sale station is a portable device, e.g., like a smartphone, pad or tablet.
  • the user's portable device displays digital watermarked imagery on the device's display for capture by the station's reader or camera.
  • the displayed imagery can be a still image, e.g., an image or graphic representing a credit card, a picture of the family dog, an animation, etc.
  • a virtual wallet can be configured to control the display of the image or graphic so that multiple frames (or versions) of the same still image or graphic are cycled on the display.
  • the displayed images appear as if they are collectively a static image, and not a video-like rendering.
  • Each instance of the displayed image or graphic carries a payload component.
  • For example, a first displayed image carries a first payload component, a second displayed image carries a second payload component, and so on, up through the nth-displayed image, which carries an nth payload component (where n is an integer). Since the only change to each displayed image is a different payload component, which is generally hidden from human observation with digital watermarking, the displayed images appear static - as if they are collectively a single image - to a human observer of the smartphone display.
  • a decoder can be configured to analyze each separate image to decode the payload component located therein.
  • the payload components can take various forms.
  • a relatively large payload is segmented or divided into various portions.
  • the portions themselves can be used as the various components, or they can be processed for greater robustness, e.g., error correction encoded, and then used as the various payload components.
  • a first portion is provided as the first payload component, which is embedded with digital watermarking in the first image for display
  • a second portion is provided as the second payload component, which is embedded with digital watermarking in a second image for display, and so on.
  • each of the various payload portions includes, is appended to include, or is otherwise associated or supplemented with a relative payload position or portion identifier. This will help identify the particular payload portion when reassembling the whole payload upon detection.
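  • A minimal sketch of this segmentation step follows, assuming a two-byte "portion i of n" header and a fixed portion size (both illustrative); the digital watermark embedding of each component into an image version is not shown.

```python
# Sketch of dividing a large payload into per-frame components, each tagged
# with a "portion i of n" identifier so the detector can reassemble the whole
# payload regardless of which frame it decodes first. Sizes are illustrative.
def make_payload_components(payload: bytes, portion_size: int = 16):
    """Split `payload` into portions and prepend (index, count) headers."""
    portions = [payload[i:i + portion_size]
                for i in range(0, len(payload), portion_size)]
    n = len(portions)
    return [bytes([idx, n]) + portion          # 2-byte header: "idx of n"
            for idx, portion in enumerate(portions)]

# Illustrative payload; each component would then be hidden (via digital
# watermarking) in one of the cycled image versions.
payload = b"PAN=4111111111111111;NAME=J DOE;EXP=0218;CVV-TOKEN=83ab2f"
components = make_payload_components(payload)
print(f"{len(components)} components to cycle through the display")
print(components[0])   # 2-byte header (index, total count) + 16 payload bytes
```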
  • a watermark detector receives image data depicting a display (e.g., a smartphone display) captured over time. Capture of imagery can be synchronized with the cycled, displayed images.
  • the watermark detector analyzes captured images or video frames to detect digital watermarks hidden therein.
  • a hidden digital watermark includes a payload component.
  • the payload component corresponds to a payload portion and carries or is accompanied by a portion identifier (e.g., 1 of 12, or 3 of 12, etc.).
  • the watermark detector, or a processor associated with such detector combines decoded payload components and attempts to reconstruct the whole payload. For example, the payload portions may need to simply be concatenated to yield the entire payload.
  • the payload may need to be decrypted or decoded.
  • the detector or processor tracks the portion identifiers, and may prompt ongoing detection until all payload portions are successfully recovered. If the detector misses a payload component (e.g., 3 of 12), it preferably waits until that component is cycled back through the display and successfully captured and decoded, or it may communicate to the display device which component it still needs, e.g., payload component 3 of 12.
  • the 12 image versions can be repeatedly cycled through the display, e.g., for a predetermined time (e.g., 3-30 seconds) or until stopped by the user or point of sale station communicating a successful read back to the virtual wallet. If the display has a frame rate of 24 frames per second, then the 12 embedded image versions can be collectively cycled twice per second (or more or less depending on display frame rates).
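  • The detector-side reassembly can be sketched as below, pairing with the component framing assumed in the earlier sketch: components are collected by portion identifier, missing portions are reported, and the payload is concatenated once every portion has been decoded.

```python
# Sketch of the detector-side reassembly described above: payload components
# decoded from successive captured frames are collected by portion identifier
# until all portions are present, then concatenated into the whole payload.
# (Pairs with the 2-byte "idx of total" framing sketched earlier; assumed.)
def reassemble(decoded_components):
    """`decoded_components` is an iterable of components as frames are decoded,
    possibly with repeats and in arbitrary order. Returns the payload once
    every portion has been seen, else None."""
    seen = {}
    expected = None
    for comp in decoded_components:
        idx, total = comp[0], comp[1]        # 2-byte "idx of total" header
        expected = total
        seen[idx] = comp[2:]                 # keep the payload portion
        if len(seen) == expected:            # all portions recovered
            return b"".join(seen[i] for i in range(expected))
    missing = sorted(set(range(expected or 0)) - set(seen))
    print("still waiting for portions:", missing)   # e.g., ask for 3 of 12 again
    return None

# Example: frames arrive out of order and one portion (index 2) is missed.
frames = [bytes([1, 3]) + b"portionB", bytes([0, 3]) + b"portionA"]
print(reassemble(frames))                    # None, portion 2 still missing
frames.append(bytes([2, 3]) + b"portionC")
print(reassemble(frames))                    # b'portionAportionBportionC'
```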
  • Fountain codes are record-breaking sparse-graph codes for channels with erasures, such as the internet, where files are transmitted in multiple small packets, each of which is either received without error or not received. Standard file transfer protocols simply chop a file up into K packet sized pieces, then repeatedly transmit each packet until it is successfully received. A back channel is required for the transmitter to find out which packets need retransmitting. In contrast, fountain codes make packets that are random functions of the whole file. The transmitter sprays packets at the receiver without any knowledge of which packets are received. Once the receiver has received any N packets, where N is just slightly greater than the original file size K, the whole file can be recovered.
  • Fountain codes are rateless in the sense that the number of encoded packets that can be generated from the source message is potentially limitless; and the number of encoded packets generated can be determined on the fly.
  • Fountain codes are universal because they are simultaneously near-optimal for every erasure channel. Regardless of the statistics of the erasure events on the channel, we can send as many encoded packets as are needed in order for the decoder to recover the source data.
  • the source data can be decoded from any set of K′ encoded packets, for K′ slightly larger than K.
  • Fountain codes can also have fantastically small encoding and decoding complexities.”
  • fountain codes can transform a payload into an effectively large number of encoded data blobs (or components), such that the original payload can be reassembled from any subset of those data blobs, as long as the subset totals the size of the original payload, or a little more. This provides a "fountain" of encoded data; a receiver can reassemble the payload by catching enough "drops," regardless of which ones it gets and which ones it misses.
  • In some embodiments, erasure codes (e.g., fountain codes) are used to convey the payload.
  • the relatively large payload can be presented to a fountain code encoder, which creates a plurality of encoded data blobs (e.g., encoded payload components).
  • each encoded data blob is accompanied with an index or seed.
  • the index or seed allows the decoder to use a complementary decoding procedure to reconstruct the payload.
  • the encoder and decoder may agree on a pseudo-random number generator (or an indexed-based matrix generator).
  • the generator includes an n×n random bit non-singular matrix, where n is the payload's bit length.
  • the matrix can be processed with a dot product of the payload, which yields yN outputs.
  • An index can be associated with each yN output, to allow reconstruction by the decoder.
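  • The sketch below implements a random linear fountain code in the spirit of this indexed-generator scheme: each data blob is the XOR of a pseudo-random subset of the payload's source blocks, chosen from the conveyed index, and slightly more than K independent blobs suffice to recover the payload. The block size and PRNG-driven subset rule are illustrative assumptions, not the exact matrix construction described above.

```python
# Random linear fountain code sketch: encoded blobs carry an index (seed) from
# which both sides regenerate the same pseudo-random subset of source blocks.
import random

BLOCK = 8  # bytes per source block (illustrative)

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def source_blocks(payload: bytes):
    """Pad the payload and split it into fixed-size source blocks."""
    padded = payload + b"\x00" * (-len(payload) % BLOCK)
    return [padded[i:i + BLOCK] for i in range(0, len(padded), BLOCK)]

def subset_for(index: int, k: int):
    """Pseudo-random nonempty subset of source-block indices for this index."""
    rng = random.Random(index)
    chosen = [i for i in range(k) if rng.random() < 0.5]
    return chosen or [rng.randrange(k)]

def encode_blob(blocks, index: int):
    """One fountain-coded output: (index, XOR of the chosen source blocks)."""
    out = bytes(BLOCK)
    for i in subset_for(index, len(blocks)):
        out = xor(out, blocks[i])
    return index, out

def decode(blobs, k: int):
    """Gaussian elimination over GF(2); returns the k source blocks, or None
    if the received blobs do not yet span the whole payload."""
    rows = [[set(subset_for(idx, k)), data] for idx, data in blobs]
    pivot_rows = {}
    for p in range(k):
        i = next((i for i, r in enumerate(rows) if p in r[0]), None)
        if i is None:
            return None
        s, v = rows.pop(i)
        for r in rows:                       # eliminate unknown p from the rest
            if p in r[0]:
                r[0] ^= s
                r[1] = xor(r[1], v)
        pivot_rows[p] = (s, v)
    solved = [None] * k
    for p in reversed(range(k)):             # back-substitution
        s, v = pivot_rows[p]
        for q in s:
            if q != p:
                v = xor(v, solved[q])
        solved[p] = v
    return solved

payload = b"Illustrative 500+ bit wallet payload: track data or a one-time token"
blocks = source_blocks(payload)
k = len(blocks)
blobs = [encode_blob(blocks, index) for index in range(k + 8)]   # a few spare blobs
recovered = decode(blobs, k)
if recovered is None:
    print("not yet decodable - keep cycling more indexed blobs")
else:
    print(b"".join(recovered)[:len(payload)] == payload)         # True
```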
  • Payload 170 is presented to a Fountain Code Generator 171.
  • Other types of erasure code generators may be used instead, e.g., Raptor Codes or LT codes (Luby Transform codes).
  • the payload 170 can be a relatively large payload (e.g., in comparison to other, smaller digital watermarking payloads).
  • Payload 170 preferably includes, e.g., 500-8k bits.
  • Payload 170 may include or may be appended to include additional error correction bits, e.g., CRC bits. Additional CRC bits can be added to the 880 bit payload example, e.g., 32 additional bits.
  • Fountain Code Generator 171 produces a plurality of coded outputs (or data blobs), e.g., Y1-YN.
  • Data blob outputs are provided to a Digital Watermark Embedder 172.
  • Digital Watermark Embedder 172 uses the data blob outputs as payloads to be respectively hidden in image versions (I1-IN).
  • An image version may correspond to a copy or buffered version of a static (or still) Image (I) 174 that the user (or virtual wallet) has selected to represent a financial account or credit card or the like.
  • Alternatively, an image version may correspond to a video frame or video segment.
  • Digital Watermark Embedder 172 embeds a data blob (e.g., Y1) in an image version I1 and outputs such (resulting in watermarked image version Iw1) for display by Display 173.
  • Digital Watermark Embedder 172 continues to embed data blobs in image versions, e.g., Y2 in I2 and output (Iw2) for display, Y3 in I3 and output (Iw3) for display, and so on.
  • Parallel processing may be advantageously used to embed multiple image versions in parallel.
  • Digital Watermark Embedder 172 delegates embedding functions to other units.
  • Display 173 may include or cooperate with a GPU (graphics processing unit).
  • Watermark Embedder 172 may determine watermark tweaks (or changes) corresponding to embedding an output data blob in an image version and pass that information onto the GPU, which introduces the changes in an image version.
  • Digital Watermark Embedder 172 may calculate a watermark tile (e.g., a watermark signal representing an output data blob) and convey such to another unit like the GPU.
  • the GPU may then consider other factors like a perceptual embedding map or human attention model and introduce the watermark tile in an image version with consideration of the map or model.
  • the Fountain Code Generator 171, Digital Watermark Embedder 172 and image (I) may be housed and operated in a portable device like the smartphone which includes Display 173.
  • a portable device hosting the Display 173 communicates with a remotely- located device that hosts the Fountain Code Generator 171, Digital Watermark Embedder 172 and/or Image 174.
  • Embedded image versions Iw1...IwN may be stored or buffered for cycling for display on Display 173. For example, if 24 image versions are embedded with data blobs, and if Display 173 has a frame rate of 24 frames per second, then the 24 embedded image versions can be collectively cycled once per second (each image version is shown for 1/24th of a second).
  • Embedded image versions can be repeatedly cycled through the display one after another, e.g., for a predetermined time (e.g., 5-10 seconds) or until stopped by the user or point of sale terminal. For example, the user or terminal may communicate a successful read to the virtual wallet, which terminates display.
  • To a human observer, a static image appears to be displayed, since the changes among the different image versions are digital watermarking, which is generally imperceptible to the human eye. This can be referred to as a "static image display effect".
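  • As a rough sketch of the cycling behavior described above, the following Python fragment shows embedded image versions being shown round-robin at the display frame rate; embed_blob() and show() are hypothetical placeholders, not actual embedder or display APIs.

```python
import time

def cycle_embedded_versions(image, blobs, embed_blob, show, fps=24, duration_s=5.0):
    """Show one watermarked version per frame so the sequence looks like a still image."""
    # Pre-embed one image version per data blob (could be parallelized or GPU-assisted).
    versions = [embed_blob(image, blob) for blob in blobs]
    deadline = time.time() + duration_s
    i = 0
    while time.time() < deadline:       # or loop until the terminal signals a good read
        show(versions[i % len(versions)])
        i += 1
        time.sleep(1.0 / fps)           # each version stays on screen ~1/fps second
```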
  • one configuration includes a non-singular random binary n x n matrix, where n is the payload's bit length. So, for the above 880-bit payload (912 bits including CRC bits) example, a 912x912 matrix is provided. The matrix can be processed with a dot product of the payload (912 bits) to yield outputs y1-yN. Continuing this example, fountain code outputs each include, e.g., 120 bits. A matrix index can be combined with the outputs, adding, e.g., 5 additional bits per output. The index can be specifically associated with individual outputs yN, can be associated with a group of y outputs, and/or can be associated with the matrix itself.
  • the 125 bits can be error protected, e.g., by appending CRC bits (e.g., 24 bits for a total output data blob YN bit count of 149 bits per data blob). Error protection can be provided by the Fountain Code Generator 171 or the Digital Watermark Embedder 172, or both. For a typical application, about 6-180 data blobs can be used to reconstruct a message. In the 880-bit payload example, if 32 output blobs are used, then 32 corresponding image versions (each individual image version having one of the 32 data blobs digitally watermarked therein) can be embedded in separate versions of the image for display on the smartphone as discussed above.
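  • The bit accounting in this example can be summarized with the short sketch below; Python's 32-bit binascii.crc32 stands in for the 24-bit per-blob CRC, so the field widths are illustrative only.

```python
import binascii

PAYLOAD_BITS = 880
PAYLOAD_CRC_BITS = 32                      # appended before erasure coding -> 912 bits
BLOB_DATA_BITS = 120
INDEX_BITS = 5
BLOB_CRC_BITS = 24
BLOB_TOTAL_BITS = BLOB_DATA_BITS + INDEX_BITS + BLOB_CRC_BITS   # 149 bits per data blob
print(PAYLOAD_BITS + PAYLOAD_CRC_BITS, BLOB_TOTAL_BITS)         # 912, 149

def frame_blob(index: int, data: bytes) -> bytes:
    """Frame one data blob: 1-byte index, blob data, then a CRC over both."""
    body = bytes([index]) + data
    return body + binascii.crc32(body).to_bytes(4, "big")
```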
  • the Fountain Code Generator 171 can be configured to operate on longer codes, such as with Galois Fields (e.g., GF(256)) discussed in US Patent Nos. 7,412,641, 7,971,129 and 8,006,160.
  • constructing the payload can begin as soon as a data blob has been decoded from a digital watermark. That is, not all data blobs need to be recovered first before payload reconstruction is initiated with a corresponding erasure code decoder (e.g., in one above example, a corresponding non-singular matrix).
  • a payload may be segmented prior to fountain code encoding, with each segment having a corresponding number of output blobs.
  • other related coding schemes can be used with cycling imagery (including video frames) such as Raptor codes and LT codes.
  • a perceptual map can be calculated once, and then reused for each embedding of the image versions.
  • the map can be generated as soon as a user identifies an image to be used as a transaction graphic, e.g., during registration or virtual wallet set up, which occur prior to transactions.
  • Another way to avoid visual perceptibility of embedded watermarks is to vary embedding strengths based on timing or device sensor feedback. For example, a user may instruct their virtual wallet to display an image for optical sensing.
  • the displayed, cycled images may be embedded with a relatively lower embedding strength for a predetermined time, e.g., the first 0-3 seconds, which may correspond to the average time it takes a user to present the smartphone display to an optical reader. Then, for a second time period, e.g., for the next 3-7 seconds, the watermark strength of the displayed, cycled images is pumped up to a relatively stronger level since the display will be pointed at the optical reader, away from human observation.
  • the embedding strength may depend on device sensor feedback. For example, after initiating display of imagery, the smartphone may use gyroscope information to make embedding strength decisions. For example, after first movement (corresponding to positioning the display to an optical reader), the embedding strength may be increased, and after one or more movement detections, the embedding strength may be decreased (e.g., corresponding to movement away from the camera).
  • gyroscope movements can be analyzed to identify user tendencies, and the embedder can be trained to recognize such movements to optimize watermark embedding strength.
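  • A minimal sketch of such strength scheduling, assuming a simple time threshold and a movement-event count supplied by the gyroscope handling code (both assumptions, not values specified above):

```python
import time

def embedding_strength(t_start, movement_events, low=0.3, high=1.0, ramp_after_s=3.0):
    """Pick a relative watermark strength from elapsed time and gyroscope movement count."""
    elapsed = time.time() - t_start
    if elapsed < ramp_after_s:
        return low        # first few seconds: display likely still visible to bystanders
    if movement_events >= 2:
        return low        # repeated movement suggests the phone was pulled away again
    return high           # presumed to be facing the optical reader
```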
  • Some operating systems limit user accessibility to camera-captured imagery to accommodate, e.g., post-processing. For example, a user may only have access to 24-30 fps.
  • a watermark detector is given access to a higher frame rate, e.g., 70-120 frames per second. Watermark embedding and detection are synchronized such that digital watermarking can be embedded so that it can only be read from this higher frame rate. In other cases, additional information is obtained from the high frame rate detection, while still embedding some information for detection from a lower, standard frame rate.
  • Some of the above embodiments discuss a virtual wallet operating on a smartphone to cause display of a relatively large payload.
  • Our inventive techniques can be applied in a reverse manner, e.g., to a point of sale display which displays cycling imagery to a user's smartphone.
  • a payload can be communicated from the point of sale to a smartphone's virtual wallet. This may be used as a confirmation of a transaction, or it may be a transaction identifier which can be communicated by the smartphone to a 3rd party (e.g., a credit card vendor, a PayPal-like service, etc.).
  • the transaction identifier can be supplemented with account information by the virtual wallet to identify an account associated with the virtual wallet.
  • the 3rd party uses the transaction identifier and the account information to facilitate payment to the vendor.
  • a confirmation of payment can be transmitted to the vendor (e.g., using information included in or associated with the transaction identifier) and/or to the virtual wallet.
  • Some users may prefer this system since financial information is not transmitted from the user to the retailer, but from the retailer to the user, and then to the 3rd party.
  • In another embodiment, we use high frequency audio to convey a relatively large payload for use in a virtual wallet transaction.
  • The smartphone includes a transmitter (e.g., a speaker). The transmitter emits high frequency audio to a receiver.
  • the high frequency audio includes a relatively large payload.
  • At a point of sale checkout, the smartphone is positioned in proximity of a receiver at the point of sale location. High frequency audio is emitted from the smartphone, which is received by the point of sale receiver.
  • the payload is decoded from the received audio, and the transaction proceeds.
  • a high frequency (HF) audio channel or an audible audio channel can be used to establish bi-directional communication between a virtual wallet and a point of sale location.
  • a financial transaction can proceed once communication is established.
  • a virtual wallet can cause its host smartphone to transmit a known high frequency audio message, e.g., the message is known to both the virtual wallet and to a receiver.
  • the receiver determines signal errors or a measure of signal error and communicates such back to the smartphone.
  • the return communication can use Bluetooth, high frequency audio, radio frequency or audible range audio, or the like.
  • the virtual wallet uses this return error signal to adjust (e.g., increase or decrease), if needed, the level of error correction and/or signal strength for its next transmitted audio signal, e.g., when transmitting a payload.
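  • A hedged sketch of this calibration exchange, with the audio transmit and error-report functions left as hypothetical hooks:

```python
KNOWN_PROBE = b"\xa5" * 16                 # probe message known to wallet and receiver

def choose_tx_parameters(bit_error_rate):
    """Map a reported error measure to (error-correction level, relative volume)."""
    if bit_error_rate < 0.01:
        return "light_fec", 0.5
    if bit_error_rate < 0.05:
        return "medium_fec", 0.8
    return "heavy_fec", 1.0

def calibrate_and_send(payload, transmit_hf_audio, receive_error_report):
    """Send the known probe, read back the error report, then send the real payload."""
    transmit_hf_audio(KNOWN_PROBE, fec="light_fec", volume=0.5)
    bit_error_rate = receive_error_report()   # returned via Bluetooth, HF audio, RF, etc.
    fec, volume = choose_tx_parameters(bit_error_rate)
    transmit_hf_audio(payload, fec=fec, volume=volume)
```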
  • the payload may correspond to various information including account information, encrypted information and/or tokens as discussed above.
  • a point of sale receiver expects both captured audio and captured imagery to process or complete a financial transaction.
  • a virtual wallet can cause imagery to be cycled on its display, as discussed above.
  • a high frequency audio signal is generated to cooperate with presented imagery.
  • presented imagery may include financial credit card or account information
  • the transmitted high frequency audio signal may include an associated PIN for the financial information, an encryption key to decrypt the imagery payload, or an expected hash of the imagery payload.
  • the transaction can be conditioned on verifying an expected hash or other expected value.
  • a video or image watermark signal includes a key, PIN or hash that is associated with an audio signal payload.
  • the point of sale receiver may request, e.g., through a high frequency audio channel, that the virtual wallet transmit the corresponding audio message once the imagery is successfully received.
  • a transmitted audio signal (including, e.g., the pin, hash or key) may prompt a receiver to enable its camera to capture a to-be-presented display screen.
  • HF audio is used to help ensure that communication is taking place between a point of sale terminal (POS) and a device within some given distance, say, e.g., 1-6 feet.
  • The POS and a mobile device exchange public keys.
  • the public keys help establish a secure protocol. But even with the exchange of keys, the POS terminal does not know how far away the mobile device is, so it might be possible to spoof one or the other of the mobile device or the POS.
  • a ranging protocol test including three or more HF audio messages preferably takes place in the following manner (a timing sketch follows these steps):
  • the POS transmits a PN code, encrypted with its private key.
  • An example length could be, e.g., 128 bits.
  • the mobile device decrypts the PN code using the public key of the POS.
  • the POS transmits a different PN code, this time unencrypted, and of the same length as the previous PN code.
  • Upon receipt of the second PN code, the mobile device calculates the XOR (or dot product or other combination) of the two PN codes and transmits the result back to the POS.
  • the POS receives the XOR'ed values, verifies them, and also verifies that the time delay between POS transmit and POS receive of the last two messages is less than the time required for sound to travel the expected distance (e.g., round trip of 6 feet) plus some nominal processing time. Processing time can be minimized by transmitting in full duplex mode between the device and POS.
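  • The timing check can be sketched as follows; the cryptographic signing and audio I/O are abstracted as hypothetical hooks, and the distance and processing budgets are illustrative assumptions:

```python
import os, time

SPEED_OF_SOUND_M_S = 343.0
MAX_DISTANCE_M = 1.8                       # roughly 6 feet
PROCESSING_BUDGET_S = 0.05                 # nominal device processing allowance

def pos_ranging_check(send_audio, recv_audio, sign_with_private_key):
    """POS-side test: accept only if the XOR reply is correct and arrives in time."""
    pn1 = os.urandom(16)                   # e.g., a 128-bit PN code
    pn2 = os.urandom(16)
    send_audio(sign_with_private_key(pn1)) # device recovers pn1 with the POS public key
    t0 = time.time()
    send_audio(pn2)                        # second PN code, sent in the clear
    reply = recv_audio()                   # device replies with pn1 XOR pn2
    elapsed = time.time() - t0
    expected = bytes(a ^ b for a, b in zip(pn1, pn2))
    budget = 2 * MAX_DISTANCE_M / SPEED_OF_SOUND_M_S + PROCESSING_BUDGET_S
    return reply == expected and elapsed <= budget
```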
  • Another audio safeguard is to use directional speakers to convey the audio signal.
  • One example is a parametric speaker that uses ultrasonic carrier waves to transmit audio to listeners with a focused beam of sound. Since the beam is focused, only receivers in front of the parametric speaker can adequately detect the transmitted audio. SoundLazer in the United States provides example speakers.
  • In a "forward" transaction embodiment, a mobile device 181 communicates payload information to a point of sale terminal (POS) 180.
  • This transaction may look like a conventional card payment transaction, e.g., as discussed in US Patent No. 8,099,368 (with reference to FIG. 1).
  • embedded image data or encoded audio data to communicate user payment information and additional data like date, time, geolocation, etc.
  • a watermark payload for this transaction scenario is likely a relatively large payload, e.g., including account information, credit card or a proxy of such (e.g., 1-time token).
  • the above payload encoding techniques can be effectively used in this forward transaction embodiment.
  • the payload can be presented on a display of mobile device 181, communicated with a HF audio signal transmitted by mobile device 181, or a combination of both, e.g., as discussed above in the "Message Payloads and More" section.
  • POS 180 receives the payload from mobile device 181 and communicates such to a transaction clearing house 182 (e.g., like a credit card processor, card issuer, etc.).
  • the clearing house determines whether the payment is authorized and returns an authorization or denial back to POS 180.
  • a virtual wallet may include a virtual representation of an identification document (ID).
  • the virtual representation includes age information that may be validated by the POS - or a service cooperating with the POS - to determine whether the shopper is of a certain age.
  • the age information may include a cryptographic signature or data blob that can be processed by the POS (or sent to a remote service for further processing) to determine or verify the shopper's age. If alcohol is scanned during checkout the shopper can be prompted to present their virtual ID.
  • the ID can be selected by the shopper via the mobile device 181 user interface (e.g., swiping screens until the ID graphic is found). Once found or selected, the virtual ID can be displayed on the mobile device's display for presentation to the POS's camera.
  • the virtual ID can communicate age information through digital watermarking embedded in a displayed image or in a graphical representation of a driver's license or other ID credential.
  • ID information is conveyed through an audio signal, e.g., a HF audio signal.
  • the POS or cooperating service can also verify whether the credential is authentic.
  • Another "forward" transaction involves a medium to smaller payload. For example, when communicating a payload including specific account information (e.g., like a retailer's stored value accounts, account no, loyal card, etc.). This transaction may even involve less sophisticated cameras, e.g., still cameras or low frame per second capture cameras. This information can be embedded, perhaps, in a single image frame or over a few frames.
  • Such a payload can also be conveyed with an audio signal (e.g., HF audio).
  • a POS Once received by a POS, the transaction is processed by communicating the payload to a web service or network based processor. In some cases, the POS does not decode a received payload but merely communicates it along to the web service or processor for them to decode. The POS awaits authorization to allow the transaction.
  • the present techniques work well with peer-2-peer devices.
  • the POS terminal may be embodied in a mobile device, equipped with a camera and microphone.
  • the POS includes a display screen on which digitally watermarked information is displayed during checkout.
  • the digital watermark information may include, e.g., transaction identifier, checkout station, merchant/payee identifier, cost, and/or additional data such as date, time, geolocation, etc.
  • Mobile device 181 captures imagery of the display with its camera and analyzes such to detect the hidden digital watermarking information.
  • the digital watermarking information is decoded and communicated, preferably along with user-selected account or payment information stored in her virtual wallet, to a remote 3rd party who facilitates the transaction. For example, the 3rd party verifies the shopper account or payment information and determines whether to authorize the transaction.
  • the authorization/denial can be communicated directly back to the POS 180 from the third party, or an authorization token can be transmitted back to the mobile device, which communicates such to the POS.
  • the POS can analyze the token, or call a service to analyze such for them, to verify the authentication.
  • the 3rd party may prompt the user to confirm the transaction.
  • the 3rd party may provide a verification clue (e.g., a user-preselected, arbitrary image) to the user to help ensure trust, provide the amount to be authorized (e.g., $88.17) and ask the user to click "yes" or "no" to authorize.
  • the verification clue may have been selected or provided by the user during account registration.
  • Receiving the verification clue from the 3rd party provides another level of security. Instead of clicking a UI graphic box, the user may shake the phone in a predetermined manner to authorize or decline the transaction.
  • the mobile device's gyroscope provides relative movement for the virtual wallet to interpret.
  • a static watermarked image or audio source can be located at checkout and scanned or microphone captured by the mobile device to initiate payment in the cloud.
  • the static watermarked image or audio may include information such as checkout station, merchant/payee identifier, retail location, etc.
  • the mobile device decodes the digital watermarking to obtain the static information, combines this with user-selected account or payment information from the virtual wallet, and communicates the combined information to the 3rd party clearing house.
  • the mobile device can also communicate a timestamp as well.
  • the POS can also communicate the transaction amount, checkout station identifier, retailer identifier, etc. along with a timestamp to the 3rd party clearing house.
  • the 3rd party clearing house marries the POS information with the mobile device information (e.g., by matching up retailer identifiers and timestamps) and determines whether to authorize the transaction. Like above, the mobile device can be prompted by the 3rd party to confirm payment or authorization. Once authorized, the 3rd party transmits an authorization code to the POS directly or through the mobile device.
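  • A rough sketch of the clearing house's matching step, with illustrative field names and an assumed matching window (neither is specified above):

```python
from datetime import timedelta

MATCH_WINDOW = timedelta(seconds=90)       # assumed tolerance for timestamp matching

def match_reports(pos_report, wallet_reports):
    """Pair a POS report with the wallet report from the same retailer, close in time."""
    for w in wallet_reports:
        same_retailer = w["retailer_id"] == pos_report["retailer_id"]
        close_in_time = abs(w["timestamp"] - pos_report["timestamp"]) <= MATCH_WINDOW
        if same_retailer and close_in_time:
            return w
    return None

def authorize(pos_report, wallet_reports, verify_account):
    w = match_reports(pos_report, wallet_reports)
    if w is None:
        return "deny"
    # e.g., confirm the account is valid and covers the reported amount
    return "authorize" if verify_account(w["account"], pos_report["amount"]) else "deny"
```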
  • Beyond payments involving a 3rd party, conveyed information can also facilitate system or physical access.
  • the system may include, e.g., a mobile device, laptop, desktop, web service, remote database or cloud processor, communications network, etc.
  • a user can select a card (e.g., a graphic) from their virtual wallet.
  • the selected card When displayed on a mobile device display, the selected card (or plural displayed versions of the card) includes digital watermarking hidden therein.
  • the watermarking conveys information to facilitate system access.
  • the system includes a camera which captures imagery corresponding to the mobile device display. Captured imagery is analyzed to decode the digital watermarking information. The information is compared to stored, expected information to determine whether to allow access. The information or a portion of the information may have a cryptographic relationship with the stored, expected information.
  • the virtual wallet may generate or receive a 1-time token which is time dependent. This 1-time token can be analyzed by the system (which has access to a corresponding key or token) to determine whether to allow access.
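  • One way such a time-dependent 1-time token could work is sketched below in the spirit of TOTP; this is an assumption for illustration, not a scheme mandated by the text:

```python
import hmac, hashlib, struct, time

def one_time_token(shared_secret: bytes, window_s: int = 30) -> str:
    """Derive a short-lived code from the current time window and a shared secret."""
    counter = int(time.time() // window_s)
    digest = hmac.new(shared_secret, struct.pack(">Q", counter), hashlib.sha256).digest()
    return digest[:4].hex()                # short code conveyed via watermark or HF audio

def verify_token(shared_secret: bytes, presented: str, window_s: int = 30) -> bool:
    """Accept the current window and its neighbors to absorb clock skew."""
    now = int(time.time() // window_s)
    for counter in (now - 1, now, now + 1):
        digest = hmac.new(shared_secret, struct.pack(">Q", counter), hashlib.sha256).digest()
        if hmac.compare_digest(digest[:4].hex(), presented):
            return True
    return False
```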
  • the virtual wallet may prompt for user input prior to displaying a selected card. For example, the user may be prompted to swipe a finger on a mobile device's fingerprint reader, or show an eye to a camera for retina detection, or enter a password or PIN.
  • the virtual wallet may cause the home screen, background or locked screen on a mobile device to include the system access digital watermark information.
  • a virtual wallet may include a setting to embed such screens or backgrounds with digital watermarking. This will allow a user to show the screens without accessing the virtual wallet interface and scrolling through to find the access card representation.
  • a virtual wallet may cause a speaker to emit a HF audio signal which includes system access information.
  • a system microphone captures the HF audio and the system analyzes such to decode the information therefrom.
  • system access requires a combination of both audio and imagery.
  • the visual constructs provided above can also be utilized both in a wristwatch form-factor, and for users wearing glasses.
  • the paradigm of card selection can leverage the inherent properties of a watch form factor to facilitate selection.
  • One implementation may consist of the user running a finger around the bezel (device presumed to be circular for this example), to effect scrolling through the stack of cards.
  • Simple motion of the watch may facilitate the same navigation by tilting the watch (e.g., rotation at the wrist). Payment would be facilitated the same way by showing the wearer's wrist watch to the cooperating device.
  • the selection and validation process may occur through gaze tracking, blinking or any other known UI construct.
  • For users wearing glasses, the wallet may cooperate with a secondary digital device containing a display (a smartphone, a digitally connected watch such as the Pebble, or possibly a media player).
  • the selected card would be rendered on the secondary device to complete the transaction as before.
  • a portable user device can project a display, for sensing by the POS system
  • Capturing imagery with eyewear may have additional benefits. For example, when capturing imagery of a point of sale (POS) display (e.g., at a reverse pathway checkout station mentioned above) a user may place a finger or subset of fingers in the eyewear's field of view. The camera captures the fingers (including the fingerprints) in the same image frames when it captures the display.
  • a virtual wallet or a processor in communication with a virtual wallet may process captured imagery. Digital watermarks are decoded from the imagery corresponding to the display, and human fingerprint recognition is used to determine if the fingerprints correspond to an owner or authorized user of the virtual wallet. Authorization of a transaction can be conditioned on a successful biometric match.
  • imagery is only captured (or is only used for the authentication or transaction) when a finger(s) or fingerprint(s) is detected in the field of view. This ensures that captured (or used) imagery will include a fingerprint for analysis.
  • Object recognition can analyze image data to detect the presence of a finger, and then collect those images that do.
  • FIG. 11 shows an arrangement in which a checkout tally is presented on the user's smartphone as items are identified and priced by a point of sale terminal.
  • a user "signs" the touchscreen with a finger to signify approval.
  • a signature is technically not required for most payment card transactions, but there are advantages to obtaining a user's signature approving a charge. For example, some transaction networks charge lower fees if the user's express affirmance is collected.
  • a finger-on- touchscreen signature lacks the fidelity of a pen-on-paper signature, but can still be distinctive.
  • a user's touchscreen signature can be collected. This signature, or its characterizing features, can be sent to one or more of the parties in the transaction authorization process shown in Fig. 5, who can use this initial signature data as reference information against which to judge signatures collected in subsequent transactions.
  • Signatures can include finger or facial biometrics, such as a thumbprint on the user's screen or capture of the face using camera functions, or a voiceprint, etc.
  • POS receipts detail items purchased in the order they are presented at checkout - which is perhaps the least useful order.
  • An excerpt from such a receipt is shown in Fig. 12A.
  • user preference information is stored in the phone and identifies the order in which items should be listed for that user.
  • Fig. 12B shows an alphabetical listing - permitting the user to quickly identify an item in the list.
  • Fig. 12C shows items listed by price - with the most expensive items topping the list, so that the user can quickly see where most of the money is being spent.
  • Fig. 12D breaks down the purchased items by reference to stored list data.
  • This list can be a listing of target foods that the user wants to include in a diet (e.g., foods in the Mediterranean diet), or it can be a shopping list that identifies items the user intended to purchase.
  • the first part of the Fig. 12D tally identifies items that are purchased from the list.
  • the second part of the tally identifies items on the list that were not purchased. (Some stores may provide "runners" who go out to the shelves to fetch an item forgotten by the shopper, so that it can be added to the purchased items before leaving the store.)
  • the third part of the Fig. 12D tally identifies items that were purchased but not on the list (e.g., impulse purchases).
  • An additional layer of security in mobile payment systems can make use of imagery, e.g., captured by the smartphone.
  • Figs. 13A - 13C illustrate one such arrangement, used to further secure an American Express card transaction.
  • the detailed arrangement is akin to the SiteKey system, marketed by RSA Data Security.
  • the phone sends related data to a cooperating system (which may be in data communication with American Express or RSA).
  • the cooperating system provides a challenge corresponding to that user/device/card for presentation on the phone screen.
  • This challenge includes an image and a SiteKey phrase.
  • the image is an excerpt of a quilt image, and the SiteKey is the name Mary Ann.
  • the image is drawn from the user's own photo collection, stored on the smartphone that is now engaged in the authentication process.
  • the user may have snapped a picture of the quilt while visiting a gift shop on vacation.
  • User-selection of one of the user's own images enables the user to select a SiteKey phrase that has some semantic relationship to the image (e.g., the user may have been with a friend Mary Ann when visiting the shop where the quilt was photographed).
  • the Descriptor is the word Napa. (Again, this word may be semantically related to the displayed image and/or the SiteKey. For example, it may have been during a vacation trip to Napa, California, that the user and Mary Ann visited the shop where the quilt was photographed.)
  • a cryptographic hash of the user-entered Descriptor is computed by the smartphone, and transmitted to the cooperating system for matching against reference Descriptor data earlier stored for that user's American Express account. If they match, a message is sent to the smartphone, causing it next to solicit the user's signature, as shown in Fig.
  • the signature screen may also include a tally of the items being purchased, or other transaction summary.
  • the transaction proceeds.
  • the user's image or a user-selected image may appear on the merchant's terminal screen permitting a challenge-response verification of identity by the store clerk.
  • a facial image can be manually checked and/or compared using facial biometrics algorithms.
  • Another challenge-response security system employs information harvested from one or more social network accounts of the user, rather than from the phone's image collection. For example, a user can be quizzed to name social network friends - information that may be protected from public inspection, but which was used in an enrollment phase. At both the enrollment phase, and in later use, the actual friends' names are not sent from the phone.
  • hashed data is used to permit the remote system to determine whether a user response (which may be selected from among several dummy data, as above) is a correct one.
  • Figs. 14 and 15 show a different authentication procedure.
  • a challenge image 141 is presented, and the user is instructed to tap one of plural candidate images to identify one that is related to the challenge image.
  • the correct, corresponding, image (142a in this case) is selected from the user's own collection of smartphone pictures (e.g., in the phone's Camera Roll data structure), as is the challenge image 141. If the user does not pick the correct candidate image from the presented array of images, the transaction is refused.
  • Fig. 15 details a preceding, enrollment, phase of operation, in which images are initially selected.
  • the user is instructed to pick one image from among those stored on the phone.
  • This user-picked image 141 is used as the reference image, and a copy of this image is sent to a cooperating system (e.g., at a bank or RSA Security).
  • the user is next instructed to pick several other images that are related to the reference image in some fashion. (For example, all of the picked images may have been captured during a particular vacation trip.) These latter images are not sent from the phone, but instead derivative data is sent, from which these pictures cannot be viewed.
  • the user selects images taken during the vacation to Napa.
  • An image of the quilt, photographed in the gift shop, is selected by the user as the reference image 141.
  • This picture is a good choice because it does not reveal private information of the user (e.g., it does not depict any family members, and it does not reveal any location information that might be sensitive), so the user is comfortable sharing the image with an authentication service.
  • the user picks several other images taken during the same trip for use as related, matching images.
  • the user-picked related images are indicated by a bold border. One shows two figures walking along a railroad track. Another shows a palm tree in front of a house.
  • image feature information may comprise, e.g., an image hash, or fingerprint, or color or texture or feature histograms, or information about dominant shapes and edges (e.g., content-based image descriptors of the sort commonly used by content-based image retrieval (CBIR) systems), etc.
  • This derived information is sent from the phone for storage at the authentication service, together with identifying information by which each such related image can be located on the user's smartphone. (E.g., file name, image date/time, check-sum, and/or image file size.)
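  • A simple example of such derived (non-reversible) information is sketched below, using a coarse color histogram plus file metadata; the descriptor fields are illustrative assumptions:

```python
import os
import numpy as np

def derived_descriptor(path, pixels):
    """pixels: HxWx3 uint8 array. Returns a descriptor plus locator info for the image."""
    hist, _ = np.histogramdd(
        pixels.reshape(-1, 3), bins=(4, 4, 4), range=((0, 256),) * 3)
    hist = hist.flatten() / hist.sum()             # 64-bin normalized RGB histogram
    return {
        "histogram": hist.tolist(),
        "file_name": os.path.basename(path),       # identifying info to relocate the image
        "file_size": os.path.getsize(path),
    }

def descriptors_match(a, b, threshold=0.9):
    """Cosine similarity between two stored histogram descriptors."""
    va, vb = np.asarray(a["histogram"]), np.asarray(b["histogram"])
    return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb) + 1e-9)) >= threshold
```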
  • the remote system when authentication is required (e.g., after a user/device/card has been identified for a transaction), the remote system sends the reference image 141 for display on the smartphone.
  • the remote system also sends identifying information for one of the several related images identified by the user (e.g., for the picture of the tomatoes on the counter).
  • the remote system also sends several dummy images.
  • the smartphone uses the identifying information (e.g., the image name) to search for the corresponding related image in the smartphone memory.
  • the phone next presents this image (142a), together with the dummy images received from the authentication service (142b, 142c, 142d).
  • the remote system may have instructed the smartphone to present the matching image (recalled from the phone's memory, based on the identification data) in the upper left position of the array of pictures.
  • the phone reports to the remote system the location, in the array of candidate pictures, touched by the user. If that touch is not in the upper left position, then the remote system judges the authentication test as failed.
  • the location of the user's tap is not reported to the remote system. Instead, the smartphone computes derived information from the image tapped by the user, and this information is sent to the remote system. The remote system compares this information with the derived information earlier received for the matching (tomatoes) image. If they do not correspond, the test is failed.
  • the pass/fail decision is made by the smartphone, based on its knowledge of placement of the matching image.
  • each of the candidate images 142a - 142d is similar in color and structure.
  • each of these images has a large area of red that passes through the center of the frame, angling up from the lower left. (That is, the roadster car is red, the notebook is red, and the ribbon bow is red.)
  • the derived information sent from the phone during the enrollment phase included color and shape parameters that characterized the matching images selected by the user.
  • the remote system searched for other images with similar color/shape characteristics.
  • This feature is important when the reference image and the matching images are thematically related. For example, if the user-selected reference and matching photos are from a camping trip and all show wilderness scenes, then a matching photo of a mountain taken by the user might be paired with dummy photos of mountains located by CBIR techniques. By such arrangement, the thematic relationship between a matching image and the reference image does not give a clue as to which of the candidate images 142 is the correct selection.
  • the tomatoes photo was used as the matching image.
  • another one of the matching images earlier identified by the user can be used (e.g., the photo of a palm tree in front of a house).
  • magstripe credit cards conform to ISO standards 7810, 7811 and 7813, which define the physical and data standards for such cards.
  • the data on the magstripe includes an account number, an owner name, a country code, and a card expiration date.
  • Chip cards include a chip - typically including a processor and a memory.
  • the memory stores the just-listed information, but in encrypted form.
  • the card employs a variety of common digital security techniques to deter attack, including encryption, challenge-response protocols, digital signatures, etc. Entry of a user's PIN is required for most transactions.
  • an ISO standard (7816) particularly defines the card requirements, and a widely used chip card payment specification is EMV (EuroPay/MasterCard/Visa). An updated version of EMV, termed EMV Lite, is being promoted by Morpho Cards, GmbH.
  • Static authentication methods build on those known from magnetic stripe cards.
  • information is conveyed uni-directionally, i.e., from the card, possibly through an intermediary (e.g., a POS system) to a testing system (e.g., a card issuer).
  • Static techniques can employ digital signatures, public-private keys, etc.
  • the user's name may be hashed, digitally signed with a private key associated with the system (or issuer), and the results stored in a chip card for transmission to the POS system.
  • the POS system receives this encrypted data from the card, together with the user name (in the clear). It applies the corresponding public key to decrypt the former, and compares this with a hash of the latter.
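  • A hedged sketch of this static scheme using the third-party Python "cryptography" package (which folds the hashing step into the signature operation); key handling is simplified for illustration:

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.exceptions import InvalidSignature

issuer_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

def issue_credential(user_name: str) -> bytes:
    """Issuer side: sign the user name with the system/issuer private key."""
    return issuer_key.sign(user_name.encode(), padding.PKCS1v15(), hashes.SHA256())

def pos_verify(user_name: str, signature: bytes) -> bool:
    """POS side: check the in-the-clear name against the signed value with the public key."""
    try:
        issuer_key.public_key().verify(
            signature, user_name.encode(), padding.PKCS1v15(), hashes.SHA256())
        return True
    except InvalidSignature:
        return False

sig = issue_credential("JANE Q CARDHOLDER")
assert pos_verify("JANE Q CARDHOLDER", sig)
```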
  • the present technology can be employed in systems using such known static authentication techniques.
  • the present technology affords protection against replay attacks (e.g., through context-based techniques) - a liability to which conventional static authentication techniques are susceptible.
  • Dynamic authentication involves a back-and-forth between the payment credential and the testing system, and may comprise challenge-response methods.
  • the card-side of the transaction is conducted by the chip, for which the POS terminal commonly has a two-way dedicated interface.
  • the smartphone screen used in embodiments of the present technology - which optically provides information to the cooperating system - cannot reciprocate and receive information from that system. Nonetheless, the present technology is also suitable for use with dynamic authentication methods.
  • the communication back from the system to the smartphone can be via signaling channels such as radio (NFC communication, WiFi, Zigbee, cellular) or audio.
  • Optical signaling can also be employed, e.g., a POS terminal can be equipped with an LED of a known spectral characteristic, which it controllably operates to convey data to the phone, which may be positioned (e.g., laying on a checkout conveyor) so that the phone camera receives optical signaling from this LED.
  • the keys are accessed from the SE in the smartphone, and employed in a static authentication transaction (e.g., with information optically conveyed from the smartphone screen).
  • the remote system may respond to the phone (e.g., by radio) with a request to engage in a dynamic authentication, in which case the smartphone processor (or the SE) can respond in the required back-and-forth manner.
  • the key data and other secure information is stored in conventional smartphone memory - encrypted by the user's private key.
  • Alternatively, such secure information may be maintained at a cloud resource (e.g., the card issuer).
  • the POS system can delegate the parts of the transaction requiring this information to the issuing bank, based on bank-identifying information stored in the clear in the smartphone and provided to the POS system.
  • While chip cards are appealing in some aspects, they are disadvantageous because they often require merchants to purchase specialized reader terminals that have the physical capability to probe the small electrical contacts on the face of such cards.
  • the card is typically stored in an insecure container - a wallet. In the event a card is stolen, the only remaining security is a PIN number.
  • embodiments of the present technology can employ the standards established for chip card systems and gain those associated benefits, while providing additional advantages such as cost savings (no specialized reader infrastructure required) and added security (the smartphone can provide many layers of security in addition to a PIN to address theft or loss of the phone).
  • a virtual wallet can facilitate receipt transmission and management.
  • the virtual wallet may request a receipt to be added to or accessible by the wallet - perhaps stored locally on the user device and/or in the cloud associated with a user or device account.
  • the virtual wallet communicates an account identifier, device ID or address to a participating terminal or vendor.
  • the terminal or vendor forwards the transaction receipt to the account, device or address.
  • the user may be prompted through a UI provided by the virtual wallet to add searchable metadata about the transaction or receipt (e.g., warranty information).
  • searchable metadata is collected by the virtual wallet itself in addition to or without user intervention.
  • Searchable metadata may be collected, e.g., by accessing and using transaction time, retailer name and location, items purchased, retention information, OCR-produced data if the receipt is in image form or .pdf format, etc.
  • the receipt can be provided by the retailer with searchable text (e.g., in an XML file), e.g., including items purchased, return information, warranty information, store location and hours, price, etc.
  • Searchable text can be indexed to facilitate rapid future searching.
  • the receipt is accessible through the virtual wallet, e.g., by a user selecting a UI-provided icon next to a corresponding transaction.
  • the virtual wallet preferably provides a UI through which receipts and other transaction information may be searched.
  • the user inputs information, e.g., types information or selects categories, products, retailers from scrollable lists, via the search UI.
  • corresponding receipt search results are represented on the display for review by the user.
  • receipts can be marked for retention. This is helpful, e.g., for items under warranty.
  • Retention information can be used by the wallet to help expire receipts and other transaction information. For example, a user purchases a TV at Wal-Mart and a receipt is delivered for access by the virtual wallet.
  • the virtual wallet may receive a notification that a receipt is available for retrieval, and access a remote location to obtain receipt information.
  • Metadata is entered or accessed for the receipt and retention data is indexed or stored in an expiration table or calendar.
  • the virtual wallet uses the expiration table or calendar to expire receipts no longer deemed important or needed.
  • expire in this context may include deleting the receipt, deleting metadata associated with the receipt, and/or updating any remote storage of such.
  • Retention data can be augmented with any auction related information. For example, we mentioned above that a certain financial bidder may offer an extended warranty if a transaction is made using their account or service. Such a warranty extension may be added to the retention information so a receipt is not prematurely expired.
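  • A minimal sketch of such a retention calendar, with illustrative receipt fields (return window, warranty, auction-won extension) that are assumptions rather than a defined schema:

```python
from datetime import date, timedelta

def retention_until(receipt):
    """Latest of the return window and any warranty period, plus any extension."""
    base = receipt["purchase_date"] + timedelta(days=receipt.get("return_days", 15))
    warranty = receipt["purchase_date"] + timedelta(days=365 * receipt.get("warranty_years", 0))
    extension = timedelta(days=receipt.get("warranty_extension_days", 0))
    return max(base, warranty + extension)

def expire_receipts(receipts, today=None):
    """Return (kept, expired) receipt lists based on their retention dates."""
    today = today or date.today()
    kept = [r for r in receipts if retention_until(r) >= today]
    expired = [r for r in receipts if retention_until(r) < today]
    return kept, expired
```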
  • Receipts and the metadata associated with such can be updated to reflect returns or refunds.
  • the searchable metadata may also include notification information.
  • For example, a user may be on the fence whether to keep the latest electronic gizmo purchased on a whim last week. In this case the user has 15 days (or another period, according to the store's return policy) to return the item.
  • Notification information can be stored and calendared for use by the virtual wallet (or a cooperating module) to send the user a reminder, e.g., via email, SMS or display notification pop-up via a UI, so that the 15 days doesn't come and go without notice.
  • the virtual wallet may manage and provide many different types of notifications. For example, bill- payment due dates, account balances, credit limits, offers, promotions and advertising are just a few examples of such.
  • Push-messages may be generated for urgent items, in addition to having some type of visual cue or icon within the virtual wallet that would indicate that the user's attention is needed.
  • a particular card or account in FIG. 3A may have a notification associated with it. (E.g., the user may have forgotten to authorize a monthly payment by its due date.)
  • the depicted card may jiggle, glow, shimmer, flash, strobe and/or break into an animated dance when the virtual wallet is accessed. This type of notification will visually alert the user to investigate the card further, and upon accessing such (e.g., by double tapping the animated card) the notification can be further displayed.
  • Medical and insurance information may also be stored and managed in a virtual wallet.
  • users have car insurance card(s), Medicare card(s), an Intraocular Lens card, and a Vaccess Port card, etc.
  • some of this info is preferably accessible without unlocking a mobile device that is hosting the virtual wallet, e.g., because if a user needs emergency medical care, they may not be conscious to unlock the device. Access to such emergency medical information may be accomplished by adding an Emergency Medical button to a device's unlock screen similar to the Emergency Call button.
  • a user can determine which information they want to provide access to via an Emergency Medical button through an operating system's settings screen or an access user interface associated with the virtual wallet.
  • emergency responders have an RFID card, NFC device or a digitally watermarked card that can be sensed by the mobile device to trigger unlocking the screen of a mobile device.
  • desired medical or insurance information is available on an initial splash screen, even if the phone is locked, and without needing to access an Emergency Medical button.
  • the information hosted by the virtual wallet can be stored in the cloud or at a remote location so that it is accessible from various user devices programmed with the virtual wallet (e.g., a virtual wallet app) or to cooperate with the virtual wallet and through which a user's identity is authenticated.
  • Another device on which a virtual wallet can operate is a game console.
  • gaming platforms include Microsoft's Xbox 360, Sony's PlayStation, Nintendo's DS and Wii Kyko PlayCube, OnLive's MicroConsole (a cloud-based gaming console), etc.
  • A virtual prize won through game play can be stored or accessed within the user's virtual wallet.
  • the prize may be represented by an XML file, an access code, a cryptographic code, software code, or a pointer to such.
  • the virtual wallet can facilitate the on-line sale or transfer (e.g., via eBay) of the virtual prize for real money or credit.
  • the wallet may include a virtual prize directory, folder or screen.
  • An eBay (or sell) icon may be displayed next to the virtual prize to allow a user to initiate a transfer, auction or sale of the virtual prize. Selecting the icon initiates an offer to sell, and prompts the virtual wallet to manage the interaction with eBay, e.g., by populating required For Sale fields gathered from the virtual prize's metadata, or prompting the user to insert additional information. (The virtual wallet can access an eBay API or mobile interface to seamlessly transfer such data.)
  • the virtual wallet can be used to transfer the virtual prize to the winning purchaser using the techniques (e.g., purchase) discussed in this document.
  • a virtual wallet may also provide an indication of trust.
  • a user may accumulate different trust indicators as they forage online, participate in transactions and interact in society. For example, a user may receive feedback or peer reviews after they participate in an online transaction, auction or in a retail store.
  • Another trust indicator may be a verification of age, residency and/or address.
  • Still another trust indicator may be a criminal background check performed by a trusted third party.
  • the virtual wallet may aggregate such indicators from a plurality of different sources to determine a composite trust score for the user. This trust score can be provided to a potential bidder in a financial auction as a factor in deciding whether to offer a bid, and the content of such.
  • the trust score can also be provided as the user interacts through social media sites.
  • the trust score is anonymous. That is, it provides information about a user without disclosing the user's identity. A user can then interact online in an anonymous manner but still convey an indication of their trustworthiness, e.g., the virtual wallet can verify to others that a user is not a 53 year old pedophile, while still protecting their anonymity.
  • a virtual wallet may be tethered (e.g., include a cryptographical relationship) to device hardware.
  • a mobile device may include a SIM card identifier, or may include other hardware information, which can be used as a device identifier.
  • a virtual wallet may anchor cards within the wallet to the device identifier(s) and, prior to use of a card - or the wallet itself - check the device identifier(s) from the device against the device identifier(s) in the virtual wallet.
  • the identifiers should correspond in a predetermined manner (e.g., cryptographical relationship) before the virtual wallet allows a transaction. This will help prevent a wallet from being copied to a device that is not associated with the user. (Of course, a user may authorize a plurality of different devices to cooperate with their virtual wallet, and store device identifiers for each.)
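  • One way such tethering could be implemented is sketched below, using an HMAC over the device identifier(s); the identifiers and secret shown are purely illustrative:

```python
import hmac, hashlib

def bind_to_device(wallet_secret: bytes, device_ids: list[str]) -> str:
    """Enrollment: derive a binding value from the device's identifiers."""
    material = "|".join(sorted(device_ids)).encode()
    return hmac.new(wallet_secret, material, hashlib.sha256).hexdigest()

def device_is_authorized(wallet_secret: bytes, device_ids: list[str],
                         stored_bindings: list[str]) -> bool:
    """Before a transaction: recompute from live identifiers and require a match."""
    current = bind_to_device(wallet_secret, device_ids)
    return any(hmac.compare_digest(current, b) for b in stored_bindings)

# Example: a wallet authorized on one of the user's devices.
secret = b"wallet-provisioning-secret"
bindings = [bind_to_device(secret, ["IMEI:0042", "SIM:889911"])]
assert device_is_authorized(secret, ["SIM:889911", "IMEI:0042"], bindings)
```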
  • a virtual wallet may send out a notification (e.g., to the user, credit reporting agency, or law enforcement) if the virtual wallet detects unauthorized use like use of the wallet on an unauthorized device.
  • the virtual wallet gathers information associated with a user's patterns and purchases. After building a baseline, it can notify a user, financial vendor or others when it detects activity that looks out of character (e.g., suspected as fraud) relative to the baseline.
  • the baseline may reflect a geographic component (e.g., North America) and if spending is detected outside of this component (e.g., in Europe) then a notification can be generated and sent.
  • the baseline may also access or incorporate other information to help guide its decision making.
  • the virtual wallet may access a user's online or locally stored calendar and determine that the user is traveling in Europe on vacation. So the geographical component is expanded during the vacation time period and a notification is not sent when European spending is detected.
  • a method employing a user's portable device including a display, one or more processors and a sensor, the method including acts of:
  • a portable device comprising:
  • a sensor to obtain information corresponding to a positioning or relative movement of the portable device
  • processors configured for:
  • the portable device of B1 in which the sensor comprises a gyroscope.
  • the portable device of B1 in which the changing a digital watermark embedding process comprises changing a relative embedding strength.
  • a portable device comprising:
  • a microphone for capturing ambient audio
  • memory for storing audio identifiers or information obtained from audio identifiers
  • processors configured for:
  • causing the portable device to operate in a background audio collection mode, in which during the mode audio is captured by the microphone without user involvement; processing audio captured in the background audio collection mode to yield one or more audio identifiers;
  • the portable device of C1 in which the signaling source comprises an iBeacon or Bluetooth transmitter.
  • the portable device of C2 in which the information obtained from the one or more audio identifiers comprises a discount code or coupon, and in which the action comprises applying the discount code or coupon to a financial transaction involving the portable device.
  • the portable device of C1 in which the processing audio comprises extracting fingerprints from the audio.
  • the portable device of C1 in which the processing audio comprises decoding digital watermarking hidden in the audio.
  • a system comprising:
  • a portable device comprising: one or more processors, a high frequency audio transmitter and receiver, and a virtual wallet stored in memory, the virtual wallet comprising financial information;
  • a retail station comprising: one or more processors, a high frequency audio transmitter and receiver;
  • the virtual wallet configures the one or more processors of the portable device to transmit a known high frequency audio message, the message being known to both the virtual wallet and to the retail station;
  • the one or more processors of the retail station are configured to determine errors associated with the known high frequency audio message and cause an error message to be communicated to the virtual wallet;
  • the virtual wallet upon receipt of the error message, configures said one or more processors to transmit the financial information with a high frequency audio signal adapted according to the error message.
  • a portable device comprising:
  • processors configured for:
  • the touch screen display to display embedded image copies so as to cause a static image display effect, the displayed embedded image copies being displayed by the portable device in response to a user input to enable a financial transaction.
  • the portable device of E1 in which the obtaining comprises generating the payload based on user input and on the financial information.
  • E3. The portable device of E1 in which said one or more processors are configured to operate as the erasure code generator, and in which the erasure code generator comprises a fountain code generator, in which the fountain code generator produces the plurality of outputs, from which a receiver can reassemble the payload by obtaining a subset of the plurality of outputs, the subset being less than the plurality of outputs.
  • E5. The portable device of E1 in which said one or more processors are configured for: i) generating a perceptibility map of the image, ii) storing the perceptibility map in said memory, and iii) reusing the perceptibility map when embedding the plurality of outputs in corresponding image copies.
  • E6. The portable device of E1 further comprising an audio transmitter, in which said one or more processors are configured to cause said audio transmitter to transmit an audio signal corresponding to the financial information.
  • E7. The portable device of E6 in which said audio transmitter comprises a high frequency audio transmitter.
  • E8. The portable device of E6 in which the audio signal comprises a pin, key or hash.
  • E9. The portable device of E1 in which the plurality of outputs comprises a subset of a total number of outputs provided by the erasure code generator.
  • said one or more processors are configured to interpret the user input which is received via said touch screen display.
  • E11. The portable device of E1 in which said one or more processors are configured to cause the embedded image copies to be displayed so that a digital watermark reader analyzing captured image data representing the display can recover the payload.
  • a method employing a user's portable device including a touch screen display, one or more processors and a sensor, the method including acts of:
  • the touch screen display to display embedded image copies so as to cause a static image display effect, the displayed embedded image copies being displayed by the portable device in response to a user input to enable a financial transaction.
  • the method of F1 in which the obtaining a payload comprises generating the payload based on user input and on the financial information.
  • the method of F1 further comprising: causing the erasure code generator to produce the plurality of outputs, in which the erasure code generator comprises a fountain code generator, in which the fountain code generator produces the plurality of outputs, from which a receiver can reassemble the payload by obtaining a subset of the plurality of outputs, the subset being less than the plurality of outputs.
  • the method of F1 in which only one output of the plurality of outputs is embedded in any one image copy.
  • F5. The method of F1 further comprising: i) generating a perceptibility map of the image, ii) storing the perceptibility map in memory, and iii) reusing the perceptibility map when embedding the plurality of outputs in image copies.
  • F7. The method of F6 in which the audio transmitter comprises a high frequency audio transmitter.
  • F8. The method of F6 in which the audio signal comprises a pin, key or hash.
  • a method employing a user's portable device including a touch screen display, one or more processors and a sensor, the method including acts of:
  • G2. The method of G1 in which the portions comprise video frames or copies of an image.
  • the user interface identifying plural virtual wallet cards including plural payment service cards, said payment service cards representing plural possible payment services including at least one of American Express, VISA and MasterCard, the user interface enabling a user to select a desired one of said payment services for issuing a payment;
  • the artwork indicating the selected payment service and including a logo for American Express, Visa, or MasterCard;
  • the logo in the presented artwork confirms to the user that the desired payment service has been selected for a payment; and the method enables the user to issue payments using a user-selected one of said plural payment services, without requiring the user to carry plural physical cards for said payment services.
  • H3. The method of H1 in which the user interface enables the user to select plural of said virtual wallet cards, one of the selected cards being a payment service card, and another of the selected cards being a merchant card, the method further including providing data corresponding to both the payment service card and the merchant card to the cooperating system.
  • H4. The method of H1 in which the authentication data depends in part on data from a sensor module selected from the group: an audio sensor, a motion sensor, a pose sensor, a barometric pressure sensor, and a temperature sensor.
  • H5. The method of H1 in which the method further includes prompting the user for entry of correct validation information before the information is provided to the cooperating system, and wirelessly sending location data to a recipient if N consecutive attempts to enter correct validation information fail.
  • H6. The method of H1 in which the authentication data is also user device-based, wherein the authentication data is logically bound to both context and to the user device.
  • H7. The method of H1 that further includes presenting said payment user interface in response to user activation of a control in an online shopping user interface, through which the user has selected one or more items for purchase.
  • a method of alleviating piriformis syndrome, while still allowing card-related payment transactions comprising the acts: for each of plural physical payment cards in a user's wallet, storing a virtual counterpart thereto in a user's portable device, each of said physical payment cards in the user's wallet having a payment service associated therewith;
  • initiating a payment using a user-selected payment service, said user-selected payment service being associated with one of the cards removed from the user's wallet, said initiating including:
  • the portable device for optical sensing by a cooperating system, the artwork including a machine-readable representation of the
  • the payment process includes performing a first authentication act that makes use of data captured by a camera or microphone of the portable wireless device;
  • the payment process includes performing a second authentication act that makes use of data generated by a MEMS sensor of the portable wireless device.
  • K1. An improved checkout system including a camera, a processor, and a memory, the memory containing instructions configuring the checkout system to perform acts including:
  • also using the same camera to capture second image data depicting artwork from a display of a user portable device, the artwork being associated with a payment service and including a VISA, MasterCard or American Express logo, the artwork also including machine-readable data encoding plural bit auxiliary data;
  • a method employing a user's portable device comprising: presenting, using a display of the device, a user interface that presents plural virtual wallet cards;
  • M2. The method of M1 in which a first of said two selected virtual wallet cards is associated with an American Express, Visa, or MasterCard payment service, and a second of said two selected virtual wallet cards is associated with a merchant.
  • the method of M2 in which the two selected wallet cards comprise two virtual payment cards, and the method further includes providing a user interface feature enabling the user to apportion part of a payment to a first of the payment cards, and the balance of the payment to a second of the payment cards.
  • M5 The method of M3 in which the method further includes presenting a graphical image of a composite payment card, the graphical image including incomplete artwork associated with the first payment card, combined with incomplete artwork associated with the second payment card.
  • the data including a first image, and image-derived information corresponding to one or more further images, the first and further images having been captured by the user and related to each other, the further images not being viewable from the image-derived information;
  • N2. The method of N1 in which the image-derived information comprises a hash or fingerprint derived from the further image(s).
  • N3. The method of N1 in which the image-derived information comprises content-based image descriptors derived from the further image(s).
  • N5. The method of N4 that includes checking the user's selection by reference to the image-derived information.
  • An authentication method practiced using a smartphone characterized by presenting images on a screen of the smartphone and receiving a user response thereto, wherein two of said images were earlier captured by the user with a camera portion of said smartphone.
  • P5. The method of P1 in which the sensor comprises a microphone, and said initiating a multi-party auction commences upon analysis of microphone captured audio.
  • the method of P1 in which prior to said initiating a financial transaction, the method further comprises determining whether the financial transaction seems out of character relative to a baseline, in which the baseline includes user calendar information.
  • P7 The method of P6 further comprising issuing a notification when the financial transaction seems out of character.
  • the user interface identifying plural virtual wallet cards including plural payment service cards, said payment service cards representing plural possible payment services including at least one service from a group of services offered by American Express, VISA and MasterCard, the user interface enabling a user to select a desired one of said payment services for issuing a payment;
  • the artwork including a logo for American Express, Visa, or MasterCard;
  • the logo to graphically change to indicate a notification associated with the virtual wallet card represented by the logo, in which the change comprises at least one of: a jiggle, glow, shimmer, flash, strobe or animated dance.
  • a portable device comprising:
  • memory storing a virtual wallet including information associated with a plurality of financial vendors
  • processors configured for:
  • a portable device comprising:
  • memory for storing audio identifiers or information obtained from audio identifiers; and one or more processors configured for: causing the portable device to operate in a background audio collection mode, in which during the mode audio is captured by the microphone without user involvement; processing audio captured in the background audio collection mode to yield one or more audio identifiers;
  • the portable device of U1 in which the signaling source comprises an iBeacon or Bluetooth transmitter.
  • the information obtained from the one or more audio identifiers comprises a discount code or coupon, and in which the action comprises applying the discount code or coupon to a financial transaction involving the portable device.
  • U5. The portable device of U1 in which the processing audio comprises decoding digital watermarking hidden in the audio.
  • U6. The portable device of U1 in which the action comprises prompting the user via a message displayed on the touch screen display.
  • a high frequency audio channel transmitting a first code from a checkout terminal to a mobile device, the code being encrypted with a private key, which the mobile device may decrypt with a corresponding public key;
  • the audio signal comprising a processed result of the first code and the second code
  • V3. The method of V1 in which the first code comprises a pseudo random bit sequence.
  • V4. The method of V1 in which the high frequency audio channel comprises a parametric speaker for transmitting focused beams of sound.
  • a portable device comprising:
  • memory for storing an image, and for storing components of a virtual wallet; and one or more processors configured for:
  • controlling the video camera to capture imagery corresponding to a checkout terminal's display, the display displaying imagery including digital watermarking information hidden therein, the information including transaction information; processing captured imagery to decode the digital watermarking to obtain the transaction information;
  • the portable device of W1 in which the display imagery comprises a plurality of image versions, with each version including at least one fountain code generator output.
  • the portable device of W1 in which the payment information includes a timestamp.
  • W5. The portable device of W1 in which the one or more processors are programmed for controlling display of the at least a portion of the verification clue.
  • the portable device of W1 further comprising a speaker, in which the one or more processors are configured for controlling output of a high frequency audio signal via the speaker, the high frequency audio signal comprising a message for the checkout terminal.
  • X3. The method of X1 in which the imagery corresponds to a display, in which the display comprises the digital watermarking hidden in displayed images.
  • embodiments of the present technology preserve the familiar ergonomics of credit card usage, while streamlining user checkout. No longer must a user interact with an unfamiliar keypad at the grocery checkout to pay with a credit card (What button on this terminal do I press? Enter? Done? The unlabeled green one?). No longer must the user key in a phone number on such a terminal to gain loyalty shopper benefits. Additional advantages accrue to the merchant: no investment is required for specialized hardware that has utility only for payment processing. (Now a camera, which can be used for product identification and other tasks, can be re-purposed for this additional use.) And both parties benefit by the reduction in fraud afforded by the various additional security features.
  • radio signals e.g., Bluetooth, Zigbee, etc.
  • NFC and RFID techniques can also be used.
  • audio can also be used.
  • card and authentication data can be modulated on an ultrasonic carrier, and transmitted from the phone's speaker to a microphone connected to the POS terminal.
  • the POS terminal can amplify and rectify the sensed ultrasonic signal to provide the corresponding digital data stream.
  • an audible burst of tones within the human hearing range can be employed similarly.
  • the data is conveyed as a watermark payload
  • cover audio can be used to convey different information. For example, if the user selects a VISA card credential, a clip of Beatles music, or a recording of a train whistle, can serve as the host audio that conveys the associated authentication/card information as a watermark payload. If the user selects a
  • MasterCard credential, a BeeGees clip, or a recording of bird calls can serve as the host audio.
  • the user can select, or record, the different desired items of cover audio (e.g., identifying songs in the user's iTunes music library, or recording a spoken sentence or two), and can associate different payment credentials with different of these audio items.
  • the user can thereby conduct an auditory check that the correct payment credential has been selected. (If the user routinely uses a Visa card at Safeway - signaled by the Beatles song clip, and one day he is surprised to hear the BeeGees song clip playing during his Safeway checkout, then he is alerted that something is amiss.)
  • card data e.g., account name and number
  • the phone provides a data token, such as a digital identifier, which serves to identify corresponding wallet card data stored in the cloud.
  • Braintree's Venmo payment system which "vaults" the credit card details in a central repository.
  • Known data security techniques are used to protect the exchange of information from the cloud to the retailer's POS system (or to whatever of the parties in the Fig. 5 transaction system first receives the true card details).
  • Token-based systems make it easy for a user to handle loss or theft of the smartphone.
  • With a single authenticated communication to the credentials vault, the user can disable all further use of the payment cards from the missing phone. (The authenticated user can similarly revoke the public/private key pair associated with the user through the phone's hardware ID, if same is used.)
  • After the user has obtained a replacement phone, its hardware ID is communicated to the vault, and is associated with the user's collection of payment cards.
  • a new public/private key pair can be issued based on the new phone's hardware ID, and registered to the user with the certificate authority.
  • the vault can download artwork for all of the virtual cards in the user's collection to the new phone. Thereafter, the new phone can continue use of all of the cards as before.
  • the artwork representing the wallet cards can be generic, without any personalized identification (e.g., no name or account number).
  • the virtual card data stored on the phone is logically- bound to the phone via the device ID, so that such data is not usable except on that phone. If the phone is lost or stolen, the issuer can be notified to revoke that card data and issue replacement data for installation on a replacement phone.
  • card data can be revoked remotely in a lost or stolen phone, using the iCloud Find My iPhone technology popularized by the Apple iPhone for remotely locking or wiping a phone.
  • a POS system makes a context-based assessment using information conveyed from the smartphone (e.g., optically conveyed from its display).
  • the roles can be reversed.
  • the POS terminal can convey context information to the smartphone, which makes an assessment using context information it determines itself.
  • Some systems use both approaches, with the smartphone testing the POS terminal, and the POS terminal testing the smartphone. Only if both tests conclude satisfactorily does a transaction proceed.
  • the steganographic data-carrying payload capacity of low resolution artwork is on the order of 50-100 bits per square inch; with high resolution displays of the sort now proliferating on smartphones (e.g., the Apple Retina display), much higher data densities can reliably be achieved.
  • Still greater data capacity can be provided by encoding static artwork with a steganographic movie of hidden data, e.g., with new information encoded every tenth of a second. Using such techniques, payloads in the thousands of bits can be steganographically conveyed.
  • SIFT-based approaches for image recognition can also be employed (e.g., as detailed in patent 6,711,293).
  • SURF and ORB are more recent enhancements to SIFT.
  • Applicant's other work that is relevant to the present technology includes that detailed in patent publications US 2011-0212717 Al, US 2011-0161076 Al, US 2012-0284012 Al, US 2012-0046071 Al, US 2012-0214515 Al, and in pending applications 13/651,182, filed October 12, 2012 and 61/745,501, filed December 21, 2012.
  • a user may employ a smartphone to browse the web site of an online merchant, and add items to a shopping cart.
  • the merchant may have a dedicated app to facilitate such shopping (e.g., as EBay and Amazon do).
  • the user invokes the payment module software, causing one of the depicted interfaces (e.g., Fig. 1 or Fig. 10A) to be presented for user selection of the desired payment card.
  • an app may have a graphical control for selection by the user to activate the payment module. The user then flips through the available cards and taps one to complete the purchase.
  • the payment module determines the device context from which it was invoked (e.g., the Amazon app, or a Safari browser with a Land's End shopping cart), and establishes a secure session to finalize the payment to the corresponding vendor, with the user-selected card.
  • various digital data protocols can be employed to secure the transaction.
  • optical communication with the cooperating system is not used. Instead, data is exchanged with the remote system by digital communications, e.g., using a 4G network to the internet, etc.
  • interfaces are illustrative only. In commercial implementation, different forms of interface will likely be used, based on the demands and constraints of the particular application.
  • One alternative form of interface is one in which a virtual representation of a wallet card is dragged and dropped onto an item displayed on-screen that is to be purchased, or is dragged/dropped onto a displayed form that then auto-completes with textual particulars (cardholder name, billing address, card number, etc.) corresponding to the selected card.
  • Such forms of interaction may be particularly favored when using desktop and laptop computers.
  • Such virtual cards are also useful in self-service kiosks and other transactions.
  • An example is checking into a hotel. While hotels routinely employ human staff to check-in guests, they do so not solely to be hospitable. Such human interaction also serves a security purpose - providing an exchange by which guests can be informally vetted, e.g., to confirm that their stated identity is bona fide.
  • the present technology allows such vetting to be conducted in a far more rigorous manner. Many weary travelers would be pleased to check in via a kiosk (presenting payment card and loyalty card credentials, and receiving a mag stripe-encoded, or RFID-based, room key in return), especially if it spared them a final delay in the day's travel, waiting for a human receptionist.
  • air travel can be made more secure by authenticating travelers using the technologies detailed herein, rather than relying on document inspection by a bleary-eyed human worker at shift's end. Boarding passes can similarly be made more secure by including such documents in the virtual wallet, and authenticating their validity using the presently-detailed techniques.
  • the relationship between the images was due to common geography and a common interval of time (a vacation trip to Napa).
  • the relationship can be of other sorts, such as person-centric or thing-centric.
  • the reference image may be a close-up of a pair of boots worn by a friend of the user, and the related candidate images can be face shots of that friend. (Dummy images can be face shots of strangers.)
  • Embodiments that presented information for user review or challenge on the smartphone screen, and/or solicited user response via the smartphone keypad or touch screen can instead be practiced otherwise.
  • information can be presented to the user on a different display, such as on a point of sale terminal display. Or it can be posed to the user verbally, as by a checkout clerk.
  • the user's response can be entered on a device different than the smartphone (e.g., a keypad at a checkout terminal), or the user may simply voice a responsive answer, for capture by a POS system microphone.
  • spectrum-based analysis of signals can be performed by filter banks, or by transforming the signal into the Fourier domain, where it is characterized by its spectral components.
  • security checks can be posed to the user at various times in the process, e.g., when the phone is awakened, when the payment app starts, when a card is selected, when payment is finalized, etc.
  • the check may seek to authenticate the user, the user device, a computer with which the device is communicating, etc.
  • the check may be required and/or performed by software in the device, or by software in a cooperating system. In addition to PIN and password approaches, these can include checks based on user biometrics, such as voice recognition and fingerprint recognition.
  • a screen-side camera on the user's smartphone captures an image of the user's face, and checks its features against stored reference features for the authorized user to confirm the phone is not being used by someone else.
  • Another form of check is the user's custody of a required physical token (e.g., a particular car key), etc.
  • Location information (e.g., GPS, cell tower triangulation, etc.) can also be utilized to confirm placement of the associated mobile device within proximity of the cooperating device. High confidence on location can be achieved by relying on network-provided location mechanisms from companies such as Locaid, which are not susceptible to application hacking on the mobile device (enabled by unlocking the device or otherwise).
  • a report of the failed transaction can be sent to the authorized user or other recipient.
  • a report, e.g., by email or telephone, can include the location of the phone when the transaction failed, as determined by a location-sensing module in the phone (e.g., a GPS system).
  • a plastic chip card can be equipped with one or more MEMS sensors, and these can be used to generate context-dependent session keys, which can then be used in payment transactions in the manners described above in connection with smartphones.
  • plastic cards can also be useful in enrolling virtual cards in a smartphone wallet.
  • One particular such technology employs interaction between printable conductive inks (e.g., of metal oxides), and the capacitive touch screens commonly used on smartphones and tablets.
  • printable conductive inks e.g., of metal oxides
  • the touch screen senses the pattern defined by the ink and can respond accordingly.
  • Loading the card into the digital wallet can involve placing the mobile wallet software in an appropriate mode (e.g., "ingest"), after optional authentication has been completed. The user then places the physical card on the smartphone display. The use of conductive inks on the card serves to identify the card to the mobile device. The user can then lift the card off the display, leaving a virtualized representation of the card on the display to be subsequently stored in the wallet, with the opportunity to add additional metadata to facilitate transactions or preferences (PINs, priority, etc.).
  • Such physical item-based interaction with touch screens can also be used, e.g., during a challenge-response stage of a transaction.
  • a cooperating device may issue a challenge through the touch screen on the mobile device as an alternative to (or in addition to) audio, image, wireless, or other challenge mechanisms.
  • a user places a smartphone screen-down on a reading device (similar to reading a digital boarding-pass at TSA check-points).
  • the cooperating device would have a static or dynamic electrical interconnect that could be used to simulate multi-touch events on the mobile device.
  • the mobile device can use the challenge (presented as a touch event) to inform the transaction and respond appropriately to the cooperating device.
  • smartphones include the Apple iPhone 5; smartphones following Google's Android specification (e.g., the Galaxy S III phone, manufactured by Samsung, and the Motorola Droid Razr HD Maxx phone), and Windows 8 mobile phones (e.g., the Nokia Lumia 920).
  • details of the Apple iPhone, including its touch interface, are provided in Apple's published patent application 20080174570.
  • each includes one or more processors, one or more memories (e.g. RAM), storage (e.g., a disk or flash memory), a user interface (which may include, e.g., a keypad, a TFT LCD or OLED display screen, touch or other gesture sensors, a camera or other optical sensor, a compass sensor, a 3D magnetometer, a 3-axis accelerometer, a 3-axis gyroscope, one or more microphones, etc., together with software instructions for providing a graphical user interface), interconnections between these elements (e.g., buses), and an interface for communicating with other devices (which may be wireless, such as GSM, 3G, 4G, CDMA, WiFi, WiMax, Zigbee or Bluetooth, and/or wired, such as through an Ethernet local area network, a T-1 internet connection, etc.).
  • the processes and system components detailed in this specification may be implemented as instructions for computing devices, including general purpose processor instructions for a variety of programmable processors, including microprocessors (e.g., the Intel Atom, the ARM A5, the nVidia Tegra 4, and the Qualcomm Snapdragon), graphics processing units (GPUs, such as the nVidia Tegra APX 2600, and the Adreno 330 - part of the Qualcomm Snapdragon processor), and digital signal processors (e.g., the Texas Instruments TMS320 series devices and OMAP series devices), etc.
  • processor circuitry including programmable logic devices, field programmable gate arrays (e.g., the Xilinx Virtex series devices), field programmable object arrays, and application specific circuits - including digital, analog and mixed analog/digital circuitry.
  • Execution of the instructions can be distributed among processors and/or made parallel across processors within a device or across a network of devices. Processing of content signal data may also be distributed among different processor and memory devices.
  • “Cloud” computing resources can be used as well. References to "processors,” “modules” or “components” should be understood to refer to functionality, rather than requiring a particular form of implementation.
  • Known browser software, communications software, and media processing software can be adapted for use in implementing the present technology.
  • Software and hardware configuration data/instructions are commonly stored as instructions in one or more data structures conveyed by tangible media, such as magnetic or optical discs, memory cards, ROM, etc., which may be accessed across a network.
  • Some embodiments may be implemented as embedded systems - special purpose computer systems in which operating system software and application software are indistinguishable to the user (e.g., as is commonly the case in basic cell phones).
  • the functionality detailed in this specification can be implemented in operating system software, application software and/or as embedded system software.
  • Microelectromechanical Systems: most of these involve tiny moving parts. Such components with moving parts may be termed motive-mechanical systems.

Landscapes

  • Business, Economics & Management (AREA)
  • Engineering & Computer Science (AREA)
  • Accounting & Taxation (AREA)
  • Theoretical Computer Science (AREA)
  • Strategic Management (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Finance (AREA)
  • Development Economics (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Cash Registers Or Receiving Machines (AREA)
  • Financial Or Insurance-Related Operations Such As Payment And Settlement (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure relates to a smartphone-based virtual wallet that manages payment options available to a user. One claim recites a portable device comprising: a touch screen display; a microphone for capturing ambient audio; memory for storing an image; and one or more processors. The one or more processors are configured for: generating copies of the stored image; obtaining a payload corresponding to financial information; providing the payload to an erasure code generator, in which the erasure code generator produces a plurality of outputs; embedding one of the plurality of outputs in a copy of the stored image and proceeding with embedding until each of the plurality of outputs is so embedded in a copy of the stored image, in which the embedding utilizes digital watermarking; causing the touch screen display to display embedded image copies so as to cause a static image display effect, the displayed embedded image copies being displayed by the portable device in response to a user input to enable a financial transaction. A great variety of other features, arrangements, combinations and claims are also detailed.
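For readers who want a concrete picture of the erasure-coding flow recited above, the following minimal Python sketch illustrates an LT-style fountain encoder producing one output ("droplet") per displayed image copy. It is only a sketch under assumed parameters (eight payload chunks, a placeholder payload string, a naive degree distribution); the watermark embedding itself and the receiver's decoding are not shown and are not taken from the specification.

```python
import random

def chunk_payload(payload: bytes, k: int) -> list:
    """Split the payload into k equal-size chunks (zero-padded at the end)."""
    size = -(-len(payload) // k)                  # ceiling division
    payload = payload.ljust(k * size, b"\x00")
    return [payload[i * size:(i + 1) * size] for i in range(k)]

def droplet(chunks, seed: int):
    """One fountain-code output: XOR of a pseudo-randomly chosen subset of chunks.

    The seed accompanies the droplet, so the receiver can re-derive which chunks
    were combined; with slightly more droplets than chunks, it can peel back to
    the original payload.
    """
    rng = random.Random(seed)
    degree = rng.randint(1, len(chunks))
    indices = rng.sample(range(len(chunks)), degree)
    out = bytes(len(chunks[0]))                   # all-zero buffer
    for i in indices:
        out = bytes(a ^ b for a, b in zip(out, chunks[i]))
    return seed, out

# One droplet per displayed copy of the cover image; every copy shows the same
# artwork, so the viewer perceives a static image while the hidden data varies.
chunks = chunk_payload(b"acct=4111111111111111;exp=12/16;ctx=...", k=8)
outputs = [droplet(chunks, seed) for seed in range(12)]   # 12 outputs for 8 chunks
```

Because any sufficiently large subset of droplets suffices, a camera that misses some displayed frames can still reassemble the payload.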

Description

METHODS AND ARRANGEMENTS FOR
SMARTPHONE PAYMENTS AND TRANSACTIONS
Related Application Data
In the United States: The present application is a continuation in part of US Patent Application No. 14/180,277, filed February 13, 2014, which claims the benefit of US Patent Application No. 61/938,673, filed February 11, 2014. The present application is also a continuation in part of US Patent Application No. 14/074,072, filed November 7, 2013, which claims the benefit of US Provisional Application No. 61/825,059, filed May 19, 2013. The 14/074,072 application is a continuation in part of US Patent Application No. 13/873,117, filed April 29, 2013, which is a continuation in part of US Patent Application No. 13/792,764, filed March 11, 2013, which claims the benefit of US Provisional Patent Application No. 61/769,701, filed February 26, 2013.
Technical Field
The present technology concerns, e.g., portable devices such as smartphones, and their use in making secure payments or facilitating transactions.
Background and Introduction to the Technology
Desirably, shoppers should be able to select from among plural different credit cards when making purchases, and not be tied to a single payment service. Having a choice of credit card payment options provides a variety of advantages.
For example, some credit card providers offer promotions that make spending on one card more attractive than another (e.g., double-miles on your Alaska Airlines Visa card for gas and grocery purchases made during February). Other promotions sometimes include a lump-sum award of miles for new account holders after a threshold charge total has been reached (e.g., get 50,000 miles on your new CapitalOne Visa card after you've made $5,000 of purchases within the first five months). At still other times, a shopper may be working to accumulate purchases on one particular card in order to reach a desired reward level (e.g., reaching 50,000 miles to qualify for a Delta ticket to Europe). The ability to easily select a desired card from among an assortment of cards is a feature lacking in many existing mobile payment systems. The legacy physical cards that embody the service provider brands and their capabilities are expensive to produce and have security weaknesses that can be mitigated in mobile payment systems. The look, feel, and user interfaces for physical cards are familiar and well understood. Existing mobile payment solutions involve numerous changes and new learning to operate.
In accordance with one aspect of the present technology, a smartphone programmed with a virtual wallet provides a user interface to present a wallet of virtual credit cards from which a user can pick when making a purchase. Data is conveyed optically from the phone to a cooperating system, such as a point of sale terminal or another smartphone. Preferably, the phone containing the virtual cards presents a graphical illustration of the selected card on the screen. Hidden in this graphical illustration (i.e., steganographically encoded) is transaction data. This transaction data may provide information about the selected card, and may also provide context data used to create a session key for security. Of course, a virtual wallet may receive payments, credits and rewards, as well as initiate payments.
Through use of the present technology, merchants can obtain the digital security advantages associated with "chip card"-based payment systems, without investing in interface hardware that has no other use, using virtual cards that have no costs of manufacture and distribution. The technology is secure, easy, economical, and reliable.
The foregoing and other features and advantages of the present technology will be more readily apparent from the following detailed description, which proceeds with reference to the accompanying drawings.
Brief Description of the Drawings
Figs. 1 and 2 show a fliptych user interface used in certain embodiments to allow a user to select a desired card from a virtual wallet.
Figs. 3A and 3B show alternative card selection user interfaces.
Fig. 4A shows artwork for a selected card, steganographically encoded with card and authentication information, displayed on a smartphone screen for optical sensing by a cooperating system. Fig. 4B is similar to Fig. 4A, but uses overt machine readable encoding (i.e., a barcode) instead of steganographic encoding, to optically convey information to the cooperating system.
Fig. 5 illustrates a common type of credit card transaction processing.
Fig. 6 shows a block diagram of a system in which a user's mobile device optically communicates with a cooperating system.
Fig. 7 is a flow chart detailing acts of an illustrative method.
Figs. 8 and 9 show screenshots of a user interface for selecting and presenting two cards to a vendor.
Figs. 10A and 10B show screenshots of an alternative user interface for selecting and presenting multiple cards to a vendor.
Fig. 10C illustrates how a payment can be split between two payment cards, in accordance with one aspect of the present technology.
Fig. 11 shows a payment user interface that presents a tally of items for purchase together with payment card artwork, and also provides for user signature.
Figs. 12A-12D show how checkout tallies can be customized per user preference.
Figs. 13A-13C show how authentication can employ steganographically-conveyed context data, an anti-phishing mutual validation system, and signature collection - all for increased security.
Figs. 14 and 15 show an authentication arrangement using photographs earlier captured by the user and stored on the smartphone.
Fig. 16 is a diagram showing a payload coding and transmission scheme.
Figs. 17A and 17B are diagrams showing communication pathways.
Detailed Description
The present technology has broad applicability, but necessarily is described by reference to a limited number of embodiments and applications. The reader should understand that this technology can be employed in various other forms - many quite different than the arrangements detailed in the following discussion.
One aspect of the present technology concerns payment technologies, including auctions to determine which financial vendor will facilitate a transaction. A few particular embodiments are described below, from which various features and advantages will become apparent. One particular method employs a user's portable device, such as a smartphone. As is familiar, such devices include a variety of components, e.g. a touch screen display, a processor, a memory, various sensor modules, etc.
Stored in the memory is an electronic payment module comprising software instructions that cause the device to present a user interface (UI) on the display. This electronic payment module (and/or a UI provided by such) is sometimes referred to herein as a "virtual wallet". One such user interface is shown in Fig. 1. The depicted user interface shows graphical
representations of plural different cards of the sort typically carried in a user's wallet, e.g., credit cards, shopping loyalty cards, frequent flier membership cards, etc. ("wallet cards"). The software enables the user to scroll through the collection of cards and select one or more for use in a payment transaction, using a fliptych arrangement. (Fliptych is the generic name for the style of interface popularized by Apple under the name "Cover Flow.") As earlier noted, it is advantageous for a shopper to be able to choose different of the displayed payment cards at different times, and not be virtually tied to a single payment service.
In the illustrated embodiment, after the user has scrolled to a desired card (a Visa card in
Fig. 1), it is selected for use in the transaction by a user signal, such as a single-tap on the touch screen. (A double-tap causes the depicted card to virtually flip-over and reveal, on its back side, information about recent account usage and available credit.)
A great variety of other user interface styles can be used for selecting from a virtual wallet of cards. Fig. 3A shows another form of UI - a scrollable display of thumbnails. This UI illustrates that representations of cards other than faithful card depictions can be employed. (Note the logo, rather than the card image, to represent the MasterCard payment service).
Still another alternative UI for card selection is that employed by Apple's Passbook software, shown in Fig. 3B. (The Passbook app is an organizer for passes such as movie tickets, plane and train boarding passes, gift cards, coupons, etc.)
After the user has selected a payment card, the device may perform a user security check - if required by the card issuer or by stored profile data configured by the user. One security check is entry of a PIN or password, although there are many others.
The illustrative transaction method further involves generating context-based
authentication data using data from one or more smartphone sensors, as discussed more fully below. This authentication data serves to assure the cooperating system that the smartphone is legitimate and is not, e.g., a fraudulent "replay attack" of the system.
After the security check (if any), and generation of the context-based authentication data, the smartphone displays corresponding artwork on its display, as shown in Fig. 4A. This artwork visually indicates the selected payment service, thereby permitting the user to quickly check that the correct payment card has been selected. The card number, a logo distinctive of the selected payment service (e.g., an American Express, Visa or MasterCard logo) and/or card issuer (e.g., US Bank, Bank of America) can be included in the artwork, for viewing by the user.
While the smartphone display shown in Fig. 4A indicates the selected payment service, it also includes the payment service account data (e.g., account number, owner name, country code, and card expiration date), as well as the context-based authentication data. This information is not evident in the Fig. 4A artwork because it is hidden, using steganographic encoding (digital watermarking). However, such information can be decoded from the artwork by a corresponding (digital watermark) detector. Alternatively, such information can be conveyed otherwise, such as by other forms of machine-readable encoding (e.g., the barcode shown in Fig. 4B).
The user shows the artwork on the phone display to a sensor (e.g., a camera) of a cooperating system, such as a point of sale (POS) terminal, or a clerk's portable device, which captures one or more frames of imagery depicting the display. In one particular case the user holds the smartphone in front of a fixed camera, such as at a self-checkout terminal. In another, a POS terminal camera, or a smartphone camera, is positioned (e.g., by a checkout clerk) so as to capture an image of the smartphone screen. In still another, the user puts the smartphone, display facing up, on a conveyor of a grocery checkout, where it is imaged by the same camera(s) that is used to identify products for checkout. In all such arrangements, information is conveyed optically from the user device to the cooperating system. (Related technology is detailed in US 2013-0223673 Al.)
The cooperating system decodes the account data and authentication data from the captured imagery. The transaction is next security-checked by use of the authentication data. Corresponding transaction information is then forwarded to the merchant's bank for processing. From this point on, the payment transaction may proceed in the conventional manner. (Fig. 5 illustrates a credit card approval process for a typical transaction.) Fig. 6 shows some of the hardware elements involved in this embodiment, namely a user's smartphone, and a cooperating system. These elements are depicted as having identical components (which may be the case, e.g., if the cooperating system is another smartphone). The dashed lines illustrate that the camera of the cooperating system captures imagery from the display of the user smartphone.
Fig. 7 summarizes a few aspects of the above-described embodiment in flow chart form.
The authentication data used in the detailed embodiment can be of various types, and can serve various roles, as detailed in the following discussion.
A security vulnerability of many systems is the so-called "replay attack." In this scenario, a perpetrator collects data from a valid transaction, and later re -uses it to fraudulently make a second transaction. In the present case, if a perpetrator obtained imagery captured by a POS terminal, e.g., depicting the Fig. 4A virtual payment card of a user, then this same imagery might later be employed to mimic presentation of a valid payment card for any number of further transactions. (A simple case would be the perpetrator printing a captured image of the Fig. 4A screen display, and presenting the printed picture to a camera at a self-service checkout terminal to "pay" for merchandise.)
The authentication data of the present system defeats this type of attack. The
authentication data is of a character that naturally changes from transaction to transaction. A simple example is time or date. If this information is encoded in the image, the cooperating system can check that the decoded information matches its own assessment of the time/date.
As sensors have proliferated in smartphones, a great variety of other authentication data can be employed. For example, some smartphones now include barometric pressure sensors. The barometric pressure currently sensed by the smartphone sensor can be among the data provided from the smartphone display to the cooperating system. The cooperating system can check a barometric sensor of its own, and confirm that the received information matches within some margin of error, e.g., 1 millibar. Temperature is another atmospheric parameter that can be used in this fashion.
Other authentication data concern the pose and/or motion of the smartphone.
Smartphones are now conventionally equipped with a tri-axis magnetometer (compass), a tri-axis accelerometer and/or a tri-axis gyroscope. Data from these sensors allow the smartphone to characterize its position and motion, which information can be encoded in the displayed artwork. The cooperating system can analyze its captured imagery of the smartphone to make its own assessment of these data.
For example, in a supermarket context, a POS terminal may analyze camera data to determine that the shopper's phone is moving 1 foot per second (i.e., on a moving conveyor), and is in a pose with its screen facing straight up, with its top oriented towards a compass direction of 322 degrees. If the authentication data decoded from the artwork displayed on the phone screen does not match this pose/motion data observed by the POS terminal, then something is awry and the transaction is refused.
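A minimal sketch of how a terminal might compare the phone-reported context against its own observations follows. Only the 1 millibar pressure tolerance comes from the text above; the heading and speed tolerances, and the field names, are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Context:
    pressure_mbar: float   # barometric pressure
    heading_deg: float     # compass direction of the phone's top edge
    speed_fps: float       # motion, in feet per second

def context_matches(claimed: Context, observed: Context,
                    pressure_tol: float = 1.0,       # 1 millibar, per the text
                    heading_tol: float = 10.0,       # assumed tolerance, degrees
                    speed_tol: float = 0.3) -> bool:  # assumed tolerance, ft/s
    """Accept only if the phone's self-reported context agrees with what the
    POS terminal measures for itself."""
    heading_err = abs((claimed.heading_deg - observed.heading_deg + 180) % 360 - 180)
    return (abs(claimed.pressure_mbar - observed.pressure_mbar) <= pressure_tol
            and heading_err <= heading_tol
            and abs(claimed.speed_fps - observed.speed_fps) <= speed_tol)

# e.g., reject if the phone claims 322 degrees / 1 ft/s but the POS observes otherwise
ok = context_matches(Context(1013.2, 322.0, 1.0), Context(1013.6, 321.0, 1.0))
```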
Another form of authentication data is information derived from the audio environment. A sample of ambient audio can be sensed by the smartphone microphone and processed, e.g., to classify it by type, or to decode an ambient digital watermark, or to generate an audio fingerprint. An exemplary audio fingerprint may be generated by sensing the audio over a one second interval and determining the audio power in nine linear or logarithmic bands spanning 300 - 3000 Hz (e.g., 300-387 Hz, 387-500 Hz, 500-646 Hz, 646-835 Hz, 835-1078 Hz, 1078-1392 Hz, 1392-1798 Hz, 1798-2323 Hz, and 2323-3000 Hz). An eight-bit fingerprint is derived from this series of data. The first bit is a "1" if the first band (300-387 Hz) has more energy than the band next-above (387-500 Hz); else the first bit is a "0." And so forth up through the eighth bit (which is a "1" if the eighth band (1798-2323 Hz) has more energy than the band next-above (2323-3000 Hz)).
The POS terminal can similarly sample the audio environment, and compute its own fingerprint information. This information is then compared with that communicated from the user's smartphone, and checked for correspondence. (The POS terminal can repeatedly compute an audio fingerprint for successive one second sample intervals, and check the received data against the last several computed fingerprints for a match within an error threshold, such as a Euclidean distance.)
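The band-energy fingerprint just described can be sketched in a few lines of Python (NumPy assumed). The matcher below uses Hamming distance between eight-bit fingerprints as the error threshold, a simplification of the distance check described above.

```python
import numpy as np

BAND_EDGES = [300, 387, 500, 646, 835, 1078, 1392, 1798, 2323, 3000]  # Hz, per the text

def audio_fingerprint(samples: np.ndarray, sample_rate: int) -> int:
    """Eight-bit fingerprint: bit i is 1 if band i has more energy than band i+1."""
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    energies = [spectrum[(freqs >= lo) & (freqs < hi)].sum()
                for lo, hi in zip(BAND_EDGES[:-1], BAND_EDGES[1:])]
    bits = 0
    for i in range(8):
        bits = (bits << 1) | int(energies[i] > energies[i + 1])
    return bits

def fingerprints_match(phone_fp: int, recent_pos_fps: list, max_hamming: int = 1) -> bool:
    """POS side: compare against fingerprints of the last several 1-second windows."""
    return any(bin(phone_fp ^ fp).count("1") <= max_hamming for fp in recent_pos_fps)
```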
In some implementations, the POS terminal may emit a short burst of tones - simultaneously or sequentially. The smartphone microphone senses these tones, and
communicates corresponding information back to the POS terminal, where a match assessment is made. (In the case of a sequence of tones, a sequence of audio fingerprints may be
communicated back.) By such arrangement, the POS terminal can influence or dictate, e.g., a fingerprint value that should be reported back from the smartphone. This is a form of challenge-response authentication. The POS terminal issues a challenge (e.g., a particular combination or sequence of tones), and the smartphone must respond with a response that varies in accordance with the challenge. The response from the smartphone is checked against that expected by the POS terminal.
Relatedly, information from the visual environment can be used as the basis for authentication data. For example, the smartphone may be held to face towards the camera of a POS terminal. A collection of colored LEDs may be positioned next to the camera of the POS terminal, and may be controlled by the POS processor to shine colored light towards the smartphone. In one transaction the POS system may illuminate a blue LED. In a next transaction it may illuminate an orange LED. The smartphone senses the color illumination from its camera (i.e., the smartphone camera on the front of the device, adjacent the display screen), and encodes this information in the artwork displayed on the phone screen. The POS terminal checks the color information reported from the smartphone (via the encoded artwork) with information about the color of LED illuminated for the transaction, to check for correspondence.
Naturally, more complex arrangements can be used, including some in which different
LEDs are activated in a sequence to emit a series of colors that varies over time. This time- varying information can be reported back via the displayed artwork - either over time (e.g., the artwork displayed by the smartphone changes (steganographically) in response to each change in LED color), or the smartphone can process the sequence of different colors into a single datum. For example, the POS terminal may be capable of emitting ten different colors of light, and it issues a sequence of three of these colors - each for 100 milliseconds, in a repeating pattern. The smartphone senses the sequence, and then reports back a three digit decimal number - each digit representing one of the colors. The POS checks the received number to confirm that the three digits correspond to the three colors of illumination being presented, and that they were sensed in the correct order.
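A toy sketch of this color challenge-response follows; the ten-color palette and its digit mapping are illustrative assumptions.

```python
import random

# Ten colors -> decimal digits 0-9 (the particular palette/mapping is assumed)
PALETTE = ["red", "orange", "yellow", "green", "cyan",
           "blue", "violet", "magenta", "white", "amber"]

def issue_challenge() -> list:
    """POS side: choose three colors to flash, 100 ms each, in a repeating pattern."""
    return [random.randrange(len(PALETTE)) for _ in range(3)]

def phone_response(observed_colors: list) -> int:
    """Phone side: map each sensed color to its digit and pack into a 3-digit number."""
    d = [PALETTE.index(c) for c in observed_colors]
    return d[0] * 100 + d[1] * 10 + d[2]

def verify(challenge: list, reported: int) -> bool:
    """POS side: the reported digits must match the emitted colors, in order."""
    return reported == challenge[0] * 100 + challenge[1] * 10 + challenge[2]

challenge = issue_challenge()
reported = phone_response([PALETTE[i] for i in challenge])   # phone sensed the LEDs correctly
assert verify(challenge, reported)
```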
In like fashion, other time-varying authentication data can be similarly sensed by the smartphone and reported back to the cooperating system as authentication data.
All of the above types of authentication data are regarded as context data - providing information reporting context as sensed by the smartphone.
Combinations of the above-described types of authentication data - as well as others - can be used. It will be understood that use of authentication data as described above allows the risk of a replay attack to be engineered down to virtually zero.
Not only does the authentication data serve to defeat replay attacks, it can also be used to secure the payment card information against eavesdropping (e.g., a form of "man-in-the-middle" attack). Consider a perpetrator in a grocery checkout who uses a smartphone to capture an image of a smartphone of a person ahead in line, when the latter smartphone is presenting the Fig. 4B display that includes a barcode with payment card information. The perpetrator may later hack the barcode to extract the payment card information, and use that payment card data to make fraudulent charges.
To defeat such threat, the information encoded in the displayed artwork desirably is encrypted using a key. This key can be based on the authentication data. The smartphone presenting the information can derive the key from its sensed context data (e.g., audio, imagery, pose, motion, environment, etc.), yielding a context-dependent session key. The cooperating POS system makes a parallel assessment based on its sensed context data, from which it derives a matching session key. The authentication data thus is used to create a (context-dependent) secure private channel through which information is conveyed between the smartphone and the POS system.
There are many forms of encryption that can be employed. A simple one is an exclusive-OR operation, by which bits of the message are XORed with bits of the key. The resulting encrypted data string is encoded in the artwork presented on the smartphone screen. The POS system recovers this encrypted data from captured imagery of the phone, and applies the same key, in the same XOR operation, to recover the bits of the original message.
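As an illustration (not the specific implementation), the following sketch derives a keystream by hashing whatever context both sides have sensed, then XORs it with the message; the context string and message shown are placeholders.

```python
import hashlib
from itertools import cycle

def session_key(context: bytes, length: int) -> bytes:
    """Derive a context-dependent keystream by hashing shared sensed context.

    Both sides compute this independently from what they each sensed (audio
    fingerprint, pose, time, etc.); matching context yields a matching key.
    """
    stream, counter = b"", 0
    while len(stream) < length:
        stream += hashlib.sha256(context + counter.to_bytes(4, "big")).digest()
        counter += 1
    return stream[:length]

def xor_crypt(message: bytes, key: bytes) -> bytes:
    """XOR each message byte with the key; applying it twice restores the message."""
    return bytes(m ^ k for m, k in zip(message, cycle(key)))

ctx = b"audio_fp=0xB2|heading=322|t=2014-02-13T10:04"         # placeholder shared context
key = session_key(ctx, 64)
cipher = xor_crypt(b"VISA;4111111111111111;exp=12/16", key)   # encoded into the artwork
plain = xor_crypt(cipher, key)                                # POS side, same context
assert plain == b"VISA;4111111111111111;exp=12/16"
```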
More sophisticated implementations employ encryption algorithms such as DES, SHA1, MD5, etc.
Additional security can be provided by use of digital signature technology, which may be used by the POS system to provide for authentication (and non-repudiation) of the information received from the smartphone (and vice-versa, if desired).
In one such embodiment, information identifying the phone or user is conveyed from the phone to the POS system (e.g., via the encoded artwork displayed on the phone screen). This identifier can take various forms. One is the phone's IMEI (International Mobile Station
Equipment Identity) data - an identifier that uniquely identifies a phone. (The IMEI can be displayed on most phones by entering *#06# on the keypad.) Another is a phone's IMSI (International Mobile Subscriber Identity) data, which identifies the phone's SIM card. Still other identifiers can be derived using known device fingerprinting techniques - based on parameter data collected from the phone, which in the aggregate distinguishes that phone from others. (All such arrangements may be regarded as a hardware ID.)
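A toy sketch of such a parameter-based hardware ID follows; the parameter names and the IMEI-format value are illustrative only, not data from any actual device.

```python
import hashlib, json

def hardware_id(params: dict) -> str:
    """Toy device fingerprint: hash an aggregate of phone parameters.

    A real implementation would gather whatever stable device characteristics
    are available; the fields below are placeholders.
    """
    canonical = json.dumps(params, sort_keys=True).encode()
    return hashlib.sha256(canonical).hexdigest()

hw_id = hardware_id({
    "model": "ExamplePhone",
    "os_build": "11D167",          # placeholder build string
    "screen": "1136x640",
    "imei": "356938035643809",     # example-format IMEI, not a real device
})
```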
This identifier can be conveyed from the phone to the POS system in encrypted form, e.g., using context-based authentication data as described above.
Upon receipt of the identifier, the POS system consults a registry (e.g., a certificate authority) to obtain a public key (of a public -private cryptographic key pair) associated with that identifier. This enables the phone to encrypt information it wishes to securely communicate to the POS system using the phone's (or user's) private key. (This key may be stored in the phone's memory.) Information that may be encrypted in this fashion includes the payment card data. The POS system uses the public key that it obtained from the certificate authority to decrypt this information. Because the communicated information is signed with a key that allows for its decryption using the public key obtained from the certificate authority, the information is known by the POS system to have originated from the identified phone/user. (The public/private key pairs may be issued by a bank or other party involved in the transaction processing. The same party, or another, may operate the certificate authority.) Once the POS system has determined the provenance of the information provided by the mobile phone, a secondary check can be made to determine if the card information provided is associated with the phone, creating a second layer of security for a would-be attacker to surmount (beyond registering a fraudulent phone within the system, they would also have to associate the copied card information for a replay attack with the fraudulent phone).
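The sign-and-verify flow can be sketched with the third-party Python cryptography package as follows. ECDSA signatures are used here in place of the looser "encrypt with the private key" phrasing above, and the registry dictionary stands in for the certificate authority; the hardware ID and payload are placeholders.

```python
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives import hashes
from cryptography.exceptions import InvalidSignature

# Enrollment: a key pair is generated for the phone; the public key is registered
# with the certificate authority under the phone's hardware ID.
phone_private_key = ec.generate_private_key(ec.SECP256R1())
registry = {"hw:356938035643809": phone_private_key.public_key()}   # stand-in for the CA

# Phone side: sign the card/authentication payload before encoding it in the artwork.
payload = b"hw:356938035643809|card=...|context=..."
signature = phone_private_key.sign(payload, ec.ECDSA(hashes.SHA256()))

# POS side: look up the public key for the claimed hardware ID and verify.
public_key = registry["hw:356938035643809"]
try:
    public_key.verify(signature, payload, ec.ECDSA(hashes.SHA256()))
    print("payload provably originated from the registered phone")
except InvalidSignature:
    print("signature check failed - reject the transaction")
```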
The context based authentication data can also be encrypted with the private key, and decoded with the corresponding public key obtained from the certificate authority. In this case, since context-based authentication data is encrypted with a key that is tied to the device (e.g., via an IMEI identifier through a certificate authority), then this authentication data is logically bound to both the context and the user device.
Physically unclonable functions (PUFs) can also be utilized to provide confidence that the observed optical event (imager of the cooperating device) has not been spoofed. These may include, but are not limited to, shot-noise and temporal noise of the camera, properties of the image processing pipeline (compression artifacts, tonal curves influenced by Auto White Balance or other operations), etc. In addition, properties of the display of the mobile device can be used for this same purpose, such as dead pixels or fluctuations of display brightness as a function of time or power.
(Patent 7,370,190 provides additional information about physically unclonable functions, and their uses - technology with which the artisan is presumed to be familiar.)
It will be recognized that prior art transactions with conventional credit cards, based on magnetic stripe data, offer none of the security and authentication benefits noted above. The technologies described herein reduce costs and space requirements at checkout by eliminating need for mag stripe readers or RFID terminals. While "chip card" arrangements (sometimes termed "smart cards") offer a variety of digital security techniques, they require specialized interface technology to exchange data with the chip - interface technology that has no other use. The just-described implementations, in contrast, make use of camera sensors that are
commonplace in smartphones and tablets, and that are being increasingly deployed by retailers to read barcodes during checkout. This means that the marginal cost of reading is software only, in that hardware reader requirements are consistent with industry trends towards image capture at retail checkout, thereby exploiting a resource available at no marginal cost to implementers of the present technology. Notably, the reader function could be implemented in hardware as well, if doing so would provide superior cost effectiveness. The same imager-based readers could read other indicia, such as QR codes, authenticate digitally-watermarked driver licenses, and OCR relevant text.
Similarly, the system is more economical than all magnetic stripe and RFID systems because no physical cards or chips are required. (This is a particular savings when contrasted with chip card systems, due to the microprocessors and gold-plated interfaces typically used in such cards.) Nor is there any cost associated with distributing cards, confirming their safe receipt, and attending to their activation. Instead, credentials are distributed by electronically sending a file of data corresponding to a wallet card - encrypted and digitally signed by the issuing bank - to the phone, and using that file data to add the card to the smartphone wallet. The installation and activation of the card can be tied to various unique aspects of the device and/or user characteristics, such as, for example, a hardware ID or a hash of user history or personal characteristics data. A still further advantage is that the present technology is helpful in alleviating piriformis syndrome. This syndrome involves inflammation of the sciatic nerve due to pressure in the gluteal/pelvic region. A common cause of such pressure is presence of a large wallet in a person's rear pocket, which displaces customary pelvic alignment when sitting. By removing physical cards from a user's wallet, the wallet's volume is reduced, reducing attendant compression of the sciatic nerve. Elimination of the wallet requirement also improves security and convenience of payment processing for users.
Presentation of Multiple Cards
The arrangements just-described involved presentation of a single card - a payment card.
Sometimes plural cards are useful. One example is where a merchant offers discounts on certain items to users who are enrolled in the merchant's loyalty program. Another is where an airline offers a discount on checked luggage fees to fliers who are members of its frequent flier program. In accordance with a further aspect of the technology, the UI of the payment module on the user's smartphone permits selection of two or more cards from the virtual wallet. One is a payment card, and the other may be a loyalty ("merchant") card. Data corresponding to both cards may be optically conveyed to the cooperating system via the artwork presented on the display of the user's smartphone.
Fig. 8 shows one such user interface. As before, the user flips through the deck of virtual wallet cards to find a first desired card. Instead of the user tapping the card for selection, a sweeping gesture is used to move the virtual card above the deck (as shown by the Visa card in Fig. 8), while the rest of the virtual deck slides down to make room. The user then continues flipping through the deck to locate a second card, which is selected by tapping. As a
consequence of these actions, the phone screen presents artwork representing both the selected payment card, and the other (merchant) card, as shown in Fig. 9.
As before, information encoded in the displayed artwork is sensed by a camera of a cooperating system, and is used in connection with a transaction. The payment card information may be encoded in the portion of the artwork corresponding to the payment card, and likewise with the merchant card information. Or information for both cards can be encoded throughout the displayed imagery (as can the authentication information).

Fig. 10A shows another style of user interface permitting selection of multiple wallet cards. Here, thumbnails of different cards are organized by type along the right edge: payment cards, loyalty cards, gift and coupon cards, and cents-back cards. (Cents-back cards serve to round up a transaction amount to a next increment (e.g., the next dollar), with the excess funds contributed to a charity.) This right area of the depicted UI is scrollable, to reveal any thumbnails that can't be presented in the available screen space.
Desirably, the thumbnails presented on the right side of the UI are ordered so that the card(s) that are most likely to be used in a given context are the most conspicuous (e.g., not partially occluded by other cards). For example, in a Safeway store (as determined by GPS data, cross-referenced against map data identifying what businesses are at what locations; or as indicated by a sensed audio signal - such as detailed in Shopkick's patent application US 2011-0029370 A1), the Safeway loyalty card would be most readily available. Similarly, if a shopper historically tends to use a Visa card at the Safeway store (perhaps because the issuing bank issues triple miles for dollars spent at grocery stores), then the Visa card thumbnail would be positioned at a preferred location relative to the other payment card options. Forward chaining of inference can be used to predict which cards are most likely to be used in different situations.
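A minimal sketch of such context-based ordering follows. It uses a simple additive score rather than the forward chaining of inference the text contemplates, and the card and context field names are illustrative assumptions.

```python
# Illustrative ordering of wallet-card thumbnails by contextual likelihood.
# Field names and weights are assumptions; a deployed wallet might instead use
# forward-chaining inference over purchase history as described above.
def order_thumbnails(cards, context):
    def score(card):
        s = 0.0
        if card.get("merchant") == context.get("store"):            # e.g., Safeway loyalty card in a Safeway store
            s += 10.0
        s += 5.0 * card.get("use_rate_by_store", {}).get(context.get("store"), 0.0)
        if card.get("bonus_category") == context.get("store_category"):  # e.g., triple-miles Visa at a grocer
            s += 3.0
        return s
    return sorted(cards, key=score, reverse=True)

# Example context: {"store": "Safeway", "store_category": "grocery"}
```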
To use this form of interface, the user slides thumbnails of selected cards towards the center of the screen, where they expand and stack, as shown in Fig. 10B. The user may assemble a recipe of cards including a credit card, a pair of coupon cards, a gift card, a loyalty card, and a cents-back card, while the grocery clerk is scanning items. Once the desired deck of cards is assembled, the deck is single-tapped (or in another embodiment double-tapped) to indicate that the user's selection is completed. The displayed artwork is again encoded with information, as described earlier, for optical reading by a cooperating system. As shown in Figs. 10A and 10B, the artwork can include a background pattern 102, and this background pattern can also be encoded (thereby expanding the payload size and/or increasing the encoding robustness).
A visual indicia can be presented on the screen indicating that the artwork has been steganographically-encoded, and is ready to present for payment. For example, after the user has tapped the stack, and the artwork has been encoded, dark or other distinctive borders can appear around the card depictions.
A user interface can also be employed to split charges between two payment cards. Both cards may be in the name of the same person, or cards from two persons may be used to split a charge. (One such example is a family in which a weekly allowance is issued to teens by deposits to a prepaid debit card. A parent may have such a debit card for a teen in their smartphone wallet, and may occasionally agree to split the costs of a purchase with the teen.)
As shown in Fig. 10C, the artwork presented in one such UI case includes a hybrid card - a graphic composed partly of artwork associated with one card, and partly of artwork associated with another card. At the junction of the two parts is a dark border, and a user interface feature 103 that can be touched by the user on the touch screen and slid right or left to apportion a charge between the two cards in a desired manner. The illustrated UI shows the split detailed in percentages (30%/70%), but a split detailed in dollars could alternatively, or additionally, be displayed.
Auctioning Transaction Privileges:
Consider a shopper who populates a shopping cart - either physical or virtual. The cart's total is determined and presented via a device user interface (UI). Stored in device memory is an electronic payment module (or "virtual wallet") comprising software instructions and/or libraries that cause the device to present the user interface (UI) on the display.
This particular user has many different payment options associated with her virtual wallet, e.g., various credit accounts, credit cards, BitCoin credit, store cards or rewards, PayPal account(s), checking and/or savings account(s), etc. The virtual wallet may also include, e.g., frequent flyer account information, reward program information, membership information, loyalty membership information, coupons, discount codes, rebates, etc.
The user may indicate through the UI that she is ready to check out and purchase the cart items. If the UI cooperates with a touchscreen interface the user may indicate by touching the screen, flipping through various screens, scrolling, checking boxes, selecting icons, etc. In response, an auction is launched to determine which financial vendor associated with her virtual wallet will facilitate the financial transaction. In other cases, a solicitation of offers is launched to gather offers from the financial vendors associated with her virtual wallet. The virtual wallet can launch the solicitation or auction in a number of ways.
For example, the virtual wallet can communicate with the various financial vendors associated with the user's different payment options. Cart total and contents, store and user location(s), user credit history, etc. can be forwarded to the different financial institutions to consider as they bid to facilitate the user's transaction. If the cart's total is $97.23, American Express may, for example, decide to offer a discount to the user if she uses her American Express account. With the discount the transaction total may now only cost the user, e.g., $92.37. American Express may decide to offer the discount in exchange for promotional or marketing opportunities, pushing targeted advertisements or providing other opportunities to the user during or after the transaction. Or American Express may have a discount arrangement with the store from which the user is shopping, e.g., Target or Amazon.com, and/or a discount arrangement for certain of the cart items. A portion of the discount can be passed along to the user. American Express may base a decision to bid - and the amount of any discount associated with such bid - on a number of factors, e.g., the user's credit history with their American
Express account, their overall credit history, a length of time since the user used the account, the user's past response to targeted advertising, agreements with retailers or distributors, the user's demographics, promotion or marketing opportunities to the user, etc.
During the auction another creditor, e.g., PayPal's BillMeLater, may decide based on the user's credit history that she is a solid risk. So BillMeLater low-balls the bid, offering a bargain-basement cost of $82.19 for the purchase, but couples its bid with a requirement that the user accept establishment of, or an increase in, a line of credit.
Another creditor may promise a discount plus a certain number of reward or mileage points if the user selects it for the transaction. Still another may bid or offer an extended warranty if the purchase is made through it.
The auction can be time-limited so bids must be submitted within a certain response time. In other cases, the user can be preapproved for certain deals or promotions based on her location, which will help reduce auction time. For example, the virtual wallet may determine that the phone is currently located in Wal-Mart or Target. Location information can be determined from user input (e.g., entering into the virtual wallet - or selecting from a screen pull-down or flip-through - that the user is currently shopping in Wal-Mart), GPS information (e.g., coupled with a search of GPS coordinates), environmental information sensed by the user device upon entering the store (e.g., image recognition from recent camera pictures, analyzing digitally watermarked audio playing in a store, calculating audio fingerprints of ambient audio, audio beacons like Apple's iBeacons, Wi-Fi network information, etc.), etc. The virtual wallet can start to solicit bids from financial vendors associated with the virtual wallet or user as soon as the virtual wallet determines that the user is in a retail establishment, even though the user has not finished populating her cart and is not yet located at checkout. Incoming bids may then be based on all or some of the above factors, e.g., credit history, promotion opportunities, available discounts, etc., and less on the actual cart contents.
The virtual wallet can also start an auction or solicit offers when the first (or other) item is added to the cart.
The virtual wallet can also receive pre-authorization or firm bids from financial vendors. For example, Bank of America may decide to offer the user a 3% discount for all in-store purchases at Wal-Mart made during the upcoming weekend. The virtual wallet stores this information and can present the offer if and when the user finds herself in Wal-Mart. The pre-authorization may include or link to promotional opportunities to be displayed during or after purchase.
The user can select from the various bids to determine which financial vendor will facilitate her transaction. For example, a double tap on a graphic with the desired bid can initiate the transaction. The user can be prompted to confirm the transaction if desired.
The virtual wallet can be user-configured to present only those bids meeting certain criteria. For example, through a settings screen or user interface, the user may decide that she only wants to see and consider the top 2 or 3 bids with cash-only discounts; such a setting will result in the user interface only presenting such top bids. Or the user may be interested in mileage rewards, or credit opportunities, and these will be presented in the top bids. Or the user can decide NOT to be bothered with the decision and may select a "best-deal" mode where the virtual wallet selects a bid based on a plurality of factors including, e.g., deepest discount, best long-term financing, and/or proximity to reward levels (e.g., the user needs only 5000 more mileage points to qualify for a trip to Hawaii). Such factors may be weighted according to user preference, and a top bid can be determined as the one with the highest overall weighting. (E.g., 10 points if the bid includes the deepest discount, 1 if it includes the least discount; 8 points if the bid includes free long-term financing, 1 if it doesn't; 5 points if the bid includes reward points, 0 if it doesn't; 10 points if the user has selected this payment option recently, 1 if she hasn't; 9 points if the user has a low balance on the credit account, 0 if she is near her credit limit; etc., and/or other weighting schemes.)

A virtual wallet may also be configured to track reward status. E.g., if a newly purchased TV is defective, and a user takes it back for a refund, a merchant may communicate with a virtual wallet (or a financial vendor represented in the virtual wallet) to issue a credit. The refund may result in reward points being pulled from a rewards account. This information may be reflected in the virtual wallet.
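The "best-deal" weighting described above might be sketched as follows; the bid field names and the point values (which mirror the example in the text) are illustrative assumptions.

```python
# Sketch of "best-deal" mode: each bid receives a weighted score and the top-scoring
# bid is selected. Field names are assumptions; point values follow the text's example.
def score_bid(bid, user_state):
    score = 0
    score += 10 if bid["deepest_discount"] else 1
    score += 8 if bid["free_long_term_financing"] else 1
    score += 5 if bid["includes_reward_points"] else 0
    score += 10 if user_state["used_recently"].get(bid["vendor"], False) else 1
    score += 9 if user_state["balance_ratio"].get(bid["vendor"], 1.0) < 0.3 else 0
    return score

def best_deal(bids, user_state):
    return max(bids, key=lambda b: score_bid(b, user_state))
```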
The virtual wallet may also communicate with a broker or intermediary service. The broker or intermediary service can aggregate information, vendor bids, pre-authorizations, promotions, advertising, etc., and associate such with a user or user device. In operation, the virtual wallet communicates with the broker, who communicates (and may itself generate) various bids and promotion opportunities back to the virtual wallet.
Auctions associated with the virtual wallet are not limited to retail checkout locations. The virtual wallet can help find better deals on many other items and services.
For example, a user can prompt the virtual wallet that they need gas. This may cause the virtual wallet to launch a search, auction and/or solicitation for the best possible deals. The auction can consider the various cards and memberships that the user has in her wallet. For example, a user's wallet may include a Chevron rewards card and an American Express account. This information can be communicated to various financial vendors including Chevron and American Express (or their intermediaries). An incoming bid may be presented to the mobile device including additional gas points on the Chevron rewards card and/or a discount if the American Express card is used. If a local Chevron Station is running a promotion, such information can be communicated to the virtual wallet for presentation to the user as well.
In some cases, the virtual wallet can be configured to communicate some or all details about a bid to a competing financial vendor - making the auction even more transparent to participating vendors. A competing vendor may decide to alter its initial bid to sweeten the deal. For example, Shell may decide that it doesn't want to be outbid by Chevron, and may send the virtual wallet a bid that is lower, includes more rewards, or otherwise entices the user. Shell's response can be sent back to Chevron, or Chevron's intermediary, who may decide to sweeten their bid in response.
In some cases, the auction can be geographically constrained, e.g., only gas stations within a pre-determined number of miles from a user are considered for an auction. The virtual wallet can determine which stations meet this location criterion by cooperation with one of the many available software apps that determine such stations based on a user's location (e.g., Google Maps, GasBuddy, etc.). Once a station is chosen, the virtual wallet may launch mapping software on the mobile device and pass the winning station's address or GPS coordinates into the mapping software, so that the user can have step-by-step driving directions to the station.
Alternatively, the destination address, or the turn by turn instructions, can simply be passed to the control system of a self-driving vehicle, which can drive itself to the gas station, and complete the transaction.
Instead of a user prompting the virtual wallet that she needs gas, the virtual wallet may initiate an auction or solicitation based on other factors. For example, GPS coordinates may indicate that the user is located at or approaching a gas station. An auction may be launched based on such proximity information.
Cars are becoming smarter and smarter. Cars are already available with low fuel warnings, low tire pressure warnings, service engine warnings, etc. Such warnings may be communicated to the user's device (e.g., via a Bluetooth pairing between the car and mobile phone) and used by the virtual wallet to initiate an auction to provide the best deals to address the warning.
Of course, the virtual wallet need not completely reside on a user's smartphone. For example, components may be distributed to the cloud, or to other available devices for processing. In the above example, a virtual wallet may hand off direction-finding to a car's onboard computer and let it handle some or all of the navigation. In other cases, a wallet shell resides on the cell phone. In this embodiment, the shell includes, e.g., graphic drivers and user interfaces to allow device display, user input and communication with a remote location. Credit card information and other wallet contents are stored remotely, e.g., in the cloud.
A virtual wallet may cause a digital watermark detector (or fingerprint generator) to analyze background audio in a background collection mode. For example, once operating in this background mode a detector or generator may analyze audio accompanying radio, internet, TV or movies, all to decode watermarks (or calculate fingerprints) without requiring human intervention. The audio may include watermarks (or be processed to yield fingerprints) that link to information associated with advertising, store promotions, coupons, etc. (Instead of audio, the background collection mode may capture video or still imagery; such video or imagery may be processed to yield information.) This information can be stored in the virtual wallet, e.g., according to store identifier, location, event, etc. In other embodiments, this information is stored in the cloud for access by the virtual wallet. Later, when the virtual wallet enters a store (or comes in proximity of a remote checkout terminal, e.g., a computer), the virtual wallet can receive location or retail information, e.g., included in a signal emanating from an iBeacon or audio source, or captured from imagery provided by the store (e.g., an in-store display, poster, etc.). The virtual wallet may use received location or retail information to search through stored or previously encountered audio- or video-derived information. The virtual wallet can prompt the user if discounts, coupons or promotions are found, and may apply any such discounts/coupons at checkout. The virtual wallet may also access a store map or in-store product location to help the user navigate to those products for which the virtual wallet has discounts or coupons. These may correspond to previously encountered advertising which the wallet has collected or caused to be stored.
Message Payloads and More
Some embodiments benefit from using a relatively large payload (e.g., 500-2,500 bits) during a virtual wallet transaction. The payload can be carried in a digital watermark that is embedded in displayed imagery or video, encoded in hearing-range audio, or transmitted using a high frequency audio channel. The payload may correspond with credit card or financial information (e.g., ISO/IEC 7813 information like track 1 and track 2 information), account information, loyalty information, etc. Payload information may be stored or generated locally on a smartphone, or the smartphone may query a remotely-located repository to obtain such. In some cases the remotely located repository provides a 1-time token which can be used for a single (sometimes application-specific) transaction. In some cases a token replaces or is a proxy for a credit card or account number, and is conveyed as payload information. A receiving party (e.g., a party receiving a payload) can transmit the 1-time token to a 3rd party clearing house (which may or may not be the remotely located repository) to facilitate payment using the 1-time token. The 1-time token can be cryptographically associated with a user account or user payment.
Now consider encoded, displayed imagery. A user presents their portable device to a point of sale station which includes an optical reader or digital camera. In some cases the point of sale station is a portable device, e.g., like a smartphone, pad or tablet. The user's portable device displays digital watermarked imagery on the device's display for capture by the station's reader or camera. The displayed imagery can be a still image, e.g., an image or graphic representing a credit card, a picture of the family dog, an animation, etc. A virtual wallet can be configured to control the display of the image or graphic so that multiple frames (or versions) of the same still image or graphic are cycled on the display. Preferably, the displayed images appear as if they are collectively a static image, and not a video-like rendering. Each instance of the displayed image or graphic (or groups of images) carries a payload component. For example, a first displayed image carries a first payload component, a second displayed image carries a second payload component...and the nth-displayed image carries an nth-payload component (where n is an integer). Since the only change to each displayed image is a different payload component, which is generally hidden from human observation with digital watermarking, the displayed images appear static - as if they are collectively a single image - to a human observer of the smartphone display. A decoder, however, can be configured to analyze each separate image to decode the payload component located therein.
The payload components can take various forms. In a first embodiment, a relatively large payload is segmented or divided into various portions. The portions themselves can be used as the various components, or they can be processed for greater robustness, e.g., error correction encoded, and then used as the various payload components. For example, once the whole payload is segmented, a first portion is provided as the first payload component, which is embedded with digital watermarking in the first image for display, a second portion is provided as the second payload component, which is embedded with digital watermarking in a second image for display, and so on. Preferably, each of the various payload portions includes, is appended to include, or is otherwise associated or supplemented with a relative payload position or portion identifier. This will help identify the particular payload portion when reassembling the whole payload upon detection.
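A minimal sketch of this display-side segmentation follows, assuming a simple two-byte "portion i of n" header prepended to each component; the header layout is an assumption, not a mandated format.

```python
# Split a large payload into fixed-size portions and prepend an (index, total) header
# to each, so the detector can reassemble them in order. Assumes fewer than 256 portions.
def segment_payload(payload: bytes, portion_size: int):
    portions = [payload[i:i + portion_size] for i in range(0, len(payload), portion_size)]
    n = len(portions)
    return [bytes([idx, n]) + p for idx, p in enumerate(portions)]
```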
A watermark detector receives image data depicting a display (e.g., a smartphone display) captured over time. Capture of imagery can be synchronized with the cycled, displayed images. The watermark detector analyzes captured images or video frames to detect digital watermarks hidden therein. A hidden digital watermark includes a payload component. In the above first embodiment, the payload component corresponds to a payload portion and carries or is accompanied by a portion identifier (e.g., 1 of 12, or 3 of 12, etc.). The watermark detector, or a processor associated with such detector, combines decoded payload components and attempts to reconstruct the whole payload. For example, the payload portions may need simply to be concatenated to yield the entire payload. Or, once concatenated, the payload may need to be decrypted or decoded. The detector or processor tracks the portion identifiers, and may prompt ongoing detection until all payload portions are successfully recovered. If the detector misses a payload component (e.g., 3 of 12), it preferably waits until that component is cycled back through the display and successfully captured and decoded, or it may communicate to the display that it needs, e.g., payload component 3 of 12.
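A complementary detector-side sketch, accumulating decoded components by their portion identifiers until the whole payload can be concatenated (it assumes the same two-byte header as the segmentation sketch above):

```python
# Accumulate decoded components keyed by portion identifier; return the reassembled
# payload once every portion 0..n-1 has been seen, else None to keep watching frames.
def accumulate(portions_seen: dict, decoded_component: bytes):
    idx, total = decoded_component[0], decoded_component[1]
    portions_seen[idx] = decoded_component[2:]
    if len(portions_seen) == total:
        return b"".join(portions_seen[i] for i in range(total))
    return None
```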
From a display side, if the whole payload is carried by 12 payload components, corresponding to 12 embedded image versions (each individual image version carrying one of the 12 payload components), then the 12 image versions can be repeatedly cycled through the display, e.g., for a predetermined time (e.g., 3-30 seconds) or until stopped by the user or point of sale station communicating a successful read back to the virtual wallet. If the display has a frame rate of 24 frames per second, then the 12 embedded image versions can be collectively cycled twice per second (or more or less depending on display frame rates).
In other embodiments for carrying a relatively large payload in displayed imagery, we use signal coding techniques known as erasure codes and/or rateless codes. One example of these codes is the so-called "fountain codes." For example, see, e.g., MacKay, "Fountain codes," IEE Proc Commun 152(6): 1062-1068, December 2005. See also US Patent No. 7,721,184.
To quote MacKay, from the above referenced paper, "Abstract: Fountain codes are record-breaking sparse-graph codes for channels with erasures, such as the internet, where files are transmitted in multiple small packets, each of which is either received without error or not received. Standard file transfer protocols simply chop a file up into K packet sized pieces, then repeatedly transmit each packet until it is successfully received. A back channel is required for the transmitter to find out which packets need retransmitting. In contrast, fountain codes make packets that are random functions of the whole file. The transmitter sprays packets at the receiver without any knowledge of which packets are received. Once the receiver has received any N packets, where N is just slightly greater than the original file size K, the whole file can be recovered. In the paper random linear fountain codes, LT codes, and raptor codes are reviewed. ... 2. Fountain Codes. The computational costs of the best fountain codes are astonishingly small, scaling linearly with the file size. The encoder of a fountain code is a metaphorical fountain that produces an endless supply of water drops (encoded packets); let us say the original source file has a size of Kl bits, and each drop contains 1 encoded bits. Now, anyone who wishes to receive the encoded file holds a bucket under the fountain and collects drops until the number of drops in the bucket is a little larger than K. They can then recover the original file. Fountain codes are rateless in the sense that the number of encoded packets that can be generated from the source message is potentially limitless; and the number of encoded packets generated can be determined on the fly. Fountain codes are universal because they are simultaneously nearoptimal for every erasure channel. Regardless of the statistics of the erasure events on the channel, we can send as many encoded packets as are needed in order for the decoder to recover the source data. The source data can be decoded from any set of K0 encoded packets, for K0 slightly larger than K. Fountain codes can also have fantastically small encoding and decoding complexities."
One advantage of a fountain code is that a detector need not communicate anything back to a transmitter about which payload portions, if any, are missing. For example, fountain codes can transform a payload into an effectively large number of encoded data blobs (or components), such that the original payload can be reassembled from any subset of those data blobs, as long as the amount of data recovered equals, or slightly exceeds, the size of the original payload. This provides a "fountain" of encoded data; a receiver can reassemble the payload by catching enough "drops," regardless of which ones it gets and which ones it misses.
We can use erasure codes (e.g., fountain codes) to convey a relatively large payload for use with displayed imagery. For example, the relatively large payload can be presented to a fountain code encoder, which creates a plurality of encoded data blobs (e.g., encoded components). In some cases, each encoded data blob is accompanied by an index or seed. The index or seed allows the decoder to use a complementary decoding procedure to reconstruct the payload. For example, the encoder and decoder may agree on a pseudo-random number generator (or an index-based matrix generator). In one example, the generator includes an nxn random-bit non-singular matrix, where n is the payload's bit length. The matrix can be multiplied with the payload (a dot product) to yield outputs y1...yN. An index can be associated with each yN output, to allow reconstruction by the decoder. In another example, we can seed a generator with a randomly chosen index, and use that to pick a degree and set of source blocks. An encoded data blob is sent with the seed or index for that encoded block, and the decoder can use the same procedure to reconstruct the payload from received blobs/indexes.

Another example is considered with reference to Fig. 16. Payload 170 is presented to a Fountain Code Generator 171. Of course, other types of erasure code generators may be used instead, e.g., Raptor codes or LT codes (Luby Transform codes). The payload 170 can be a relatively large payload (e.g., in comparison to other, smaller digital watermarking payloads). Payload 170 preferably includes, e.g., 500-8k bits. (Raptor and LT codes may be helpful when using even larger payloads, e.g., greater than 8k bits.) One specific example is a payload including 880 bits. Payload 170 may include or may be appended to include additional error correction bits, e.g., CRC bits. Additional CRC bits can be added to the 880-bit payload example, e.g., 32 additional bits.
Fountain Code Generator 171 produces a plurality of coded outputs (or data blobs), Y1...YN, where N is an integer value. Data blob outputs are provided to a Digital Watermark Embedder 172. Digital Watermark Embedder 172 uses the data blob outputs as payloads to be respectively hidden in image versions (I1-IN). The term "image version" may correspond to a copy or buffered version of a static (or still) Image (I) 174 that the user (or virtual wallet) has selected to represent a financial account or credit card or the like. Instead of being a copy of a still image, an image version may correspond to a video frame or video segment. Digital Watermark Embedder 172 embeds a data blob (e.g., Y1) in an image version I1 and outputs such (resulting in watermarked image version Iw1) for display by Display 173. Digital Watermark Embedder 172 continues to embed data blobs in image versions, e.g., Y2 in I2 and output (Iw2) for display, Y3 in I3 and output (Iw3) for display, and so on. Parallel processing may be advantageously used to embed multiple image versions in parallel. In alternative arrangements, Digital Watermark Embedder 172 delegates embedding functions to other units. For example, Display 173 may include or cooperate with a GPU (graphics processing unit). Digital Watermark Embedder 172 may determine watermark tweaks (or changes) corresponding to embedding an output data blob in an image version and pass that information on to the GPU, which introduces the changes in an image version. In other cases, Digital Watermark Embedder 172 may calculate a watermark tile (e.g., a watermark signal representing an output data blob) and convey such to another unit like the GPU. The GPU may then consider other factors like a perceptual embedding map or human attention model and introduce the watermark tile in an image version with consideration of the map or model. (In Fig. 16, it should be understood that the Fountain Code Generator 171, Digital Watermark Embedder 172 and Image (I) 174 may be housed and operated in a portable device like the smartphone which includes Display 173. In other configurations, a portable device hosting the Display 173 communicates with a remotely-located device that hosts the Fountain Code Generator 171, Digital Watermark Embedder 172 and/or Image 174.)
Embedded image versions Iw1...IwN may be stored or buffered for cycling for display on Display 173. For example, if 24 image versions are embedded with data blobs, and if Display 173 has a frame rate of 24 frames per second, then the 24 embedded image versions can be collectively cycled once per second (each image version is shown for 1/24th of a second).
Embedded image versions can be repeatedly cycled through the display one after another, e.g., for a predetermined time (e.g., 5-10 seconds) or until stopped by the user or point of sale terminal. For example, the user or terminal may communicate a successful read to the virtual wallet, which terminates the display. To a human observer of the cycled images, it appears that a static image is being displayed, since the only changes among the different image versions are the digital watermarks, which are generally imperceptible to the human eye. This can be referred to as a "static image display effect".
Returning to Fountain Code Generator 171, one configuration includes a non-singular random binary nxn matrix, where n is the payload's bit length. So, for the above 880-bit payload (912 bits including CRC bits) example, a 912x912 matrix is provided. The matrix can be multiplied with the payload (912 bits) to yield y1-yN outputs. Continuing this example, the fountain code outputs each include, e.g., 120 bits. A matrix index can be combined with the outputs, including, e.g., 5 additional bits per output. The index can be specifically associated with individual outputs yN, can be associated with a group of y outputs, and/or can be associated with the matrix itself. The 125 bits can be error protected, e.g., by appending CRC bits (e.g., 24 bits, for a total output data blob YN bit count of 149 bits per data blob). Error protection can be provided by the Fountain Code Generator 171 or the Digital Watermark Embedder 172, or both. For a typical application, about 6-180 data blobs can be used to reconstruct a message. In the 880-bit payload example, if 32 output blobs are used, then 32 corresponding image versions (each individual image version having one of the 32 data blobs digitally watermarked therein) can be embedded in separate versions of the image for display on the smartphone as discussed above. Instead of operating in a bit-by-bit manner, the Fountain Code Generator 171 can be configured to operate on longer codes, such as with Galois Fields (e.g., GF(256)), discussed in US Patent Nos. 7,412,641, 7,971,129 and 8,006,160.
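By way of a hedged illustration, the following sketches one erasure-code construction consistent with the foregoing - a seed-indexed random linear fountain over equal-length byte blocks - rather than the specific 912x912 matrix configuration; the block size, seed scheme and subset selection are assumptions.

```python
# Random linear fountain sketch: each output "data blob" is the XOR of a seed-selected
# subset of equal-length payload blocks, transmitted with its seed so the decoder can
# regenerate the same subset and solve for the blocks (e.g., by Gaussian elimination).
import random

def fountain_blob(blocks, seed):
    rng = random.Random(seed)
    subset = [i for i in range(len(blocks)) if rng.random() < 0.5] or [0]  # non-empty subset
    out = bytearray(len(blocks[0]))
    for i in subset:
        for j, b in enumerate(blocks[i]):
            out[j] ^= b
    return seed, bytes(out)        # (index/seed, encoded blob) -> one watermark payload

# Example: pad the payload to a multiple of 15 bytes (120 bits), split into blocks,
# and generate 32 blobs for 32 image versions:
#   blocks = [payload[i:i+15] for i in range(0, len(payload), 15)]
#   blobs = [fountain_blob(blocks, s) for s in range(32)]
```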
From a detector side, e.g., analyzing image data representing some or all of the embedded image versions Iw1-IwN displayed on the Display 173, reconstruction of the payload can begin as soon as a data blob has been decoded from a digital watermark. That is, not all data blobs need to be recovered before payload reconstruction is initiated with a corresponding erasure code decoder (e.g., in one above example, a corresponding non-singular matrix).
Of course, different payload sizes, error correction bit size and techniques, image version numbers, data blob outputs, intermediate outputs and erasure code generator configurations can be used. Thus, the above examples and embodiments are not intended to be limiting.
Additionally, a payload may be segmented prior to fountain code encoding, with each segment having a corresponding number of output blobs. And, other related coding schemes can be used with cycling imagery (including video frames) such as Raptor codes and LT codes.
And, of course, different watermark embedding strengths can be used. A relatively higher strength may affect visibility. To help offset visibility, we can use a human perceptibility map, where an image is analyzed to find areas that will effectively hide a digital watermark and/or to identify those areas which may result in visual artifacts if a digital watermark is hidden therein. A map can be created to avoid such poor hiding areas, or to embed in those areas at a relatively lower embedding strength. Calculating a perceptual map takes processing resources. To avoid calculating a map for each embedding instance of the same image, a map can be reused. For example, in the above Fig. 16 example, the Digital Watermark Embedder 172 may consult a perceptual map to help guide embedding. When using a still image, and since multiple versions of Image (I) 174 are being used, each of which preferably includes the same image content, a perceptual map can be calculated once and then reused for each embedding of the image versions. In some cases, the map can be generated as soon as a user identifies an image to be used as a transaction graphic, e.g., during registration or virtual wallet setup, which occur prior to transactions.
Another way to avoid visual perceptibility of embedded watermarks is to vary embedding strengths based on timing or device sensor feedback. For example, a user may instruct their virtual wallet to display an image for optical sensing. The displayed, cycled images may be embedded with a relatively lower embedding strength for a predetermined time, e.g., the first 0-3 seconds, which may correspond to the average time it takes a user to present the smartphone display to an optical reader. Then, for a second time period, e.g., for the next 3-7 seconds, the watermark strength of the displayed, cycled images is pumped up to a relatively stronger level, since the display will be pointed at the optical reader, away from human observation.
Instead of using predetermined time periods, the embedding strength may depend on device sensor feedback. For example, after initiating display of imagery, the smartphone may use gyroscope information to make embedding strength decisions. For example, after a first movement (corresponding to positioning the display toward an optical reader), the embedding strength may be increased, and after one or more further movement detections, the embedding strength may be decreased (e.g., corresponding to movement away from the camera). Of course, such gyroscope movements can be analyzed to identify user tendencies, and the embedder can be trained to recognize such movements to optimize watermark embedding strength.
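A minimal sketch of such strength adaptation follows; the thresholds, time windows and gyroscope-magnitude test are all illustrative assumptions.

```python
# Choose a watermark embedding strength from elapsed display time and gyroscope feedback:
# weak while the screen is likely under the user's gaze, strong once it is presumed to be
# aimed at the optical reader. All numeric values are placeholders.
def embedding_strength(elapsed_s, gyro_magnitude, moved_once):
    if gyro_magnitude > 1.5:            # device still being repositioned
        return 0.3
    if moved_once or elapsed_s > 3.0:   # display presumed to face the reader
        return 1.0
    return 0.3
```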
Some operating systems limit user accessibility to camera-captured imagery to accommodate, e.g., post-processing. For example, a user may only have access to 24-30 fps. In one embodiment, a watermark detector is given access to a higher frame rate, e.g., 70-120 frames per second. Watermark embedding and detection are synchronized such that the digital watermarking can only be read at this higher frame rate. In other cases, additional information is obtained from the high frame rate detection, while still embedding some information for detection at a lower, standard frame rate.
Some of the above embodiments discuss a virtual wallet operating on a smartphone to cause display of a relatively large payload. Our inventive techniques can be applied in a reverse manner, e.g., to a point of sale display which displays cycling imagery to a user's smartphone. A payload can be communicated from the point of sale to a smartphone's virtual wallet. This may be used as a confirmation of a transaction, or it may be a transaction identifier which can be communicated by the smartphone to a 3rd party (e.g., a credit card vendor, a PayPal-like service, etc.). The transaction identifier can be supplemented with account information by the virtual wallet to identify an account associated with the virtual wallet. The 3rd party uses the transaction identifier and the account information to facilitate payment to the vendor. A confirmation of payment can be transmitted to the vendor (e.g., from information included or associated with the transaction identifier) and/or the virtual wallet. Some users may prefer this system since financial information is not transmitted from the user to the retailer, but from the retailer to the user, to the 3rd party.
In another embodiment, we use high frequency audio to convey a relatively large payload for use in a virtual wallet transaction. For example, a smartphone includes a transmitter (e.g., a speaker). The transmitter emits high frequency audio to a receiver. The high frequency audio includes a relatively large payload. At point of sale checkout, the smartphone is positioned in proximity of a receiver at the point of sale location. High frequency audio is emitted from the smartphone, which is received by the point of sale receiver. The payload is decoded from the received audio, and the transaction proceeds. The high frequency audio encoding and
transmission techniques disclosed in Digimarc's application no. 14/054,492, filed October 15, 2013, can be used in these virtual wallet applications.
A high frequency (HF) audio channel or an audible audio channel can be used to establish bi-directional communication between a virtual wallet and a point of sale location. A financial transaction can proceed once communication is established. For example, a virtual wallet can cause its host smartphone to transmit a known high frequency audio message, e.g., a message known to both the virtual wallet and to a receiver. The receiver determines signal errors or a measure of signal error and communicates such back to the smartphone. The return communication can use Bluetooth, high frequency audio, radio frequency or audible-range audio, or the like. The virtual wallet uses this return error signal to adjust (e.g., increase or decrease), if needed, the level of error correction and/or signal strength for its next transmitted audio signal, e.g., when transmitting a payload. The payload may correspond to various information including account information, encrypted information and/or tokens as discussed above.
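A minimal sketch of that feedback adjustment, with step sizes and bounds chosen arbitrarily for illustration:

```python
# Adjust forward-error-correction level and output gain for the next HF-audio
# transmission, based on the error measure the receiver reported for the known
# reference message. Targets, steps and bounds are illustrative assumptions.
def adjust_tx(params, reported_error_rate, target=0.01):
    if reported_error_rate > target:
        params["fec_level"] = min(params["fec_level"] + 1, 5)   # add redundancy
        params["gain_db"] = min(params["gain_db"] + 2, 0)       # raise level (0 dBFS max)
    elif reported_error_rate < target / 4:
        params["fec_level"] = max(params["fec_level"] - 1, 1)   # trim overhead
    return params
```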
In another case, a point of sale receiver expects both captured audio + captured imagery to process or complete a financial transaction. A virtual wallet can cause imagery to be cycled on its display, as discussed above. A high frequency audio signal is generated to cooperate with presented imagery. For example, presented imagery may include financial credit card or account information, and the transmitted high frequency audio signal may include an associated PIN for the financial information, an encryption key to decrypt the imagery payload, or an expected hash of the imagery payload. The transaction can be conditioned on verifying an expected
correspondence between the audio and video/imagery information, or upon successful decryption of the information using a key provided in one of the channels. In other cases, a video or image watermark signal includes a key, PIN or hash that is associated with an audio signal payload. The point of sale receiver may request, e.g., through a high frequency audio channel, that the virtual wallet transmit the corresponding audio message once the imagery is successfully received. Of course, a transmitted audio signal (including, e.g., the PIN, hash or key) may prompt a receiver to enable its camera to capture a to-be-presented display screen.
In another embodiment, HF audio is used to help ensure that communication is taking place between a point of sale terminal (POS) and a device within some given distance, say, e.g., 1-6 feet. Using a HF audio channel, the POS and a mobile device exchange public keys. The public keys help establish a secure protocol. But even with the exchange of keys, the POS terminal does not know how far away the mobile device is, so it might be possible to spoof one or the other of the mobile device or the POS.
To allow the POS to verify that the distance between the two is within some range, a ranging protocol test, including three or more HF audio messages, preferably takes place in the following manner (a POS-side code sketch follows the listed steps):
1. The POS transmits a PN code, encrypted with its private key. An example length could be, e.g., 128 bits.
2. The mobile device decrypts the PN code using the public key of the POS.
3. The POS transmits a different PN code, this time unencrypted, and of the same length as the previous PN code.
4. Upon receipt of the second PN code, the mobile device calculates the XOR (or dot product or other combination) of the two PN codes and transmits the result back to the POS.
5. The POS receives the XOR'ed values, verifies them, and also verifies that the time delay between POS transmit and POS receive of the last two messages is less than the time required for sound to travel the expected distance (e.g., round trip of 6 feet) plus some nominal processing time. Processing time can be minimized by transmitting in full duplex mode between the device and POS.
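A POS-side sketch of steps 1-5 follows. The HF audio transmit/receive and the private-key encryption are abstracted behind hypothetical callables (hf_send, hf_receive, encrypt_with_private_key); only the XOR check and the time-of-flight bound are shown.

```python
# Ranging check from the POS side: send an encrypted PN code, then a cleartext PN code,
# and accept only if the returned XOR matches and the round-trip delay is consistent
# with the expected distance (plus a nominal processing margin).
import os, time

SPEED_OF_SOUND_FT_S = 1125.0

def pos_ranging_check(hf_send, hf_receive, encrypt_with_private_key,
                      max_distance_ft=6.0, processing_margin_s=0.005):
    pn1 = os.urandom(16)                      # 128-bit PN code
    hf_send(encrypt_with_private_key(pn1))    # step 1: encrypted PN code
    pn2 = os.urandom(16)
    t0 = time.monotonic()
    hf_send(pn2)                              # step 3: second PN code, unencrypted
    reply = hf_receive()                      # step 4: device returns XOR of the two codes
    elapsed = time.monotonic() - t0
    expected = bytes(a ^ b for a, b in zip(pn1, pn2))
    max_delay = (2 * max_distance_ft) / SPEED_OF_SOUND_FT_S + processing_margin_s
    return reply == expected and elapsed <= max_delay   # step 5: value and distance bound
```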
By reversing the roles above, the device can also be assured that it is really
communicating with the nearby POS, and not a spoof node.
Another audio safeguard is to use directional speakers to convey the audio signal. For example, a parametric speaker uses ultrasonic carrier waves to transmit audio to listeners with a focused beam of sound. Since the beam is focused, only receivers in front of the parametric speaker can adequately detect transmitted audio. SoundLazer in the United States provides example speakers.
Transaction Pathways

There are many different communication pathways that can be used to facilitate a transaction. With reference to Fig. 17A, consider a point of sale terminal (POS) 180 and a mobile device 181. In a "forward" transaction - for example used with a self-checkout station in a grocery store - mobile device 181 communicates payload information to POS 180. This transaction may look like a conventional card payment transaction, e.g., as discussed in US Patent No. 8,099,368 (with reference to FIG. 1). We prefer to use, however, embedded image data or encoded audio data to communicate user payment information and additional data like date, time, geolocation, etc. A watermark payload for this transaction scenario is likely a relatively large payload, e.g., including account information, credit card information or a proxy of such (e.g., a 1-time token). The above payload encoding techniques (e.g., with erasure codes) can be effectively used in this forward transaction embodiment. The payload can be presented on a display of mobile device 181, communicated with a HF audio signal transmitted by mobile device 181, or a combination of both, e.g., as discussed above in the "Message Payloads and More" section. POS 180 receives the payload from mobile device 181 and communicates such to a transaction clearing house 182 (e.g., a credit card processor, card issuer, etc.). The clearing house determines whether the payment is authorized and returns an authorization or denial back to POS 180.
Self-checkout presents a unique problem if the shopper is purchasing age-restricted items (e.g., alcohol). A virtual wallet may include a virtual representation of an identification document (ID). In some cases the virtual representation includes age information that may be validated by the POS - or a service cooperating with the POS - to determine whether the shopper is of a certain age. For example, the age information may include a cryptographic signature or data blob that can be processed by the POS (or sent to a remote service for further processing) to determine or verify the shopper's age. If alcohol is scanned during checkout the shopper can be prompted to present their virtual ID. The ID can be selected by the shopper via the mobile device 181 user interface (e.g., swiping screens until the ID graphic is found). Once found or selected, the virtual ID can be displayed on the mobile device's display for presentation to the POS's camera. The virtual ID can communicate age information through digital watermarking embedded in a displayed image or in a graphical representation of a driver's license or other ID credential. In an alternative embodiment, ID information is conveyed through an audio signal, e.g., a HF audio signal. In addition to determining age, the POS or cooperating service can also verify whether the credential is authentic as well.
Another "forward" transaction involves a medium to smaller payload. For example, when communicating a payload including specific account information (e.g., like a retailer's stored value accounts, account no, loyal card, etc.). This transaction may even involve less sophisticated cameras, e.g., still cameras or low frame per second capture cameras. This information can be embedded, perhaps, in a single image frame or over a few frames.
Alternatively, an audio signal, e.g., HF audio, is used to communicate the payload. Once received by a POS, the transaction is processed by communicating the payload to a web service or network based processor. In some cases, the POS does not decode a received payload but merely communicates it along to the web service or processor for them to decode. The POS awaits authorization to allow the transaction.
While the above pathways have envisioned a POS terminal, the present techniques work well with peer-to-peer devices. For example, the POS terminal may be embodied in a mobile device, equipped with a camera and microphone.
A "reverse" transaction pathway is discussed with reference to 17B. There,
communication is from POS 180 to mobile device 181. A reverse pathway may be even more secure by preventing user information to be communicated to merchants. In one example, POS includes a display screen on which digital watermarked information is displayed during checkout. The digital watermark information may include, e.g., transaction identifier, checkout station, merchant/payee identifier, cost, and/or additional data such as date, time, geolocation, etc. Mobile device 181 captures imagery of the display with its camera and analyzes such to detect the hidden digital watermarking information. The digital watermarking information is decoded and communicated, preferably along with user selected account or payment information stored in her virtual wallet, to a remote 3 rd party who facilitates the transaction. For example, the
3 rd party verifies the shopper account or payment information and determines whether to authorize the transaction. The authorization/denial can be communicated directly back to the POS 180 from the third party, or an authorization token can be transmitted back to the mobile device, which communicates such to the POS. The POS can analyze the token, or call a service to analyze such for them, to verify the authentication.
The 3rd party may prompt the user to confirm the transaction. For example, the 3rd party may provide a verification clue (e.g., a user-preselected, arbitrary image) to the user to help ensure trust, provide the amount to be authorized (e.g., $88.17) and ask the user to click "yes" or "no" to authorize. The verification clue may have been selected or provided by the user during account registration. Receiving the verification clue from the 3rd party provides another level of security. Instead of clicking a UI graphic box, the user may shake the phone in a predetermined manner to authorize or decline the transaction. The mobile device's gyroscope provides relative movement for the virtual wallet to interpret.
In an alternative embodiment involving a reverse pathway, a static watermarked image or audio source can be located at checkout and scanned, or captured by microphone, by the mobile device to initiate payment in the cloud. The static watermarked image or audio may include information such as checkout station, merchant/payee identifier, retail location, etc. The mobile device decodes the digital watermarking to obtain the static information, combines this with user-selected account or payment information from the virtual wallet, and communicates the combined information to the 3rd party clearing house. The mobile device can also communicate a timestamp as well. The POS can also communicate the transaction amount, checkout station identifier, retailer identifier, etc., along with a timestamp, to the 3rd party clearing house. The 3rd party clearing house marries the POS information with the mobile device information (e.g., by matching up retailer identifiers and timestamps) and determines whether to authorize the transaction. As above, the mobile device can be prompted by the 3rd party to confirm payment or authorization. Once authorized, the 3rd party transmits an authorization code to the POS directly or through the mobile device.
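A minimal sketch of that matching step, with the timestamp tolerance and record field names being illustrative assumptions:

```python
# Pair POS records with mobile-wallet records by retailer identifier and nearby
# timestamps; each matched pair is a candidate transaction to authorize.
def match_records(pos_records, wallet_records, tolerance_s=30):
    matches = []
    for pos in pos_records:
        for wal in wallet_records:
            if (pos["retailer_id"] == wal["retailer_id"]
                    and abs(pos["timestamp"] - wal["timestamp"]) <= tolerance_s):
                matches.append((pos, wal))
    return matches
```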
We sometimes have used the terms "3rd party," "clearing house," and "3rd party clearing house" as entities that can help facilitate transactions. It should be realized that these terms may include one or more entities using multiple different and/or distributed systems. In some cases, the 3rd party may be owned or operated by the owner of the POS terminal.

Credential Login

A virtual wallet can include information to facilitate system or physical access. The system may include, e.g., a mobile device, laptop, desktop, web service, remote database or cloud processor, communications network, etc. Instead of typing in a password, a user can select a card (e.g., a graphic) from their virtual wallet. When displayed on a mobile device display, the selected card (or plural displayed versions of the card) includes digital watermarking hidden therein. The watermarking conveys information to facilitate system access. The system includes a camera which captures imagery corresponding to the mobile device display. Captured imagery is analyzed to decode the digital watermarking information. The information is compared to stored, expected information to determine whether to allow access. The information or a portion of the information may have a cryptographic relationship with the stored, expected information. The virtual wallet may generate or receive a 1-time token which is time dependent. This 1-time token can be analyzed by the system (which has access to a corresponding key or token) to determine whether to allow access.
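One possible realization of such a time-dependent 1-time token - an assumption for illustration, not a construction mandated by the foregoing - is an HMAC computed over the current time window using a key shared between the wallet and the system:

```python
# Time-dependent one-time token: HMAC-SHA256 over the current 30-second window,
# truncated to 8 bytes, carried in the watermark payload and recomputed by the
# system for comparison. Window length and truncation are illustrative choices.
import hmac, hashlib, time

def one_time_token(shared_key: bytes, window_s: int = 30) -> bytes:
    counter = int(time.time() // window_s)
    return hmac.new(shared_key, counter.to_bytes(8, "big"), hashlib.sha256).digest()[:8]

def system_accepts(shared_key: bytes, presented: bytes) -> bool:
    return hmac.compare_digest(presented, one_time_token(shared_key))
```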
The virtual wallet may prompt for user input prior to displaying a selected card. For example, the user may be prompted to swipe a finger on a mobile device's fingerprint reader, or show an eye to a camera for retina detection, or enter a password or PIN.
The virtual wallet may cause the home screen, background or locked screen on a mobile device to include the system access digital watermark information. For example, a virtual wallet may include a setting to embed such screens or backgrounds with digital watermarking. This will allow a user to show the screens without accessing the virtual wallet interface and scrolling through to find the access card representation.
Instead of embedding digital watermark information in imagery for display by the mobile device, a virtual wallet may cause a speaker to emit a HF audio signal which includes system access information. A system microphone captures the HF audio and the system analyzes such to decode the information therefrom. In some cases, system access requires a combination of both audio and imagery.

Visual Interfaces for Wearable Computers
The visual constructs provided above can also be utilized both in a wristwatch form- factor, and for users wearing glasses.
The paradigm of card selection can leverage the inherent properties of a watch form factor to facilitate selection. One implementation may consist of the user running a finger around the bezel (the device presumed to be circular for this example) to effect scrolling through the stack of cards. Simple motion of the watch may facilitate the same navigation by tilting the watch (e.g., rotation at the wrist). Payment would be facilitated the same way by showing the wearer's wrist watch to the cooperating device.
For users of headworn devices, such as the Google Glass product, the selection and validation process may occur through gaze tracking, blinking or any other known UI construct. Associated with the glasses would be a secondary digital device containing a display (a smartphone, a digitally connected watch such as the Pebble, or possibly a media player). The selected card would be rendered on the secondary device to complete the transaction as before. Alternatively, a portable user device can project a display, for sensing by the POS system.
Capturing imagery with eyewear may have additional benefits. For example, when capturing imagery of a point of sale (POS) display (e.g., at a reverse pathway checkout station mentioned above) a user may place a finger or subset of fingers in the eyewear's field of view. The camera captures the fingers (including the fingerprints) in the same image frames when it captures the display. A virtual wallet or a processor in communication with a virtual wallet may process captured imagery. Digital watermarks are decoded from the imagery corresponding to the display, and human fingerprint recognition is used to determine if the fingerprints correspond to an owner or authorized user of the virtual wallet. Authorization of a transaction can be conditioned on a successful biometric match.
In some authentication or transaction embodiments, imagery is only captured (or is only used for the authentication or transaction) when a finger(s) or fingerprint(s) is detected in the field of view. This ensures that captured (or used) imagery will include a fingerprint for analysis. Object recognition can analyze image data to detect the presence of a finger, so that only those image frames that include one are collected.
Visual Tallies Fig. 11 shows an arrangement in which a checkout tally is presented on the user's smartphone as items are identified and priced by a point of sale terminal. In this embodiment, a user "signs" the touchscreen with a finger to signify approval.
A signature is technically not required for most payment card transactions, but there are advantages to obtaining a user's signature approving a charge. For example, some transaction networks charge lower fees if the user's express affirmance is collected. A finger-on-touchscreen signature lacks the fidelity of a pen-on-paper signature, but can still be distinctive. As part of a process of registering cards in a virtual wallet, a user's touchscreen signature can be collected. This signature, or its characterizing features, can be sent to one or more of the parties in the transaction authorization process shown in Fig. 5, who can use this initial signature data as reference information against which to judge signatures collected in subsequent transactions.
Alternatives to signatures can include finger or facial biometrics, such as a thumbprint on the user's screen, capture of the user's face using camera functions, or a voiceprint, etc.
In the prior art, POS receipts detail items purchased in the order they are presented at checkout - which is perhaps the least useful order. An excerpt from such a receipt is shown in Fig. 12A. In accordance with a further aspect of the present technology, user preference information is stored in the phone and identifies the order in which items should be listed for that user.
Fig. 12B shows an alphabetical listing - permitting the user to quickly identify an item in the list. Fig. 12C shows items listed by price - with the most expensive items topping the list, so that the user can quickly see where most of the money is being spent.
Fig. 12D breaks down the purchased items by reference to stored list data. This list can be a listing of target foods that the user wants to include in a diet (e.g., foods in the
Mediterranean diet), or it can be a shopping list that identifies items the user intended to purchase. The first part of the Fig. 12D tally identifies items that are purchased from the list. The second part of the tally identifies items on the list that were not purchased. (Some stores may provide "runners" who go out to the shelves to fetch an item forgotten by the shopper, so that it can be added to the purchased items before leaving the store.) The third part of the Fig. 12D tally identifies items that were purchased but not on the list (e.g., impulse purchases).
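The breakdown of Fig. 12D can be produced with simple set operations over the stored list and the purchased items, as in the following Python sketch (item names and prices are illustrative):

shopping_list = {"olive oil", "tomatoes", "salmon", "spinach"}
purchased = {"tomatoes": 2.49, "salmon": 9.99, "candy bar": 1.29}

from_list = {k: v for k, v in purchased.items() if k in shopping_list}      # part 1
not_bought = shopping_list - set(purchased)                                 # part 2
off_list = {k: v for k, v in purchased.items() if k not in shopping_list}   # part 3 (impulse)

print("Purchased from list:", from_list)
print("On list, not purchased:", sorted(not_bought))
print("Not on list (impulse):", off_list)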
Breakdown of purchased items in this fashion may help the user reduce impulse purchases. Image-Based Authentication
An additional layer of security in mobile payment systems can make use of imagery, e.g., captured by the smartphone.
Figs. 13A - 13C illustrate one such arrangement, used to further secure an American Express card transaction. The detailed arrangement is akin to the SiteKey system, marketed by RSA Data Security.
In particular, after the user selects the American Express virtual card from the
smartphone wallet, the phone sends related data to a cooperating system (which may be in data communication with American Express or RSA). Once the user/device/card is identified by such sent data, the cooperating system provides a challenge corresponding to that user/device/card for presentation on the phone screen. This challenge includes an image and a SiteKey phrase. In Fig. 13A the image is an excerpt of a quilt image, and the SiteKey is the name Mary Ann. Unlike the SiteKey system, however, the image is drawn from the user's own photo collection, stored on the smartphone that is now engaged in the authentication process. (In the present case, the user may have snapped a picture of the quilt while visiting a gift shop on vacation.) User-selection of one of the user's own images enables the user to select a SiteKey phrase that has some semantic relationship to the image (e.g., the user may have been with a friend Mary Ann when visiting the shop where the quilt was photographed).
The user verifies that the quilt image and the SiteKey word are as expected (to protect against phishing), and then is prompted to enter a Descriptor corresponding to the image. In the present case the Descriptor is the word Napa. (Again, this word may be semantically related to the displayed image and/or the SiteKey. For example, it may have been during a vacation trip to Napa, California, that the user and Mary Ann visited the shop where the quilt was photographed.) A cryptographic hash of the user-entered Descriptor is computed by the smartphone, and transmitted to the cooperating system for matching against reference Descriptor data earlier stored for that user's American Express account. If they match, a message is sent to the smartphone, causing it next to solicit the user's signature, as shown in Fig. 13C. (As in Fig. 11, the signature screen may also include a tally of the items being purchased, or other transaction summary.) After entry of the user's signature or other biometric indicia (and, optionally, checking of signature features against stored data), the transaction proceeds. In addition, or alternatively, the user's image or a user selected image may appear on the merchant's terminal screen permitting a challenge response verification of identity by the store clerk. A facial image can be manually checked and/or compared using facial biometrics algorithms.
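The Descriptor check can be as simple as the following Python sketch, in which only a salted hash of the Descriptor leaves the phone and is compared against the reference hash stored at enrollment (the particular salting and normalization shown are assumptions):

import hashlib, hmac

def descriptor_hash(descriptor: str, salt: bytes) -> str:
    # Normalize, then hash; only this digest is transmitted.
    return hashlib.sha256(salt + descriptor.strip().lower().encode()).hexdigest()

salt = b"per-account-salt"                      # stored with the account record
reference = descriptor_hash("Napa", salt)       # computed at enrollment

candidate = descriptor_hash("napa", salt)       # computed on the phone at transaction time
print("descriptor ok" if hmac.compare_digest(candidate, reference) else "mismatch")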
Another challenge-response security system employs information harvested from one or more social network accounts of the user, rather than from the phone's image collection. For example, a user can be quizzed to name social network friends - information that may be protected from public inspection, but which was used in an enrollment phase. At both the enrollment phase, and in later use, the actual friends' names are not sent from the phone.
Instead, hashed data is used to permit the remote system to determine whether a user response (which may be selected from among several dummy data, as above) is a correct one.
Still other information that can be used in challenge-response checks is detailed in published application US 2012-0123959 A1.
Figs. 14 and 15 show a different authentication procedure. In this arrangement a challenge image 141 is presented, and the user is instructed to tap one of plural candidate images to identify one that is related to the challenge image. The correct, corresponding, image (142a in this case) is selected from the user's own collection of smartphone pictures (e.g., in the phone's Camera Roll data structure), as is the challenge image 141. If the user does not pick the correct candidate image from the presented array of images, the transaction is refused.
Fig. 15 details a preceding, enrollment, phase of operation, in which images are initially selected. The user is instructed to pick one image from among those stored on the phone. This user-picked image 141 is used as the reference image, and a copy of this image is sent to a cooperating system (e.g., at a bank or RSA Security). The user is next instructed to pick several other images that are related to the reference image in some fashion. (For example, all of the picked images may have been captured during a particular vacation trip.) These latter images are not sent from the phone, but instead derivative data is sent, from which these pictures cannot be viewed.
In the illustrated example, the user selects images taken during the vacation to Napa. An image of the quilt, photographed in the gift shop, is selected by the user as the reference image 141. This picture is a good choice because it does not reveal private information of the user (e.g., it does not depict any family members, and it does not reveal any location information that might be sensitive), so the user is comfortable sharing the image with an authentication service. The user then picks several other images taken during the same trip for use as related, matching images. In Fig. 15, the user-picked related images are indicated by a bold border. One shows two figures walking along a railroad track. Another shows a palm tree in front of a house.
Another shows plates of food on a restaurant table. Another shows red tomatoes arrayed along a counter. All are related by common geography and time interval (i.e., a vacation to Napa).
For the user-picked related images, no copies are sent from the phone. Instead, software in the phone derives image feature information. This image feature information may comprise, e.g., an image hash, or fingerprint, or color or texture or feature histograms, or information about dominant shapes and edges (e.g., content-based image descriptors of the sort commonly used by content-based image retrieval (CBIR) systems), etc. This derived information is sent from the phone for storage at the authentication service, together with identifying information by which each such related image can be located on the user's smartphone. (E.g., file name, image date/time, check-sum, and/or image file size.)
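For concreteness, the following Python sketch derives two such compact features, a difference hash and a coarse color histogram, using the Pillow imaging library; the file name is illustrative, and these particular features are only examples of the kinds of derived data that could be sent.

from PIL import Image

def dhash(path, size=8):
    # 64-bit "difference hash": compare brightness of horizontally adjacent pixels.
    img = Image.open(path).convert("L").resize((size + 1, size))
    px = list(img.getdata())
    bits = ""
    for row in range(size):
        for col in range(size):
            bits += "1" if px[row * (size + 1) + col] > px[row * (size + 1) + col + 1] else "0"
    return "%016x" % int(bits, 2)

def coarse_histogram(path, bins=4):
    # 64-bin RGB histogram, a crude stand-in for CBIR color descriptors.
    img = Image.open(path).convert("RGB").resize((64, 64))
    hist = [0] * (bins ** 3)
    for r, g, b in img.getdata():
        hist[(r * bins // 256) * bins * bins + (g * bins // 256) * bins + (b * bins // 256)] += 1
    return hist

features = {"dhash": dhash("tomatoes.jpg"), "hist": coarse_histogram("tomatoes.jpg")}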
Returning to Fig. 14, when authentication is required (e.g., after a user/device/card has been identified for a transaction), the remote system sends the reference image 141 for display on the smartphone. The remote system also sends identifying information for one of the several related images identified by the user (e.g., for the picture of the tomatoes on the counter). The remote system also sends several dummy images.
The smartphone uses the identifying information (e.g., the image name) to search for the corresponding related image in the smartphone memory. The phone next presents this image (142a), together with the dummy images received from the authentication service (142b, 142c,
142d), on the phone display. The user is then invited to pick one of the plural candidate images
142 that is related to the reference picture 141.
The user's choice is compared against the correct answer. For example, the remote system may have instructed the smartphone to present the matching image (recalled from the phone's memory, based on the identification data) in the upper left position of the array of pictures. The phone then reports to the remote system the location, in the array of candidate pictures, touched by the user. If that touch is not in the upper left position, then the remote system judges the authentication test as failed.
In other arrangements, the location of the user's tap is not reported to the remote system. Instead, the smartphone computes derived information from the image tapped by the user, and this information is sent to the remote system. The remote system compares this information with the derived information earlier received for the matching (tomatoes) image. If they do not correspond, the test is failed.
In still other arrangements, the pass/fail decision is made by the smartphone, based on its knowledge of placement of the matching image.
Although not evident from the black and white reproduction of Fig. 14, each of the candidate images 142a - 142d is similar in color and structure. In particular, each of these images has a large area of red that passes through the center of the frame, angling up from the lower left. (That is, the roadster car is red, the notebook is red, and the ribbon bow is red.) This is possible because, in the illustrated embodiment, the derived information sent from the phone during the enrollment phase included color and shape parameters that characterized the matching images selected by the user. In selecting dummy images, the remote system searched for other images with similar color/shape characteristics.
This feature is important when the reference image and the matching images are thematically related. For example, if the user-selected reference and matching photos are from a camping trip and all show wilderness scenes, then a matching photo of a mountain taken by the user might be paired with dummy photos of mountains located by CBIR techniques. By such arrangement, the thematic relationship between a matching image and the reference image does not give a clue as to which of the candidate images 142 is the correct selection.
In the Fig. 14 example, the tomatoes photo was used as the matching image. The next time authentication is required, another one of the matching images earlier identified by the user can be used (e.g., the photo of a palm tree in front of a house).
It will be recognized that only the true user will be able to discern a relationship between the reference image 141, and one of the displayed candidate images 142, because only the true user knows the context that they share. Moreover, this authentication technique relies on images captured by the user, rather than "canned" imagery, as employed in the prior art.
Card Standards, Etc. Conventional magstripe credit cards conform to ISO standards 7810, 7811 and 7813, which define the physical and data standards for such cards. Typically, the data on the magstripe includes an account number, an owner name, a country code, and a card expiration date.
"Chip cards" include a chip - typically including a processor and a memory. The memory stores the just-listed information, but in encrypted form. The card employs a variety of common digital security techniques to deter attack, including encryption, challenge-response protocols, digital signatures, etc. Entry of a user's PIN is required for most transactions. Again, an ISO standard (7816) particularly defines the card requirements, and a widely used
implementation follows the EMV (EuroPay/MasterCard/Visa) standard. (An updated version of EMV, termed EMV Lite, is being promoted by Morpho Cards, GmbH.)
Artisans commonly speak of "static" and "dynamic" authentication methods.
"Static" authentication methods build on those known from magnetic stripe cards. In static authentication, information is conveyed uni-directionally, i.e., from the card, possibly through an intermediary (e.g., a POS system) to a testing system (e.g., a card issuer). Static techniques can employ digital signatures, public-private keys, etc. For example, the user's name may be hashed, digitally signed with a private key associated with the system (or issuer), and the results stored in a chip card for transmission to the POS system. The POS system receives this encrypted data from the card, together with the user name (in the clear). It applies the corresponding public key to decrypt the former, and compares this with a hash of the latter.
The present technology can be employed in systems using such known static
authentication, without any system alterations. Moreover, the present technology affords protection against replay attacks (e.g., through context-based techniques) - a liability to which conventional static authentication techniques are susceptible.
The more sophisticated authentication technique is so-called "dynamic authentication." This involves a back-and-forth between the payment credential and the testing system, and may comprise challenge-response methods.
With chip cards, the card-side of the transaction is conducted by the chip, for which the POS terminal commonly has a two-way dedicated interface. But the smartphone screen used in embodiments of the present technology - which optically provides information to the cooperating system - cannot reciprocate and receive information from that system. Nonetheless, the present technology is also suitable for use with dynamic authentication methods. The communication back from the system to the smartphone can be via signaling channels such as radio (NFC communication, WiFi, Zigbee, cellular) or audio. Optical signaling can also be employed, e.g., a POS terminal can be equipped with an LED of a known spectral characteristic, which it controllably operates to convey data to the phone, which may be positioned (e.g., laying on a checkout conveyor) so that the phone camera receives optical signaling from this LED.
Many chip-card dynamic authentication methods rely on key data stored securely in the chip. The same secure methods can be implemented in the smartphone. (Many Android phones already include this, to support the Google Wallet and similar technologies.) For example, the RSA secure architecture for SIM (microSD) cards or NFC chips, employing a tamper resistant Secure Element (SE) and a single wire protocol (SWP), can be used. The keys and other data stored in such arrangement can be accessed only via encrypted protocols.
In one particular implementation, the keys are accessed from the SE in the smartphone, and employed in a static authentication transaction (e.g., with information optically conveyed from the smartphone screen). The remote system may respond to the phone (e.g., by radio) with a request to engage in a dynamic authentication, in which case the smartphone processor (or the SE) can respond in the required back-and-forth manner.
In other arrangements, the key data and other secure information is stored in conventional smartphone memory - encrypted by the user's private key. A cloud resource (e.g., the card issuer) has the user's public key, permitting it to access this secure information. The POS system can delegate the parts of the transaction requiring this information to the issuing bank, based on bank-identifying information stored in the clear in the smartphone and provided to the POS system.
As noted, while chip cards are appealing in some aspects, they are disadvantageous because they often require merchants to purchase specialized reader terminals that have the physical capability to probe the small electrical contacts on the face of such cards. Moreover, from a user standpoint, the card is typically stored in an insecure container - a wallet. In the event a card is stolen, the only remaining security is a PIN number.
As is evident from the foregoing, embodiments of the present technology can employ the standards established for chip card systems and gain those associated benefits, while providing additional advantages such as cost savings (no specialized reader infrastructure required) and added security (the smartphone can provide many layers of security in addition to a PIN to address theft or loss of the phone).
The artisan implementing the present technology is presumed to be familiar with magstripe and chip card systems; the foregoing is just a brief review. Additional information is found, e.g., in the text by Rankl et al, Smart Card Handbook, 4th Ed., Wiley, 2010, and in the white paper, "Card Payments Roadmap in the United States: How Will EMV Impact the Future Payments Infrastructure?," Smart Card Alliance, Publication PC-12001, January, 2013. Notifications and Transaction Receipts, etc.:
A virtual wallet can facilitate receipt transmission and management. As part of a transaction checkout, the virtual wallet may request a receipt to be added to or accessible by the wallet - perhaps stored locally on the user device and/or in the cloud associated with a user or device account. For example, the virtual wallet communicates an account identifier, device ID or address to a participating terminal or vendor. In response, the terminal or vendor forwards the transaction receipt to the account, device or address. The user may be prompted through a UI provided by the virtual wallet to add searchable metadata about the transaction or receipt (e.g., warranty information). In other cases, searchable metadata is collected by the virtual wallet itself in addition to or without user intervention. Searchable metadata may be collected, e.g., by accessing and using transaction time, retailer name and location, items purchased, retention information, OCR-produced data if the receipt is in image form or .pdf format, etc. In some cases the receipt can be provided by the retailer with searchable text (e.g., in an XML file), e.g., including items purchased, return information, warranty information, store location and hours, price, etc. Searchable text can be indexed to facilitate rapid future searching. The receipt is accessible through the virtual wallet, e.g., by a user selecting a UI-provided icon next to a corresponding transaction.
The virtual wallet preferably provides a UI through which receipts and other transaction information may be searched. The user inputs information, e.g., types information or selects categories, products, retailers from scrollable lists, via the search UI. After a search is launched, corresponding receipt search results are represented on the display for review by the user. We mentioned above that receipts can be marked for retention. This is helpful, e.g., for items under warranty. Retention information can be used by the wallet to help expire receipts and other transaction information. For example, a user purchases a TV at Wal-Mart and a receipt is delivered for access by the virtual wallet. (In some cases the virtual wallet may receive a notification that a receipt is available for retrieval, and access a remote location to obtain receipt information.) Metadata is entered or accessed for the receipt and retention data is indexed or stored in an expiration table or calendar. The virtual wallet uses the expiration table or calendar to expire receipts no longer deemed important or needed. The term "expire" in this context may include deleting the receipt, deleting metadata associated with the receipt, and/or updating any remote storage of such.
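One simple realization of the expiration table is sketched below in Python; the field names, retention periods, and keep-or-expire decision are illustrative assumptions.

from datetime import date, timedelta

receipts = [
    {"id": "walmart-tv", "purchased": date(2014, 1, 10), "retention_days": 365},
    {"id": "coffee",     "purchased": date(2014, 2, 1),  "retention_days": 30},
]

def expire(receipts, today=None):
    # Partition receipts into those still retained and those past their retention date.
    today = today or date.today()
    keep, expired = [], []
    for r in receipts:
        if r["purchased"] + timedelta(days=r["retention_days"]) < today:
            expired.append(r)   # delete receipt/metadata and update any remote copy
        else:
            keep.append(r)
    return keep, expired

keep, expired = expire(receipts, today=date(2014, 3, 15))
print([r["id"] for r in expired])   # -> ['coffee']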
Retention data can be augmented with any auction related information. For example, we mentioned above that a certain financial bidder may offer an extended warranty if a transaction is made using their account or service. Such a warranty extension may be added to the retention information so a receipt is not prematurely expired.
Receipts and the metadata associated with such can be updated to reflect returns or refunds.
The searchable metadata may also include notification information. For example, a user may be on the fence about whether to keep the latest electronic gizmo purchased on a whim last week. In this case the user has 15 days (or another period, according to the store's return policy) to return the item. Notification information can be stored and calendared for use by the virtual wallet (or a cooperating module) to send the user a reminder, e.g., via email, SMS or a notification pop-up displayed via a UI, so that the 15-day window doesn't come and go without notice.
Notifications need not be limited to receipts and warranty information. The virtual wallet may manage and provide many different types of notifications. For example, bill-payment due dates, account balances, credit limits, offers, promotions and advertising are just a few examples of such. Push-messages may be generated for urgent items, in addition to some type of visual cue or icon within the virtual wallet indicating that the user's attention is needed. For example, a particular card or account in FIG. 3A may have a notification associated with it. (E.g., the user may have forgotten to authorize a monthly payment by its due date.) The depicted card may jiggle, glow, shimmer, flash, strobe and/or break into an animated dance when the virtual wallet is accessed. This type of notification will visually alert the user to investigate the card further, and upon accessing such (e.g., by double tapping the animated card) the notification can be further displayed.
Medical and insurance information may also be stored and managed in a virtual wallet. In addition to a health insurance card, users have car insurance card(s), Medicare card(s), an Intraocular Lens card, and a Vaccess Port card, etc. Unlike bank cards, some of this info is preferably accessible without unlocking a mobile device that is hosting the virtual wallet, e.g., because if a user needs emergency medical care, they may not be conscious to unlock the device. Access to such emergency medical information may be accomplished by adding an Emergency Medical button to a device's unlock screen, similar to the Emergency Call button. A user can determine which information they want to provide access to via an Emergency Medical button through an operating system's settings screen or an access user interface associated with the virtual wallet. In another embodiment, emergency responders have an RFID card, NFC device or a digitally watermarked card that can be sensed by the mobile device to trigger unlocking the screen of a mobile device. In other cases, desired medical or insurance information is available on an initial splash screen, even if the phone is locked, and without needing to access an Emergency Medical button.
Of course, some or all the information hosted by the virtual wallet can be stored in the cloud or at a remote location so that it is accessible from various user devices that are programmed with the virtual wallet (e.g., a virtual wallet app) or that cooperate with the virtual wallet, and through which a user's identity is authenticated.
Game Consoles and Physical Sales of Virtual Items:
Another device on which a virtual wallet can operate is a game console. Examples of gaming platforms include Microsoft's Xbox 360, Sony's PlayStation, Nintendo's DS and Wii, Kyko PlayCube, OnLive's MicroConsole (a cloud-based gaming console), etc.
One advantage of coupling a virtual wallet to a game console is the ability to monetize and transfer virtual items. Consider the following: after a long night of gaming a user finally wins a rare virtual prize, e.g., a unique power, token, provisions, code, level access, spell or weapon. The virtual prize can be stored or accessed within the user's virtual wallet. For example, the prize may be represented by an XML file, an access code, a cryptographic code, software code, or a pointer to such. The virtual wallet can facilitate the on-line sale or transfer (e.g., via eBay) of the virtual prize for real money or credit. The wallet may include a virtual prize directory, folder or screen. An eBay (or sell) icon may be displayed next to the virtual prize to allow a user to initiate a transfer, auction or sale of the virtual prize. Selecting the icon initiates an offer to sell, and prompts the virtual wallet to manage the interaction with eBay, e.g., by populating required For Sale fields gathered from the virtual prize's metadata, or prompting the user to insert additional information. (The virtual wallet can access an eBay API or mobile interface to seamlessly transfer such data.)
Upon a successful sale, the virtual wallet can be used to transfer the virtual prize to the winning purchaser using the techniques (e.g., purchase) discussed in this document.
Anonymous trust; Pick-Pocketing; and Security:
A virtual wallet may also provide an indication of trust. A user may accumulate different trust indicators as they forage online, participate in transactions and interact in society. For example, a user may receive feedback or peer reviews after they participate in an online transaction, auction or in a retail store. Another trust indicator may be a verification of age, residency and/or address. Still another trust indicator may be a criminal background check performed by a trusted third party. The virtual wallet may aggregate such indicators from a plurality of different sources to determine a composite trust score for the user. This trust score can be provided to potential bidders in a financial auction as a factor in deciding whether to offer a bid, and the content of such. The trust score can also be provided as the user interacts through social media sites.
In some cases, the trust score is anonymous. That is, it provides information about a user without disclosing the user's identity. A user can then interact online in an anonymous manner but still convey an indication of their trustworthiness, e.g., the virtual wallet can verify to others that a user is not a 53 year old pedophile, while still protecting their anonymity.
To help prevent digital pickpocketing, a virtual wallet may be tethered (e.g., via a cryptographic relationship) to device hardware. For example, a mobile device may include a SIM card identifier, or may include other hardware information, which can be used as a device identifier. A virtual wallet may anchor cards within the wallet to the device identifier(s) and, prior to use of a card - or the wallet itself - check the device identifier(s) from the device against the device identifier(s) in the virtual wallet. The identifiers should correspond in a predetermined manner (e.g., a cryptographic relationship) before the virtual wallet allows a transaction. This will help prevent a wallet from being copied to a device that is not associated with the user. (Of course, a user may authorize a plurality of different devices to cooperate with their virtual wallet, and store device identifiers for each.)
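The tethering check might be implemented along the following lines (a Python sketch; the identifier source, wallet secret and HMAC construction are assumptions used only to illustrate the predetermined correspondence):

import hmac, hashlib

def tether_tag(device_id: str, wallet_secret: bytes) -> str:
    return hmac.new(wallet_secret, device_id.encode(), hashlib.sha256).hexdigest()

wallet_secret = b"secret-provisioned-at-enrollment"
stored_tag = tether_tag("IMEI-356938035643809", wallet_secret)   # anchored in the wallet

def wallet_may_run(live_device_id: str) -> bool:
    # Recompute the tag from the live device identifier and compare before any transaction.
    return hmac.compare_digest(tether_tag(live_device_id, wallet_secret), stored_tag)

print(wallet_may_run("IMEI-356938035643809"))   # True on the enrolled device
print(wallet_may_run("IMEI-000000000000000"))   # False if the wallet was copied elsewhere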
In some cases, a virtual wallet may send out a notification (e.g., to the user, credit reporting agency, or law enforcement) if the virtual wallet detects unauthorized use like use of the wallet on an unauthorized device.
In other cases, the virtual wallet gathers information associated with a user's patterns and purchases. After building a baseline, it can notify a user, financial vendor or others when it detects activity that looks out of character (e.g., suspected as fraud) relative to the baseline. For example, the baseline may reflect a geographic component (e.g., North America) and if spending is detected outside of this component (e.g., in Europe) then a notification can be generated and sent. The baseline may also access or incorporate other information to help guide its decision making. For example, the virtual wallet may access a user's online or locally stored calendar and determine that the user is traveling in Europe on vacation. So the geographical component is expanded during the vacation time period and a notification is not sent when European spending is detected.
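The calendar-aware baseline check could look like the following Python sketch; the regions, trip records and alerting policy are illustrative assumptions.

from datetime import date

baseline_regions = {"North America"}
calendar = [{"region": "Europe", "start": date(2014, 6, 1), "end": date(2014, 6, 14)}]

def allowed_regions(on_date):
    # Baseline regions, expanded by any calendared travel covering the date.
    regions = set(baseline_regions)
    for trip in calendar:
        if trip["start"] <= on_date <= trip["end"]:
            regions.add(trip["region"])
    return regions

def check(transaction):
    if transaction["region"] not in allowed_regions(transaction["date"]):
        print("notify user: out-of-character spending in", transaction["region"])
    else:
        print("transaction consistent with baseline/calendar")

check({"region": "Europe", "date": date(2014, 6, 5)})   # travel on calendar: no alert
check({"region": "Europe", "date": date(2014, 9, 5)})   # no travel: notification sent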
Combinations
Some combinations supported by this disclosure include the following. Of course, the following is nowhere near an exhaustive listing, since there are many, many other combinations that will be readily apparent from the above written description. We expressly reserve our right to file continuation and divisional applications (and to amend claims) to include combinations and features presented below. Of course, some continuation and divisional applications may include other combinations that are apparent from this specification as well.
A1. A method employing a user's portable device, the device including a display, one or more processors and a sensor, the method including acts of:
receiving information from the sensor, the information corresponding to a positioning or relative movement of the portable device; using the one or more processors, and based at least in part on the information, changing a digital watermark embedding process;
using the one or more processors, embedding a digital watermark in imagery using the changed digital watermark embedding process;
providing the embedded imagery for display.
A2. The method of A1 in which the sensor comprises a gyroscope.
A3. The method of A1 in which said changing a digital watermark embedding process comprises changing a relative embedding strength.
B1. A portable device comprising:
a touch screen display;
a sensor to obtain information corresponding to a positioning or relative movement of the portable device;
memory storing an image; and
one or more processors configured for:
changing a digital watermark embedding process based on information obtained by said sensor;
embedding a digital watermark in the image using the changed digital watermark embedding process;
controlling display of the embedded image on the touch screen display.
B2. The portable device of B1 in which the sensor comprises a gyroscope.
B3. The portable device of B1 in which the changing a digital watermark embedding process comprises changing a relative embedding strength.
C1. A portable device comprising:
a touch screen display;
a microphone for capturing ambient audio; memory for storing audio identifiers or information obtained from audio identifiers; and one or more processors configured for:
causing the portable device to operate in a background audio collection mode, in which during the mode audio is captured by the microphone without user involvement; processing audio captured in the background audio collection mode to yield one or more audio identifiers;
storing the one or more audio identifiers or information obtained from the one or more identifiers in said memory;
upon encountering a transmission from a signaling source, determining if the one or more audio identifiers or if the information obtained from the one or more identifiers stored in memory corresponds to the transmission;
taking an action if there is a correspondence.
C2. The portable device of C1 in which the signaling source comprises an iBeacon or Bluetooth transmitter.
C3. The portable device of C2 in which the information obtained from the one or more audio identifiers comprises a discount code or coupon, and in which the action comprises applying the discount code or coupon to a financial transaction involving the portable device.
C4. The portable device of C1 in which the processing audio comprises extracting fingerprints from the audio.
C5. The portable device of C1 in which the processing audio comprises decoding digital watermarking hidden in the audio.
C6. The portable device of C1 in which the action comprises prompting the user via a message displayed on the touch screen display. D1. A system comprising:
a portable device comprising: one or more processors, a high frequency audio transmitter and receiver, and a virtual wallet stored in memory, the virtual wallet comprising financial information;
a retail station comprising: one or more processors, a high frequency audio transmitter and receiver;
in which the virtual wallet configures the one or more processors of the portable device to transmit a known high frequency audio message, the message being known to both the virtual wallet and to the retail station;
in which the one or more processors of the retail station are configured to determine errors associated with the known high frequency audio message and cause an error message to be communicated to the virtual wallet;
and in which the virtual wallet, upon receipt of the error message, configures said one or more processors to transmit the financial information with a high frequency audio signal adapted according to the error message.
E1. A portable device comprising:
a touch screen display;
a microphone for capturing ambient audio;
memory for storing an image; and
one or more processors configured for:
generating copies of the stored image;
obtaining a payload corresponding to financial information;
providing the payload to an erasure code generator, in which the erasure code generator produces a plurality of outputs;
embedding one of the plurality of outputs in a copy of the stored image and proceeding with embedding until each of the plurality of outputs is so embedded in a copy of the stored image, in which the embedding utilizes digital watermarking;
causing the touch screen display to display embedded image copies so as to cause a static image display effect, the displayed embedded image copies being displayed by the portable device in response to a user input to enable a financial transaction.
E2. The portable device of E1 in which the obtaining comprises generating the payload based on user input and on the financial information.
E3. The portable device of E1 in which said one or more processors are configured to operate as the erasure code generator, and in which the erasure code generator comprises a fountain code generator, in which the fountain code generator produces the plurality of outputs, from which a receiver can reassemble the payload by obtaining a subset of the plurality of outputs, the subset being less than the plurality of outputs.
E4. The portable device of E1 in which only one output of the plurality of outputs is embedded in any one image copy.
E5. The portable device of E1 in which said one or more processors are configured for: i) generating a perceptibility map of the image, ii) storing the perceptibility map in said memory, and iii) reusing the perceptibility map when embedding the plurality of outputs in corresponding image copies.
E6. The portable device of E1 further comprising an audio transmitter, in which said one or more processors are configured to cause said audio transmitter to transmit an audio signal corresponding to the financial information. E7. The portable device of E6 in which said audio transmitter comprises a high frequency audio transmitter.
E8. The portable device of E6 in which the audio signal comprises a pin, key or hash. E9. The portable device of E1 in which the plurality of outputs comprises a subset of a total number of outputs provided by the erasure code generator. E10. The portable device of E1 in which said one or more processors are configured to interpret the user input which is received via said touch screen display.
E11. The portable device of E1 in which said one or more processors are configured to cause the embedded image copies to be displayed so that a digital watermark reader analyzing captured image data representing the display can recover the payload.
F1. A method employing a user's portable device, the device including a touch screen display, one or more processors and a sensor, the method including acts of:
obtaining a payload corresponding to financial information;
providing the payload to an erasure code generator, in which the erasure code generator produces a plurality of outputs;
generating copies of an image;
embedding one of the plurality of outputs in a copy of the stored image and proceeding with embedding until each of the plurality of outputs is so embedded in a copy of the stored image, in which the embedding utilizes digital watermarking;
causing the touch screen display to display embedded image copies so as to cause a static image display effect, the displayed embedded image copies being displayed by the portable device in response to a user input to enable a financial transaction.
F2. The method of F1 in which the obtaining a payload comprises generating the payload based on user input and on the financial information.
F3. The method of F1 further comprising: causing the erasure code generator to produce the plurality of outputs, in which the erasure code generator comprises a fountain code generator, in which the fountain code generator produces the plurality of outputs, from which a receiver can reassemble the payload by obtaining a subset of the plurality of outputs, the subset being less than the plurality of outputs. F4. The method of F1 in which only one output of the plurality of outputs is embedded in any one image copy. F5. The method of F1 further comprising: i) generating a perceptibility map of the image, ii) storing the perceptibility map in memory, and iii) reusing the perceptibility map when embedding the plurality of outputs in image copies.
F6. The method of F1, in which the portable device comprises an audio transmitter, said method further comprising transmitting an audio signal corresponding to the financial information, said transmitting using the audio transmitter.
F7. The method of F6 in which the audio transmitter comprises a high frequency audio transmitter.
F8. The method of F6 in which the audio signal comprises a pin, key or hash.
F9. The method of F1 in which the plurality of outputs comprises a subset of a total number of outputs provided by the erasure code generator.
F10. The method of F1 further comprising interpreting the user input which is received via said touch screen display.
F11. The method of F1 in which the embedded image copies are displayed so that a digital watermark reader analyzing captured image data representing the display can recover the payload.
G1. A method employing a user's portable device, the device including a touch screen display, one or more processors and a sensor, the method including acts of:
obtaining a payload;
processing the payload with an erasure code generator, in which the erasure code generator produces a plurality of outputs corresponding to the payload;
obtaining portions of imagery; embedding one of the plurality of outputs in a portion of the imagery and proceeding with embedding until each of the plurality of outputs is so embedded in a portion of the imagery, in which the embedding utilizes digital watermarking;
causing the touch screen display to display embedded portions of imagery so that a digital watermark reader analyzing captured image data representing the display can recover the payload.
G2. The method of G1 in which the portions comprise video frames or copies of an image.
G3. The method of G1 in which the erasure code generator comprises a fountain code generator.
H1. A method employing a user's portable device, the device including a display and a sensor module, the method including the acts:
presenting a payment user interface using the display, the user interface identifying plural virtual wallet cards including plural payment service cards, said payment service cards representing plural possible payment services including at least one of American Express, VISA and MasterCard, the user interface enabling a user to select a desired one of said payment services for issuing a payment;
generating context-based authentication data, the authentication data depending in part on data from the device sensor module;
presenting artwork using the display, the artwork indicating the selected payment service and including a logo for American Express, Visa, or MasterCard; and
providing information, including both the context-based authentication data and data corresponding to the selected payment service, from the device to a cooperating system, in connection with issuing a payment;
wherein:
the logo in the presented artwork confirms to the user that the desired payment service has been selected for a payment; and the method enables the user to issue payments using a user-selected one of said plural payment services, without requiring the user to carry plural physical cards for said payment services.
H2. The method of H1 in which the artwork includes a machine-readable representation of said information, wherein said information is provided optically from the device to the cooperating system.
H3. The method of H1 in which the user interface enables the user to select plural of said virtual wallet cards, one of the selected cards being a payment service card, and another of the selected cards being a merchant card, the method further including providing data corresponding to both the payment service card and the merchant card to the cooperating system.
H4. The method of H1 in which the authentication data depends in part on data from a sensor module selected from the group: an audio sensor, a motion sensor, a pose sensor, a barometric pressure sensor, and a temperature sensor.
H5. The method of H1 in which the method further includes prompting the user for entry of correct validation information before the information is provided to the cooperating system, and wirelessly sending location data to a recipient if N consecutive attempts to enter correct validation information fail.
H6. The method of H1 in which the authentication data is also user device-based, wherein the authentication data is logically bound to both context and to the user device.
H7. The method of H1 that further includes presenting said payment user interface in response to user activation of a control in an online shopping user interface, through which the user has selected one or more items for purchase.
I1. A method of alleviating piriformis syndrome, while still allowing card-related payment transactions, the method comprising the acts: for each of plural physical payment cards in a user's wallet, storing a virtual counterpart thereto in a user's portable device, each of said physical payment cards in the user's wallet having a payment service associated therewith;
removing said plural physical payment cards from the user's wallet, to thereby reduce the volume of the wallet, and attendant compression of the user's sciatic nerve; and
initiating a payment using a user-selected payment service, said user-selected payment service being associated with one of the cards removed from the user's wallet, said initiating including:
sensing context data;
producing authentication data based, in part, on said sensed context data; and
presenting artwork using a display of the portable device, for optical sensing by a cooperating system, the artwork including a machine-readable representation of the
authentication data, and also including a logo associated with said user-selected payment service.
J1. A shopping method in which a user has used a portable wireless device to select an item for purchase, the method comprising the acts:
sensing a user input;
responsive to said input, employing a software module resident on the portable wireless device to initiate a payment process for the selected item;
wherein:
the payment process includes performing a first authentication act that makes use of data captured by a camera or microphone of the portable wireless device; and
the payment process includes performing a second authentication act that makes use of data generated by a MEMS sensor of the portable wireless device.
K1. A method practiced after loss of a user's portable device, which device contained a software module permitting selection of a payment card from among plural payment card options in a user's virtual wallet for use in a payment transaction, the method comprising the acts:
at the remote repository, responsive to a request from an authenticated party, revoking an ability of the lost portable device to be used in payment transactions involving said payment card options; receiving an identification of a replacement portable device; and
associating the user's virtual wallet with the replacement portable device, thereby enabling the user to select a payment card from among plural payment card options in the user's virtual wallet for use in a payment transaction, using said replacement device.
L1. An improved checkout system including a camera, a processor, and a memory, the memory containing instructions configuring the checkout system to perform acts including:
using the camera to capture first image data depicting a product presented for purchase by a user, to aid in identifying the product;
also using the same camera to capture second image data depicting artwork from a display of a user portable device, the artwork being associated with a payment service and including a VISA, MasterCard or American Express logo, the artwork also including machine-readable data encoding plural bit auxiliary data;
decoding the auxiliary data; and
using the decoded auxiliary data in connection with authenticating a payment transaction serviced by VISA, MasterCard or American Express.
L2. The checkout system of L1 in which the first image data comprises the second image data.
L3. The checkout system of L1 in which the instructions configure the checkout system to derive context information about the portable device by processing the second image data, wherein said derived context information is used in conjunction with the decoded auxiliary data to authenticate the payment transaction.
L4. The checkout system of L3 in which the derived context information
comprises information about pose or motion of the user portable device.
M1. A method employing a user's portable device, comprising: presenting, using a display of the device, a user interface that presents plural virtual wallet cards;
receiving user input selecting two of said virtual wallet cards; and
providing information corresponding to said two virtual wallet cards to a cooperating system, in connection with a purchase transaction.
M2. The method of M1 in which a first of said two selected virtual wallet cards is associated with an American Express, Visa, or MasterCard payment service, and a second of said two selected virtual wallet cards is associated with a merchant.
M3. The method of M2 in which the two selected wallet cards comprise two virtual payment cards, and the method further includes providing a user interface feature enabling the user to apportion part of a payment to a first of the payment cards, and the balance of the payment to a second of the payment cards.
M4. The method of M3 in which said user interface feature comprises a touch-activated slider feature.
M5. The method of M3 in which the method further includes presenting a graphical image of a composite payment card, the graphical image including incomplete artwork associated with the first payment card, combined with incomplete artwork associated with the second payment card.
N1. A method comprising the acts:
providing data to a service provider, the data including a first image, and image-derived information corresponding to one or more further images, the first and further images having been captured by the user and related to each other, the further images not being viewable from the image-derived information; and
presenting to the user an authentication challenge based on the provided information; wherein at least one of the foregoing acts is performed using a processor configured to perform such act(s). N2. The method of N1 in which the image-derived information comprises a hash or fingerprint derived from the further image(s). N3. The method of N1 in which the image-derived information comprises content-based image descriptors derived from the further image(s).
N4. The method of N1 in which the authentication challenge comprises:
presenting to the user the first image, and plural second images, one of said second images being one of said further images for which image-derived information was provided; inviting the user to identify one of the second images as being related to the first image; receiving from the user a selection of one of said second images; and
checking the user's selection. N5. The method of N4 that includes checking the user's selection by reference to the image-derived information.
O1. An authentication method practiced using a smartphone, characterized by presenting images on a screen of the smartphone and receiving a user response thereto, wherein two of said images were earlier captured by the user with a camera portion of said smartphone.
P1. A method employing a user's portable device, the device including a display and a sensor, the method including acts of:
initiating a multi-party auction to solicit bids from a plurality of financial vendors to facilitate a financial transaction for the user, the plurality of remotely-located financial vendors being associated with the user via a virtual wallet hosted on the user's portable device;
receiving bids from the plurality of financial vendors;
presenting a user interface using the display, the user interface identifying at least two bids solicited from the multi-party auction; upon receiving an indication of a user-selected bid from the at least two bids, initiating a financial transaction using at least some of the details in the user-selected bid and information obtained from the virtual wallet. P2. The method of P1 in which the virtual wallet provides information associated with: i) the user, and ii) the financial transaction, to the plurality of financial vendors.
P3. The method of P1 in which said initiating a multi-party auction commences with user input.
P4. The method of P1 in which said initiating a multi-party auction commences upon analysis of GPS information.
P5. The method of P1 in which the sensor comprises a microphone, and said initiating a multi-party auction commences upon analysis of microphone captured audio.
P6. The method of P1 in which prior to said initiating a financial transaction, the method further comprises determining whether the financial transaction seems out of character relative to a baseline, in which the baseline includes user calendar information.
P7. The method of P6 further comprising issuing a notification when the financial transaction seems out of character.
Q1. A method employing a user's portable device, the device including a display and a sensor, the method including acts of:
initiating a multi-party solicitation for offers from a plurality of financial vendors to facilitate a financial transaction for the user, the plurality of remotely-located financial vendors being associated with the user via a virtual wallet hosted on the user's portable device;
receiving offers from the plurality of financial vendors;
selecting one of the offers according to predetermined criteria, without human
intervention at the time of said receiving; presenting a user interface using the display, the user interface displaying information associated with the selected one offer; and
upon receiving an indication through the user interface, initiating a financial transaction using at least some of the details in the selected one offer and information obtained from the virtual wallet.
Q2. The method of Q1 in which the predetermined criteria comprises weighting factors.
Q3. The method of Q1 in which the time of said receiving comprises a time period in the range of 0.1 millisecond to 90 seconds before and after said receiving.
Q4. The method of Q1 in which said initiating a multi-party solicitation commences upon analysis of GPS information.
Q5. The method of Q4 in which the sensor comprises a microphone, and said initiating a multi-party solicitation commences upon analysis of microphone captured audio.
Q6. The method of Q1 in which prior to said initiating a financial transaction, the method further comprises determining whether the financial transaction seems out of character relative to a baseline, in which the baseline includes user calendar information.
R1. A method employing a user's portable device, the device including a display, the method including acts of:
presenting a user interface on the display through which a user can enter emergency medical information;
storing the emergency medical information;
providing a graphical user interface that allows access to stored emergency medical information via the display notwithstanding the portable device being in a screen-locked condition. S1. A method employing a user's portable device, the device including a display, the method including the acts:
presenting a payment user interface using the display, the user interface identifying plural virtual wallet cards including plural payment service cards, said payment service cards representing plural possible payment services including at least one service from a group of services offered by American Express, VISA and MasterCard, the user interface enabling a user to select a desired one of said payment services for issuing a payment;
presenting artwork using the display, the artwork including a logo for American Express, Visa, or MasterCard; and
causing the logo to graphically change to indicate a notification associated with the virtual wallet card represented by the logo, in which the change comprises at least one of: a jiggle, glow, shimmer, flash, strobe or animated dance.
T1. A portable device comprising:
a touch screen display;
a microphone;
memory storing a virtual wallet including information associated with a plurality of financial vendors; and
one or more processors configured for:
facilitating a transaction using payment information associated with a financial vendor in the virtual wallet;
receiving a receipt for the transaction from a remote location;
storing the receipt in said memory along with information pertaining to: i) expiration of the receipt, and ii) the transaction.
U1. A portable device comprising:
a touch screen display;
a microphone for capturing ambient audio;
memory for storing audio identifiers or information obtained from audio identifiers; and one or more processors configured for: causing the portable device to operate in a background audio collection mode, in which during the mode audio is captured by the microphone without user involvement; processing audio captured in the background audio collection mode to yield one or more audio identifiers;
storing the one or more audio identifiers or information obtained from the one or more identifiers in said memory;
upon encountering a transmission from a signaling source, determining if the one or more audio identifiers or if the information obtained from the one or more identifiers stored in memory corresponds to the transmission;
taking an action if there is a correspondence.
U2. The portable device of U1 in which the signaling source comprises an iBeacon or Bluetooth transmitter.
U3. The portable device of U2 in which the information obtained from the one or more audio identifiers comprises a discount code or coupon, and in which the action comprises applying the discount code or coupon to a financial transaction involving the portable device.
U4. The portable device of U1 in which the processing audio comprises extracting fingerprints from the audio.
U5. The portable device of U1 in which the processing audio comprises decoding digital watermarking hidden in the audio.
U6. The portable device of U1 in which the action comprises prompting the user via a message displayed on the touch screen display.
U7. The portable device of U1 in which the action comprises displaying an in-store map or directions to a product associated with the one or more identifiers.
V1. A method comprising:
using a high frequency audio channel, transmitting a first code from a checkout terminal to a mobile device, the code being encrypted with a private key, which the mobile device may decrypt with a corresponding public key;
using the high frequency audio channel, transmitting a second, different code from the checkout terminal to the mobile device, the second, different code comprising an unencrypted code;
receiving an audio signal at the checkout terminal emitted from the mobile device, the audio signal comprising a processed result of the first code and the second code;
verifying that the checkout terminal is communicating with the mobile device based on the processed result and on a timing associated with transmitting and receiving signals.
V2. The method of V1 in which the processed result comprises the first code XOR'ed with the second code.
V3. The method of V1 in which the first code comprises a pseudo-random bit sequence.
V4. The method of V1 in which the high frequency audio channel comprises a parametric speaker for transmitting focused beams of sound.
W1. A portable device comprising:
a touch screen display;
a video camera;
a microphone for capturing ambient audio;
memory for storing an image, and for storing components of a virtual wallet; and one or more processors configured for:
controlling the video camera to capture imagery corresponding to a checkout terminal's display, the display displaying imagery including digital watermarking information hidden therein, the information including transaction information; processing captured imagery to decode the digital watermarking to obtain the transaction information;
receiving user input corresponding to payment information included in a component of the virtual wallet;
controlling communication with a remotely located third party, so that the transaction information and payment information are provided to the third party; outputting a request for user confirmation based on a request received from the third party;
controlling communication with the remotely located third party so that a user confirmation is provided to the third party.
W2. The portable device of W1 in which the display imagery comprises a plurality of image versions, with each version including at least one fountain code generator output.
W3. The portable device of W1 in which the payment information includes a timestamp.
W4. The portable device of W1 in which the request received from the third party includes a verification clue, and the request for user confirmation displays at least a portion of the verification clue.
W5. The portable device of W1 in which the one or more processors are programmed for controlling display of the at least a portion of the verification clue.
W6. The portable device of W1 further comprising a speaker, in which the one or more processors are configured for controlling output of a high frequency audio signal via the speaker, the high frequency audio signal comprising a message for the checkout terminal.
X1. A method comprising:
analyzing imagery collected with a wearable camera for the presence of a human biometric;
capturing imagery which includes the presence of the human biometric; analyzing such captured imagery for the presence of digital watermarking, and decoding the digital watermarking when present to obtain information carried therein;
analyzing such captured imagery to verify the human biometric;
initiating a transaction using the digital watermarking information when the human biometric is verified.
X2. The method of X1 in which the transaction comprises payment.
X3. The method of X1 in which the imagery corresponds to a display, in which the display comprises the digital watermarking hidden in displayed images.
Concluding Remarks
From the above description, it will be seen that embodiments of the present technology preserve the familiar ergonomics of credit card usage, while streamlining user checkout. No longer must a user interact with an unfamiliar keypad at the grocery checkout to pay with a credit card (What button on this terminal do I press? Enter? Done? The unlabeled green one?). No longer must the user key in a phone number on such a terminal to gain loyalty shopper benefits. Additional advantages accrue to the merchant: no investment is required for specialized hardware that has utility only for payment processing. (Now a camera, which can be used for product identification and other tasks, can be re-purposed for this additional use.) And both parties benefit by the reduction in fraud afforded by the various additional security improvements of the detailed embodiments.
Having described and illustrated the principles of our inventive work with reference to illustrative examples, it will be recognized that the technology is not so limited.
For example, while the specification focused on a smartphone exchanging data with a cooperating system using optical techniques, other communication arrangements can be used. For example, radio signals (e.g., Bluetooth, Zigbee, etc.) may be exchanged between the phone and a POS system. Relatedly, NFC and RFID techniques can also be used.
In some embodiments, audio can also be used. For example, card and authentication data can be modulated on an ultrasonic carrier, and transmitted from the phone's speaker to a microphone connected to the POS terminal. The POS terminal can amplify and rectify the sensed ultrasonic signal to provide the corresponding digital data stream. Alternatively, an audible burst of tones within the human hearing range can be employed similarly.
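By way of a non-limiting illustration, the following Python sketch models such an ultrasonic link using simple on-off keying of a near-ultrasonic carrier, with an envelope (rectify-and-integrate) detector on the receiving side; the carrier frequency, bit rate and threshold are illustrative assumptions rather than parameters drawn from the foregoing description:

```python
import numpy as np

FS = 44_100          # sample rate (Hz)
CARRIER = 18_000     # near-ultrasonic carrier (Hz) - illustrative choice
BIT_RATE = 100       # bits per second - illustrative choice
SPB = FS // BIT_RATE # samples per bit

def modulate(bits):
    """On-off key the carrier: carrier present for 1-bits, silence for 0-bits."""
    t = np.arange(SPB) / FS
    tone = np.sin(2 * np.pi * CARRIER * t)
    return np.concatenate([tone if b else np.zeros(SPB) for b in bits])

def demodulate(signal):
    """Rectify and integrate each bit period, then threshold (envelope detection)."""
    envelope = np.abs(signal)                        # rectification
    n_bits = len(signal) // SPB
    energy = envelope[: n_bits * SPB].reshape(n_bits, SPB).mean(axis=1)
    threshold = energy.max() / 2 if energy.max() > 0 else 0.5
    return [int(e > threshold) for e in energy]

if __name__ == "__main__":
    payload = [1, 0, 1, 1, 0, 0, 1, 0]   # e.g., a few bits of card/authentication data
    audio = modulate(payload)
    noise = 0.05 * np.random.default_rng(0).standard_normal(len(audio))
    assert demodulate(audio + noise) == payload   # survives a mildly noisy acoustic channel
```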
In another audio embodiment, the data is conveyed as a watermark payload,
steganographically conveyed in cover audio. Different items of cover audio can be used to convey different information. For example, if the user selects a VISA card credential, a clip of Beatles music, or a recording of a train whistle, can serve as the host audio that conveys the associated authentication/card information as a watermark payload. If the user selects a
MasterCard credential, a BeeGees clip, or a recording of bird calls, can serve as the host audio. The user can select, or record, the different desired items of cover audio (e.g., identifying songs in the user's iTunes music library, or recording a spoken sentence or two), and can associate different payment credentials with different of these audio items. The user can thereby conduct an auditory check that the correct payment credential has been selected. (If the user routinely uses a Visa card at Safeway - signaled by the Beatles song clip, and one day he is surprised to hear the BeeGees song clip playing during his Safeway checkout, then he is alerted that something is amiss.)
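A minimal sketch of the credential-to-cover-audio association described above is shown below; the file names and the playback hook are placeholders, and the watermark embedding itself is assumed to be handled elsewhere:

```python
# Hypothetical association of payment credentials with user-chosen cover audio.
# The clip names and the play() callable are placeholders, not part of the specification.
COVER_AUDIO = {
    "VISA":       "beatles_clip.wav",     # or a recording of a train whistle
    "MasterCard": "beegees_clip.wav",     # or a recording of bird calls
}

def announce_selection(credential, play):
    """Play the cover audio tied to the selected credential so the user can
    audibly confirm the right card was chosen; the watermark payload carrying
    the authentication/card data would be embedded in this clip."""
    clip = COVER_AUDIO.get(credential)
    if clip is None:
        raise ValueError(f"no cover audio registered for {credential!r}")
    play(clip)

# Example: announce_selection("VISA", play=print) simply prints the clip name.
```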
While watermarking and barcodes have been expressly referenced, other optical communications techniques can also be used. One simply uses pattern recognition (e.g., image fingerprinting, or OCRing) to recognize a payment card by the presented artwork and, in some implementations, read the user name, account number, expiration date, etc., from the artwork.
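As one hedged illustration of such pattern recognition, the sketch below matches captured card artwork against enrolled artwork using a simple average-hash fingerprint, a stand-in for the more robust fingerprinting techniques referenced later in this section; it assumes the Pillow and NumPy packages, and the match threshold is arbitrary:

```python
import numpy as np
from PIL import Image  # Pillow

def average_hash(img, size=8):
    """A simple perceptual fingerprint: downscale to an 8x8 grayscale image and
    threshold each pixel against the mean, yielding a 64-bit signature."""
    pixels = np.asarray(img.convert("L").resize((size, size)), dtype=float)
    return (pixels > pixels.mean()).flatten()

def hamming(a, b):
    """Number of differing bits between two signatures."""
    return int(np.count_nonzero(a != b))

def recognize_card(captured, reference_artwork, max_distance=10):
    """Return the enrolled card whose artwork best matches the captured frame,
    or None if nothing is close enough (threshold is illustrative)."""
    hashed = average_hash(captured)
    name, ref_hash = min(reference_artwork.items(), key=lambda kv: hamming(hashed, kv[1]))
    return name if hamming(hashed, ref_hash) <= max_distance else None

# Enrollment might look like:
#   refs = {"VISA card": average_hash(Image.open("visa_art.png")), ...}
#   card = recognize_card(Image.open("captured_frame.png"), refs)
```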
While the detailed payment arrangements provide card data (e.g., account name and number) from the smartphone to the cooperating system (typically in encrypted form), in other embodiments, such information is not conveyed from the phone. Instead, the phone provides a data token, such as a digital identifier, which serves to identify corresponding wallet card data stored in the cloud. (A related approach is used, e.g., by Braintree's Venmo payment system, which "vaults" the credit card details in a central repository.) Known data security techniques are used to protect the exchange of information from the cloud to the retailer's POS system (or to whatever of the parties in the Fig. 5 transaction system first receives the true card details). The token is useless if intercepted from the phone, because its use cannot be authorized except by using techniques such as disclosed above (e.g., context-based authentication data, digital signatures, etc.). Token-based systems make it easy for a user to handle loss or theft of the smartphone. With a single authenticated communication to the credentials vault, the user can disable all further use of the payment cards from the missing phone. (The authenticated user can similarly revoke the public/private key pair associated with the user through the phone's hardware ID, if same is used.) After the user has obtained a replacement phone, its hardware ID is communicated to the vault, and is associated with the user's collection of payment cards. (A new public/private key pair can be issued based on the new phone's hardware ID, and registered to the user with the certificate authority.) The vault can download artwork for all of the virtual cards in the user's collection to the new phone. Thereafter, the new phone can continue use of all of the cards as before.
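The token-based flow can be modeled, in highly simplified form, as follows; the class and method names are hypothetical and merely stand in for the vault operator's actual service interface:

```python
import secrets

class CredentialVault:
    """Toy model of a cloud-side credentials 'vault' that maps opaque tokens to
    real card data; the names and flow are illustrative, not any vendor's API."""

    def __init__(self):
        self._cards = {}     # token -> card details (held only in the cloud)
        self._devices = {}   # token -> hardware ID of the phone allowed to use it

    def enroll(self, card_details, device_id):
        """Issue a token; only the token is stored on (and presented by) the phone."""
        token = secrets.token_urlsafe(16)
        self._cards[token] = card_details
        self._devices[token] = device_id
        return token

    def redeem(self, token, device_id):
        """Release card details to the payment network only if the token is live
        and presented on behalf of the registered device."""
        if token not in self._cards or self._devices.get(token) != device_id:
            raise PermissionError("token unknown, revoked, or wrong device")
        return self._cards[token]

    def revoke_device(self, device_id):
        """One authenticated call disables every card on a lost or stolen phone."""
        for tok, dev in list(self._devices.items()):
            if dev == device_id:
                del self._devices[tok], self._cards[tok]
```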
Desirable, in such embodiments, is for the artwork representing the wallet cards to be generic, without any personalized identification (e.g., no name or account number). By such arrangement, no personal information is conveyed in the replacement artwork downloaded to the new phone (nor is any personal information evident to a person who might gain possession of the lost/stolen original phone).
In an alternate implementation the virtual card data stored on the phone is logically- bound to the phone via the device ID, so that such data is not usable except on that phone. If the phone is lost or stolen, the issuer can be notified to revoke that card data and issue replacement data for installation on a replacement phone.
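One way such logical binding could be realized is sketched below, using an HMAC keyed to the device ID; this is an illustrative construction under assumed names, not necessarily the issuer's actual mechanism:

```python
import hashlib, hmac, json

def bind_to_device(card_data: dict, device_id: str, issuer_secret: bytes) -> dict:
    """Tag the card record with an HMAC computed over the card data under a key
    derived from the device ID, so the record only verifies on the issuing phone."""
    blob = json.dumps(card_data, sort_keys=True).encode()
    key = hashlib.pbkdf2_hmac("sha256", issuer_secret, device_id.encode(), 100_000)
    return {"card": card_data, "binding": hmac.new(key, blob, "sha256").hexdigest()}

def usable_on(record: dict, device_id: str, issuer_secret: bytes) -> bool:
    """Recompute the binding with this phone's ID; a mismatch (different device,
    or tampered data) renders the record unusable."""
    key = hashlib.pbkdf2_hmac("sha256", issuer_secret, device_id.encode(), 100_000)
    blob = json.dumps(record["card"], sort_keys=True).encode()
    expected = hmac.new(key, blob, "sha256").hexdigest()
    return hmac.compare_digest(expected, record["binding"])
```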
In still another embodiment, card data can be revoked remotely in a lost or stolen phone, using the iCloud Find My iPhone technology popularized by the Apple iPhone for remotely locking or wiping a phone.
While any combination of layered security techniques can be employed, one involves public-private key pairs issued to banks that issue payment cards. Among the information conveyed from the smartphone can be credit card account details (name, number, expiration date, etc.) provided to the phone by the issuing bank at time of virtual card issuance, already encrypted by the bank's private key. The POS system can have, stored in memory, the public keys for all credit card-issuing banks. The POS system can apply the different public keys until it finds one that decrypts the information conveyed from the smartphone, thereby assuring that the card credentials are issued by the corresponding bank. In the detailed arrangements, a POS system makes a context-based assessment using information conveyed from the smartphone (e.g., optically conveyed from its display). In other embodiments, the roles can be reversed. For example, the POS terminal can convey context information to the smartphone, which makes an assessment using context information it determines itself. Some systems use both approaches, with the smartphone testing the POS terminal, and the POS terminal testing the smartphone. Only if both tests conclude satisfactorily does a transaction proceed.
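By way of illustration, the issuer-key check described above can be modeled as follows, treating the "encryption with the bank's private key" as an RSA digital signature and assuming the third-party Python cryptography package; the key sizes, bank names and card data are placeholders:

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

def identify_issuer(card_blob: bytes, signature: bytes, bank_public_keys: dict):
    """Try each issuing bank's public key until one validates the credentials
    conveyed from the phone; return the bank name, or None if nothing matches."""
    for bank, pub in bank_public_keys.items():
        try:
            pub.verify(signature, card_blob, padding.PKCS1v15(), hashes.SHA256())
            return bank
        except InvalidSignature:
            continue
    return None

if __name__ == "__main__":
    # Illustrative issuance: the bank signs the card details with its private key.
    bank_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    card_blob = b"name=Alice;acct=4111...;exp=01/27"
    sig = bank_key.sign(card_blob, padding.PKCS1v15(), hashes.SHA256())

    # The POS holds only public keys, and tries them in turn.
    pos_keys = {"ExampleBank": bank_key.public_key()}
    assert identify_issuer(card_blob, sig, pos_keys) == "ExampleBank"
```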
Technology for steganographically encoding (and decoding) watermark data in artwork (and sound) is detailed, e.g., in Digimarc's patent documents 6,614,914, 6,590,996, 6,122,403, US 2010-0150434 A1, US 2011-0274310 A1, and US 2013-0223673 A1. Typically, forward error correction is employed to assure robust and accurate optical conveyance of data.
The steganographic data-carrying payload capacity of low resolution artwork is on the order of 50-100 bits per square inch. With high resolution displays of the sort now proliferating on smartphones (e.g., the Apple Retina display), much higher data densities can reliably be achieved. Still greater data capacity can be provided by encoding static artwork with a steganographic movie of hidden data, e.g., with new information encoded every tenth of a second. Using such techniques, payloads in the thousands of bits can be steganographically conveyed.
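These figures support a quick back-of-the-envelope capacity estimate; the display dimensions and viewing time assumed below are illustrative:

```python
# Rough capacity estimate using the figures quoted above (display size is assumed).
bits_per_sq_inch = 100          # upper end of the low-resolution figure
display_area_sq_in = 2.3 * 4.0  # assumed ~4-inch phone display, width x height in inches
frames_per_second = 10          # "new information encoded every tenth of a second"
seconds_shown = 3               # assumed time the artwork is held up to the reader

per_frame = bits_per_sq_inch * display_area_sq_in        # ~920 bits in one static frame
total = per_frame * frames_per_second * seconds_shown    # ~27,600 bits over 3 seconds
print(f"{per_frame:.0f} bits per frame, {total:.0f} bits total")
```

Even at the low-resolution density, time-multiplexing the hidden data in this way yields payloads in the thousands of bits, consistent with the statement above.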
Image fingerprinting techniques are detailed in patent publications 7,020,304 (Digimarc), 7,486,827 (Seiko-Epson), 20070253594 (Vobile), 20080317278 (Thomson), and US 2002-0044659 A1 (NEC). SIFT-based approaches for image recognition can also be employed (e.g., as detailed in patent 6,711,293). SURF and ORB are more recent enhancements to SIFT.
Applicant's other work that is relevant to the present technology includes that detailed in patent publications US 2011-0212717 A1, US 2011-0161076 A1, US 2012-0284012 A1, US 2012-0046071 A1, US 2012-0214515 A1, and in pending applications 13/651,182, filed October 12, 2012, and 61/745,501, filed December 21, 2012.
Related patent publications concerning mobile payment and imaging technologies include US 2012-0303425 A1, US 2012-0024945 A1, US 2010-0082444 A1, US 2011-0119156 A1, US 2010-0125495 A1, US 2013-0085941 A1, US 2009-0276344 A1, 8,423,457, 8,429,407, 8,250,660, 8,224,731, 7,508,954, and 7,191,156.
Although the detailed description focuses on use of the technology in bricks and mortar stores, the technology is equally useful in making purchases online.
For example, a user may employ a smartphone to browse the web site of an online merchant, and add items to a shopping cart. The merchant may have a dedicated app to facilitate such shopping (e.g., as EBay and Amazon do). At the time for payment, the user (or the web site, or the app) invokes the payment module software, causing one of the depicted interfaces (e.g., Fig. 1 or Fig. 10A) to be presented for user selection of the desired payment card. For example, an app may have a graphical control for selection by the user to activate the payment module. The user then flips through the available cards and taps one to complete the purchase. The payment module determines the device context from which it was invoked (e.g., the
Amazon app, or a Safari browser with a Land's End shopping cart), and establishes a secure session to finalize the payment to the corresponding vendor, with the user-selected card. As in the earlier examples, various digital data protocols can be employed to secure the transaction. (In this case, optical communication with the cooperating system is not used. Instead, data is exchanged with the remote system by digital communications, e.g., using a 4G network to the internet, etc.)
While the present technology's robustness to various potential attacks was noted above, the technology also addresses one of the largest fraud channels in the existing credit card system: so-called "card not present" transactions. Many charge transactions are made without presenting a physical card to a merchant. (Consider all online purchases.) If a person knows a credit card number, together with owner name, expiration date, and code on back, they can make a charge. Much fraud results. By the present technology, in contrast, the smartphone serves as the payment credential - the same credential for both online and bricks-and-mortar merchants. For the former its data is presented digitally, and for the latter its data is presented optically - both with reliable security safeguards. As smartphones become ubiquitous, merchants may simply insist on cash if a smartphone is not used, with negligibly few bona fide sales lost as a consequence.
It will be recognized that the detailed user interfaces are illustrative only. In commercial implementation, it is expected that different forms of interface will probably be used, based on the demands and constraints of the particular application. (One alternative form of interface is one in which a virtual representation of a wallet card is dragged and dropped onto an item displayed on-screen that is to be purchased, or is dragged/dropped onto a displayed form that then auto-completes with textual particulars (cardholder name, billing address, card number, etc.) corresponding to the selected card. Such forms of interaction may be particularly favored when using desktop and laptop computers.)
While the focus of the disclosure has been on payment transactions, another use of wallet cards is in identification transactions and authentication. There is no reason why driver licenses, passports and other identification documents cannot have virtual counterparts (or replacements) that employ the technology detailed herein. Again, greatly increased security can thereby be achieved.
Such virtual cards are also useful in self-service kiosks and other transactions. An example is checking into a hotel. While hotels routinely employ human staff to check in guests, they do so not solely to be hospitable. Such human interaction also serves a security purpose - providing an exchange by which guests can be informally vetted, e.g., to confirm that their stated identity is bona fide. The present technology allows such vetting to be conducted in a far more rigorous manner. Many weary travelers would be pleased to check in via a kiosk (presenting payment card and loyalty card credentials, and receiving a mag stripe-encoded, or RFID-based, room key in return), especially if it spared them a final delay in the day's travel, waiting for a human receptionist.
Similarly, air travel can be made more secure by authenticating travelers using the technologies detailed herein, rather than relying on document inspection by a bleary-eyed human worker at shift's end. Boarding passes can similarly be made more secure by including such documents in the virtual wallet, and authenticating their validity using the presently-detailed techniques.
In the embodiment detailed in Figs. 14 and 15, the relationship between the images was due to common geography and a common interval of time (a vacation trip to Napa). However, the relationship can be of other sorts, such as person-centric or thing-centric. For example, the reference image may be a close-up of a pair of boots worn by a friend of the user, and the related candidate images can be face shots of that friend. (Dummy images can be face shots of strangers.)
Embodiments that presented information for user review or challenge on the smartphone screen, and/or solicited user response via the smartphone keypad or touch screen, can instead be practiced otherwise. For example, information can be presented to the user on a different display, such as on a point of sale terminal display. Or it can be posed to the user verbally, as by a checkout clerk. Similarly, the user's response can be entered on a device different than the smartphone (e.g., a keypad at a checkout terminal), or the user may simply voice a responsive answer, for capture by a POS system microphone.
The artisan will recognize that spectrum-based analysis of signals (e.g., audio signals, as used above in one authentication embodiment) can be performed by filter banks, or by transforming the signal into the Fourier domain, where it is characterized by its spectral components.
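A minimal Fourier-domain example of such spectral characterization, using NumPy, is sketched below; the number of bands and the windowing are illustrative choices:

```python
import numpy as np

def spectral_signature(samples, sample_rate, bands=8):
    """Characterize an audio snippet by the relative energy in a few frequency
    bands - the Fourier-domain counterpart of a filter-bank analysis."""
    windowed = np.asarray(samples, dtype=float) * np.hanning(len(samples))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(windowed), d=1.0 / sample_rate)
    edges = np.linspace(0, sample_rate / 2, bands + 1)
    energy = [float(np.sum(spectrum[(freqs >= lo) & (freqs < hi)] ** 2))
              for lo, hi in zip(edges[:-1], edges[1:])]
    total = sum(energy) or 1.0
    return [e / total for e in energy]   # normalized band energies

# e.g., spectral_signature(np.sin(2*np.pi*1000*np.arange(4096)/44100), 44100)
# concentrates its energy in the band containing 1 kHz.
```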
As noted, security checks can be posed to the user at various times in the process, e.g., when the phone is awakened, when the payment app starts, when a card is selected, when payment is finalized, etc. The check may seek to authenticate the user, the user device, a computer with which the device is communicating, etc. The check may be required and/or performed by software in the device, or by software in a cooperating system. In addition to PIN and password approaches, these can include checks based on user biometrics, such as voice recognition and fingerprint recognition. In one particular embodiment, whenever the payment module is launched, a screen-side camera on the user's smartphone captures an image of the user's face, and checks its features against stored reference features for the authorized user to confirm the phone is not being used by someone else. Another form of check is the user's custody of a required physical token (e.g., a particular car key), etc.
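The face-feature check mentioned above might be sketched as follows, assuming some face-recognition model (not specified here) has already reduced the captured and enrolled images to feature vectors; the similarity threshold is a placeholder:

```python
import numpy as np

def face_matches(reference_features, captured_features, threshold=0.8):
    """Compare a freshly captured face-feature vector with the enrolled one using
    cosine similarity; the 0.8 threshold is an illustrative placeholder."""
    a = np.asarray(reference_features, dtype=float)
    b = np.asarray(captured_features, dtype=float)
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0:
        return False
    return float(np.dot(a, b) / denom) >= threshold

def on_payment_module_launch(capture_features, reference_features, proceed, lock_out):
    """Gate the payment module on the face check, as described above; the callables
    capture_features, proceed and lock_out are hypothetical hooks."""
    if face_matches(reference_features, capture_features()):
        proceed()
    else:
        lock_out()
```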
Location information (e.g., GPS, cell tower triangulation, etc.) can also be utilized to confirm placement of the associated mobile device within proximity of the cooperating device. High confidence on location can be achieved by relying on network-provided location mechanisms from companies such as Locaid, which are not susceptible to application hacking on the mobile device (enabled by unlocking the device or otherwise).
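A simple proximity test of this sort, assuming latitude/longitude fixes for the phone and the cooperating device and an illustrative 50-meter radius, might look like:

```python
from math import asin, cos, radians, sin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in meters."""
    R = 6_371_000.0
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

def device_is_nearby(phone_fix, terminal_fix, max_meters=50):
    """Accept the transaction context only if the network- or GPS-reported phone
    position falls within an illustrative 50 m of the cooperating device."""
    return haversine_m(*phone_fix, *terminal_fix) <= max_meters

# e.g., device_is_nearby((45.5231, -122.6765), (45.5233, -122.6768)) -> True
```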
If a smartphone transaction fails, e.g., because the context information provided from the smartphone to the cooperating system does not match what is expected, or because the user fails multiple consecutive attempts to provide a proper PIN code or pass another security check, a report of the failed transaction can be sent to the authorized user or other recipient. Such a report, e.g., by email or telephone, can include the location of the phone when the transaction failed, as determined by a location-sensing module in the phone (e.g., a GPS system).
Although one focus of this disclosure has been on arrangements that make no use of plastic wallet cards, some of the technology is applicable to such cards.
For example, a plastic chip card can be equipped with one or more MEMS sensors, and these can be used to generate context-dependent session keys, which can then be used in payment transactions in the manners described above in connection with smartphones.
Moreover, plastic cards can also be useful in enrolling virtual cards in a smartphone wallet. One particular such technology employs interaction between printable conductive inks (e.g., of metal oxides) and the capacitive touch screens commonly used on smartphones and tablets. As detailed in publications by Printechnologics GmbH and others, when a card printed with a pattern of conductive ink is placed on a touch screen, the touch screen senses the pattern defined by the ink and can respond accordingly. (See, e.g., patent publications WO2012136817, WO2012117046, US 2012-0306813 A1, US 2012-0125993 A1 and US 2011-0253789 A1. Such technology is being commercialized under the Touchcode brand name.) Loading the card into the digital wallet can involve placing the mobile wallet software in an appropriate mode (e.g., "ingest"), after optional authentication has been completed. The user then places the physical card on the smartphone display. The use of conductive inks on the card serves to identify the card to the mobile device. The user can then lift the card off the display, leaving a virtualized representation of the card on the display to be subsequently stored in the wallet, with the opportunity to add additional metadata to facilitate transactions or preferences (PINs, priority, etc.).
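One hedged sketch of how a sensed conductive-ink touch pattern might be reduced to a lookup key during such an "ingest" operation follows; the normalization, grid size and data structures are illustrative assumptions, not Touchcode's actual encoding:

```python
def pattern_key(touch_points, grid=8):
    """Reduce the multi-touch points sensed from a conductive-ink pattern to a
    position/scale-invariant key: normalize to the bounding box, quantize to a
    coarse grid, and sort the resulting cells."""
    xs = [p[0] for p in touch_points]
    ys = [p[1] for p in touch_points]
    w = (max(xs) - min(xs)) or 1.0
    h = (max(ys) - min(ys)) or 1.0
    cells = sorted({(round((x - min(xs)) / w * (grid - 1)),
                     round((y - min(ys)) / h * (grid - 1))) for x, y in touch_points})
    return tuple(cells)

# Hypothetical ingest: the wallet app, in "ingest" mode, maps the sensed key to card metadata.
ENROLLED = {}

def ingest_card(touch_points, card_metadata):
    ENROLLED[pattern_key(touch_points)] = card_metadata

def identify_card(touch_points):
    return ENROLLED.get(pattern_key(touch_points))
```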
Such physical item-based interaction with touch screens can also be used, e.g., during a challenge-response stage of a transaction. For example, a cooperating device may issue a challenge through the touch screen on the mobile device as an alternative to (or in addition to) audio, image, wireless, or other challenge mechanisms. In one particular arrangement, a user places a smartphone screen-down on a reading device (similar to reading a digital boarding-pass at TSA check-points). The cooperating device would have a static or dynamic electrical interconnect that could be used to simulate multi-touch events on the mobile device. By so doing, the mobile device can use the challenge (presented as a touch event) to inform the transaction and respond appropriately to the cooperating device.
While reference has been made to smartphones and POS terminals, it will be recognized that this technology finds utility with all manner of devices - both portable and fixed. Tablets, portable music players, desktop computers, laptop computers, set-top boxes, televisions, wrist- and head-mounted systems and other wearable devices, servers, etc., can all make use of the principles detailed herein. (The term "smartphone" should be construed herein to encompass all such devices, even those that are not telephones.)
Particularly contemplated smartphones include the Apple iPhone 5; smartphones following Google's Android specification (e.g., the Galaxy S III phone, manufactured by Samsung, and the Motorola Droid Razr HD Maxx phone), and Windows 8 mobile phones (e.g., the Nokia Lumia 920).
Details of the Apple iPhone, including its touch interface, are provided in Apple's published patent application 20080174570.
Details of the Cover Flow fliptych interface used by Apple are provided in published patent application 20080062141.
The design of smartphones and other computers referenced in this disclosure is familiar to the artisan. In general terms, each includes one or more processors, one or more memories (e.g., RAM), storage (e.g., a disk or flash memory), a user interface (which may include, e.g., a keypad, a TFT LCD or OLED display screen, touch or other gesture sensors, a camera or other optical sensor, a compass sensor, a 3D magnetometer, a 3-axis accelerometer, a 3-axis gyroscope, one or more microphones, etc., together with software instructions for providing a graphical user interface), interconnections between these elements (e.g., buses), and an interface for communicating with other devices (which may be wireless, such as GSM, 3G, 4G, CDMA, WiFi, WiMax, Zigbee or Bluetooth, and/or wired, such as through an Ethernet local area network, a T-1 internet connection, etc.).
The processes and system components detailed in this specification may be implemented as instructions for computing devices, including general purpose processor instructions for a variety of programmable processors, including microprocessors (e.g., the Intel Atom, the ARM A5, the nVidia Tegra 4, and the Qualcomm Snapdragon), graphics processing units (GPUs, such as the nVidia Tegra APX 2600, and the Adreno 330 - part of the Qualcomm Snapdragon processor), and digital signal processors (e.g., the Texas Instruments TMS320 series devices and OMAP series devices), etc. These instructions may be implemented as software, firmware, etc. These instructions can also be implemented in various forms of processor circuitry, including programmable logic devices, field programmable gate arrays (e.g., the Xilinx Virtex series devices), field programmable object arrays, and application specific circuits - including digital, analog and mixed analog/digital circuitry. Execution of the instructions can be distributed among processors and/or made parallel across processors within a device or across a network of devices. Processing of content signal data may also be distributed among different processor and memory devices. "Cloud" computing resources can be used as well. References to "processors," "modules" or "components" should be understood to refer to functionality, rather than requiring a particular form of implementation.
Software instructions for implementing the detailed functionality can be authored by artisans without undue experimentation from the descriptions provided herein, e.g., written in C, C++, Visual Basic, Java, Python, Tcl, Perl, Scheme, Ruby, etc. In addition, libraries that allow mathematical operations to be performed on encrypted data can be utilized to minimize when and how sensitive information is stored in clear-text. Smartphones and other devices according to certain implementations of the present technology can include software modules for performing the different functions and acts.
Known browser software, communications software, and media processing software can be adapted for use in implementing the present technology.
Software and hardware configuration data/instructions are commonly stored as instructions in one or more data structures conveyed by tangible media, such as magnetic or optical discs, memory cards, ROM, etc., which may be accessed across a network. Some embodiments may be implemented as embedded systems - special-purpose computer systems in which operating system software and application software are indistinguishable to the user (e.g., as is commonly the case in basic cell phones). The functionality detailed in this specification can be implemented in operating system software, application software and/or as embedded system software.
Different of the functionality can be implemented on different devices. For example, in a system in which a smartphone communicates with a computer at a remote location, different tasks can be performed exclusively by one device or the other, or execution can be distributed between the devices. Extraction of fingerprint and watermark data from content is one example of a process that can be distributed in such fashion. Thus, it should be understood that description of an operation as being performed by a particular device (e.g., a smartphone) is not limiting but exemplary; performance of the operation by another device (e.g., a remote server), or shared between devices, is also expressly contemplated.
(In like fashion, description of data being stored on a particular device is also exemplary; data can be stored anywhere: local device, remote device, in the cloud, distributed, etc. Thus, while an earlier embodiment employed user photographs stored in the phone, the detailed methods can similarly make use of user photographs stored in an online/cloud repository.)
Many of the sensors in smartphones are of the MEMS variety (i.e., Microelectromechanical Systems). Most of these involve tiny moving parts. Such components with moving parts may be termed motive-mechanical systems.
This specification details a variety of embodiments. It should be understood that the methods, elements and concepts detailed in connection with one embodiment can be combined with the methods, elements and concepts detailed in connection with other embodiments. While some such arrangements have been particularly described, many have not - due to the large number of permutations and combinations. However, implementation of all such combinations is straightforward to the artisan from the provided teachings.
Elements and teachings within the different embodiments disclosed in the present specification are also meant to be exchanged and combined. Section headings are provided merely for the reader's convenience and should not be read as limiting the scope of the embodiments or disclosure. The teachings and elements under one heading may be readily combined with the elements and teachings under another.
While this disclosure has detailed particular ordering of acts and particular combinations of elements, it will be recognized that other contemplated methods may re-order acts (possibly omitting some and adding others), and other contemplated combinations may omit some elements and add others, etc.
Although disclosed as complete systems, sub-combinations of the detailed arrangements are also separately contemplated (e.g., omitting various of the features of a complete system).
The present specification should be read in the context of the cited references. (The reader is presumed to be familiar with such prior work.) Those references disclose technologies and teachings that the inventors intend to be integrated with the present technology.
While certain aspects of the technology have been described by reference to illustrative methods, it will be recognized that apparatuses configured to perform the acts of such methods are also contemplated as part of applicant's inventive work. Likewise, other aspects have been described by reference to illustrative apparatus, and the methodology performed by such apparatus is likewise within the scope of the present technology. Still further, tangible computer readable media containing instructions for configuring a processor or other programmable system to perform such methods is also expressly contemplated.
In view of the wide variety of embodiments to which the principles and features discussed above can be applied, it should be apparent that the detailed embodiments are illustrative only, and should not be taken as limiting the scope of the invention. Rather, we claim as our invention all such modifications as may come within the scope and spirit of the following claims and equivalents thereof.

Claims

What is claimed is:
1. A method employing a user's portable device, the device including a display and a sensor module, the method including the acts:
presenting a payment user interface using the display, the user interface identifying plural virtual wallet cards including plural payment service cards, said payment service cards representing plural possible payment services including at least one of American Express, VISA and MasterCard, the user interface enabling a user to select a desired one of said payment services for issuing a payment;
generating context-based authentication data, the authentication data depending in part on data from the device sensor module;
presenting artwork using the display, the artwork indicating the selected payment service and including a logo for American Express, Visa, or MasterCard; and
providing information, including both the context-based authentication data and data corresponding to the selected payment service, from the device to a cooperating system, in connection with issuing a payment;
wherein:
the logo in the presented artwork confirms to the user that the desired payment service has been selected for a payment; and
the method enables the user to issue payments using a user-selected one of said plural payment services, without requiring the user to carry plural physical cards for said payment services.
2. The method of claim 1 in which the artwork includes a machine-readable
representation of said information, wherein said information is provided optically from the device to the cooperating system.
3. The method of claim 1 in which the user interface enables the user to select plural of said virtual wallet cards, one of the selected cards being a payment service card, and another of the selected cards being a merchant card, the method further including providing data
corresponding to both the payment service card and the merchant card to the cooperating system.
4. The method of claim 1 in which the authentication data depends in part on data from a sensor module selected from the group: an audio sensor, a motion sensor, a pose sensor, a barometric pressure sensor, and a temperature sensor.
5. The method of claim 1 in which the method further includes prompting the user for entry of correct validation information before the information is provided to the cooperating system, and wirelessly sending location data to a recipient if N consecutive attempts to enter correct validation information fail.
6. The method of claim 1 in which the authentication data is also user device-based, wherein the authentication data is logically bound to both context and to the user device.
7. The method of claim 1 that further includes presenting said payment user interface in response to user activation of a control in an online shopping user interface, through which the user has selected one or more items for purchase.
8. A method employing a user's portable device, the device including a display and a sensor, the method including acts of:
initiating a multi-party auction to solicit bids from a plurality of financial vendors to facilitate a financial transaction for the user, the plurality of remotely-located financial vendors being associated with the user via a virtual wallet hosted on the user's portable device;
receiving bids from the plurality of financial vendors;
presenting a user interface using the display, the user interface identifying at least two bids solicited from the multi-party auction;
upon receiving an indication of a user-selected bid from the at least two bids, initiating a financial transaction using at least some of the details in the user selected bid and information obtained from the virtual wallet.
9. The method of claim 8 in which the virtual wallet provides information associated with: i) the user, and ii) the financial transaction, to the plurality of financial vendors.
10. The method of claim 8 in which said initiating a multi-party auction commences with user input.
11. The method of claim 8 in which said initiating a multi-party auction commences upon analysis of GPS information.
12. The method of claim 8 in which the sensor comprises a microphone, and said initiating a multi-party auction commences upon analysis of microphone captured audio.
13. The method of claim 8 in which prior to said initiating a financial transaction, the method further comprises determining whether the financial transaction seems out of character relative to a baseline, in which the baseline includes user calendar information.
14. The method of claim 13 further comprising issuing a notification when the financial transaction seems out of character.
15. A method employing a user's portable device, the device including a display and a sensor, the method including acts of:
initiating a multi-party solicitation for offers from a plurality of financial vendors to facilitate a financial transaction for the user, the plurality of remotely-located financial vendors being associated with the user via a virtual wallet hosted on the user's portable device;
receiving offers from the plurality of financial vendors;
selecting one of the offers according to predetermined criteria, without human intervention at the time of said receiving;
presenting a user interface using the display, the user interface displaying information associated with the selected one offer; and
upon receiving an indication through the user interface, initiating a financial transaction using at least some of the details in the selected one offer and information obtained from the virtual wallet.
16. The method of claim 15 in which the predetermined criteria comprises weighting factors.
17. The method of claim 15 in which the time of said receiving comprises a time period in the range of 0.1 millisecond to 90 seconds before and after said receiving.
18. The method of claim 16 in which the weighting factors consider credit history of a user.
19. The method of claim 15 in which said initiating a multi-party solicitation commences upon analysis of GPS information.
20. The method of claim 19 in which the sensor comprises a microphone, and said initiating a multi-party solicitation commences upon analysis of microphone captured audio.
21. The method of claim 15 in which prior to said initiating a financial transaction, the method further comprises determining whether the financial transaction seems out of character relative to a baseline, in which the baseline includes user calendar information.
22. A portable device comprising:
a touch screen display;
a microphone for capturing ambient audio;
memory for storing audio identifiers or information obtained from audio identifiers; and one or more processors configured for:
causing the portable device to operate in a background audio collection mode, in which during the mode audio is captured by the microphone without user involvement; processing audio captured in the background audio collection mode to yield one or more audio identifiers;
storing the one or more audio identifiers or information obtained from the identifiers in said memory; upon encountering a transmission from a signaling source, determining if the one or more audio identifiers or if the information obtained from the one or more identifiers stored in memory corresponds to the transmission;
taking an action if there is a correspondence.
23. The portable device of claim 22 in which the signaling source comprises an iBeacon or Bluetooth transmitter.
24. The portable device of claim 23 in which the information obtained from the one or more audio identifiers comprises a discount code or coupon, and in which the action comprises applying the discount code or coupon to a financial transaction involving the portable device.
25. The portable device of claim 22 in which the processing audio comprises extracting fingerprints from the audio.
26. The portable device of claim 22 in which the processing audio comprises decoding digital watermarking hidden in the audio.
27. The portable device of claim 22 in which the action comprises prompting the user via a message displayed on the touch screen display.
28. The portable device of claim 22 in which the action comprises displaying an in-store map or directions to a product associated with the one or more identifiers.
29. A portable device comprising:
a touch screen display;
a microphone for capturing ambient audio;
memory for storing an image; and
one or more processors configured for:
generating copies of the stored image;
obtaining a payload corresponding to financial information; providing the payload to an erasure code generator, in
which the erasure code generator produces a plurality of outputs;
embedding one of the plurality of outputs in a copy of the stored image and proceeding with embedding until each of the plurality of outputs is so embedded in a copy of the stored image, in which the embedding utilizes digital watermarking;
causing the touch screen display to display embedded image copies so as to cause a static image display effect, the displayed embedded image copies being displayed by the portable device in response to a user input to enable a financial transaction.
30. The portable device of claim 29 in which the obtaining comprises generating the payload based on user input and on the financial information.
31. The portable device of claim 29 in which said one or more processors are configured to operate as the erasure code generator, and in which the erasure code generator comprises a fountain code generator, in which the fountain code generator produces the plurality of outputs, from which a receiver can reassemble the payload by obtaining a subset of the plurality of outputs, the subset being less than the plurality of outputs.
32. The portable device of claim 29 in which only one output of the plurality of outputs is embedded in any one image copy.
33. The portable device of claim 29 in which said one or more processors are configured for: i) generating a perceptibility map of the image, ii) storing the perceptibility map in said memory, and iii) reusing the perceptibility map when embedding the plurality of outputs in corresponding image copies.
34. The portable device of claim 29 further comprising an audio transmitter, in which said one or more processors are configured to cause said audio transmitter to transmit an audio signal corresponding to the financial information.
35. The portable device of claim 34 in which said audio transmitter comprises a high frequency audio transmitter.
36. The portable device of claim 35 in which the audio signal comprises a pin, key or hash.
37. The portable device of claim 29 in which the plurality of outputs comprises a subset of a total number of outputs provided by the erasure code generator.
38. The portable device of claim 29 in which said one or more processors are configured to interpret the user input which is received via said touch screen display.
39. The portable device of claim 29 in which said one or more processors are configured to cause the embedded image copies to be displayed so that a digital watermark reader analyzing captured image data representing the display can recover the payload.
EP14709848.7A 2013-02-26 2014-02-26 Methods and arrangements for smartphone payments and transactions Ceased EP2962262A4 (en)

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
US201361769701P 2013-02-26 2013-02-26
US13/792,764 US9965756B2 (en) 2013-02-26 2013-03-11 Methods and arrangements for smartphone payments
US13/873,117 US9830588B2 (en) 2013-02-26 2013-04-29 Methods and arrangements for smartphone payments
US201361825059P 2013-05-19 2013-05-19
US14/074,072 US20140258110A1 (en) 2013-03-11 2013-11-07 Methods and arrangements for smartphone payments and transactions
US201461938673P 2014-02-11 2014-02-11
US14/180,277 US9311640B2 (en) 2014-02-11 2014-02-13 Methods and arrangements for smartphone payments and transactions
PCT/US2014/018715 WO2014134180A2 (en) 2013-02-26 2014-02-26 Methods and arrangements for smartphone payments and transactions

Publications (2)

Publication Number Publication Date
EP2962262A2 true EP2962262A2 (en) 2016-01-06
EP2962262A4 EP2962262A4 (en) 2016-08-24

Family

ID=50272764

Family Applications (1)

Application Number Title Priority Date Filing Date
EP14709848.7A Ceased EP2962262A4 (en) 2013-02-26 2014-02-26 Methods and arrangements for smartphone payments and transactions

Country Status (3)

Country Link
EP (1) EP2962262A4 (en)
CN (1) CN105190659B (en)
WO (1) WO2014134180A2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11551208B2 (en) 2018-10-04 2023-01-10 Verifone, Inc. Systems and methods for point-to-point encryption compliance

Families Citing this family (50)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9965756B2 (en) 2013-02-26 2018-05-08 Digimarc Corporation Methods and arrangements for smartphone payments
US9311640B2 (en) 2014-02-11 2016-04-12 Digimarc Corporation Methods and arrangements for smartphone payments and transactions
US9830588B2 (en) 2013-02-26 2017-11-28 Digimarc Corporation Methods and arrangements for smartphone payments
EP3063608B1 (en) 2013-10-30 2020-02-12 Apple Inc. Displaying relevant user interface objects
US11461766B1 (en) 2014-04-30 2022-10-04 Wells Fargo Bank, N.A. Mobile wallet using tokenized card systems and methods
US9652770B1 (en) 2014-04-30 2017-05-16 Wells Fargo Bank, N.A. Mobile wallet using tokenized card systems and methods
US10043185B2 (en) 2014-05-29 2018-08-07 Apple Inc. User interface for payments
EP3162156B1 (en) 2014-06-27 2020-04-22 Techflux Inc. Method and device for transmitting data unit
WO2016036552A1 (en) 2014-09-02 2016-03-10 Apple Inc. User interactions for a mapping application
KR101697599B1 (en) * 2015-06-04 2017-01-18 엘지전자 주식회사 Mobile terminal and method for controlling the same
US9940637B2 (en) 2015-06-05 2018-04-10 Apple Inc. User interface for loyalty accounts and private label accounts
US20160358133A1 (en) 2015-06-05 2016-12-08 Apple Inc. User interface for loyalty accounts and private label accounts for a wearable device
US10275116B2 (en) 2015-06-07 2019-04-30 Apple Inc. Browser with docked tabs
US9955303B2 (en) 2015-07-21 2018-04-24 IP Funding Group, LLC Determining relative position with a BLE beacon
KR20170011181A (en) * 2015-07-21 2017-02-02 엘지전자 주식회사 Mobile terminal and paying method using extended display and finger scan thereof
KR102576420B1 (en) 2016-01-15 2023-09-08 삼성전자 주식회사 Method and device for displaying indication of payment methods
CN107067251B (en) * 2016-01-25 2021-08-24 苹果公司 Conducting transactions using electronic devices with geographically limited non-local credentials
DK201770423A1 (en) 2016-06-11 2018-01-15 Apple Inc Activity and workout updates
WO2017219269A1 (en) * 2016-06-22 2017-12-28 北京小米移动软件有限公司 Method and device for activating virtual card
CN106170809B (en) 2016-06-22 2020-09-01 北京小米支付技术有限公司 Virtual card display method and device
CN107665440A (en) * 2016-07-28 2018-02-06 腾讯科技(深圳)有限公司 Credit accounts system of selection and device
CN106921728A (en) 2016-08-31 2017-07-04 阿里巴巴集团控股有限公司 A kind of method for positioning user, information-pushing method and relevant device
US9842330B1 (en) 2016-09-06 2017-12-12 Apple Inc. User interfaces for stored-value accounts
RU2752007C2 (en) * 2017-03-23 2021-07-21 Мастеркард Интернэшнл Инкорпорейтед Digital wallet for supply and administration of tokens
WO2018221935A1 (en) * 2017-05-29 2018-12-06 Lg Electronics Inc. Mobile terminal and method of controlling same
EP4156129A1 (en) 2017-09-09 2023-03-29 Apple Inc. Implementation of biometric enrollment
KR102185854B1 (en) 2017-09-09 2020-12-02 애플 인크. Implementation of biometric authentication
CN107680608B (en) * 2017-09-27 2020-09-11 天津大学 Fountain code-based fragile watermark self-recovery method
CN110068328B (en) * 2018-01-22 2022-08-26 腾讯科技(深圳)有限公司 Navigation route generation method and device, computer readable storage medium and computer equipment
CN113421087A (en) * 2018-06-12 2021-09-21 创新先进技术有限公司 Payment processing method and device and server
CN109615379B (en) * 2018-10-24 2023-04-21 创新先进技术有限公司 Generating method and device of rejection processing system
CN109598668B (en) * 2018-12-05 2023-03-14 吉林大学 Touch form digital watermark embedding and detecting method based on electrostatic force
US11328352B2 (en) 2019-03-24 2022-05-10 Apple Inc. User interfaces for managing an account
US11551190B1 (en) 2019-06-03 2023-01-10 Wells Fargo Bank, N.A. Instant network cash transfer at point of sale
IT201900019241A1 (en) * 2019-10-18 2021-04-18 Gaetano Rizzi METHOD AND SYSTEM FOR THE CONTROL OF ELECTRONIC PAYMENTS.
CN110991234A (en) * 2019-10-29 2020-04-10 深圳市龙岳科技有限公司 Face recognition equipment and auxiliary authentication method
CN110807502B (en) * 2019-10-31 2024-04-09 天星数科科技有限公司 NFC intelligent card configuration method and device
CN110866580B (en) * 2019-10-31 2024-01-16 天星数科科技有限公司 Preprocessing method and device for configuring NFC smart card
US11829499B2 (en) * 2020-03-26 2023-11-28 Bank Of America Corporation Securing PIN information using obfuscation by applying extra security layer
US11206544B2 (en) * 2020-04-13 2021-12-21 Apple Inc. Checkpoint identity verification on validation using mobile identification credential
US11314395B2 (en) 2020-05-29 2022-04-26 Apple Inc. Sharing and using passes or accounts
JP6889892B1 (en) * 2020-12-21 2021-06-18 株式会社Kpiソリューションズ Management system, server equipment, programs and methods
CN114979744B (en) * 2021-02-25 2024-03-19 腾讯科技(深圳)有限公司 Interface processing method, device, server and storage medium
US11981181B2 (en) 2021-04-19 2024-05-14 Apple Inc. User interfaces for an electronic key
CN113489763B (en) * 2021-06-18 2023-11-21 深圳软牛科技有限公司 Method, device, equipment and storage medium for closing and searching My mobile phone function
US11556264B1 (en) 2021-07-26 2023-01-17 Bank Of America Corporation Offline data transfer between devices using gestures
US20230063333A1 (en) * 2021-08-30 2023-03-02 Mastercard International Incorporated Data analysis to determine offers made to credit card customers
US11995621B1 (en) 2021-10-22 2024-05-28 Wells Fargo Bank, N.A. Systems and methods for native, non-native, and hybrid registration and use of tags for real-time services
US20230230067A1 (en) * 2022-01-20 2023-07-20 VocaLink Limited Tokenized control of personal data
CN116868740A (en) * 2023-06-30 2023-10-13 广东环境保护工程职业学院 Plant maintenance method, device, system and medium

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8285639B2 (en) * 2005-07-05 2012-10-09 mConfirm, Ltd. Location based authentication system
CN1893663A (en) * 2005-09-02 2007-01-10 华为技术有限公司 Transmission protection method of multi-media communication
US8031207B2 (en) * 2008-06-04 2011-10-04 Mastercard International, Inc. Card image description format to economize on data storage
US10839384B2 (en) * 2008-12-02 2020-11-17 Paypal, Inc. Mobile barcode generation and payment
CN101702640B (en) * 2009-10-15 2013-03-20 北京网御星云信息技术有限公司 Method and device for transmitting data in unidirectional network
US9258715B2 (en) * 2009-12-14 2016-02-09 Apple Inc. Proactive security for mobile devices
US9691055B2 (en) * 2010-12-17 2017-06-27 Google Inc. Digital wallet
US8725652B2 (en) * 2011-03-29 2014-05-13 Visa International Service Association Using mix-media for payment authorization
EP2707847A4 (en) * 2011-05-10 2015-04-01 Dynamics Inc Systems, devices, and methods for mobile payment acceptance, mobile authorizations, mobile wallets, and contactless communication mechanisms
SG195079A1 (en) * 2011-06-03 2013-12-30 Visa Int Service Ass Virtual wallet card selection apparatuses, methods and systems

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11551208B2 (en) 2018-10-04 2023-01-10 Verifone, Inc. Systems and methods for point-to-point encryption compliance

Also Published As

Publication number Publication date
WO2014134180A2 (en) 2014-09-04
EP2962262A4 (en) 2016-08-24
CN105190659A (en) 2015-12-23
WO2014134180A3 (en) 2015-01-08
CN105190659B (en) 2021-02-05

Similar Documents

Publication Publication Date Title
US11049094B2 (en) Methods and arrangements for device to device communication
CN105190659B (en) Method, apparatus and arrangement for device-to-device communication
US20140244514A1 (en) Methods and arrangements for smartphone payments and transactions
US9830588B2 (en) Methods and arrangements for smartphone payments
US20140258110A1 (en) Methods and arrangements for smartphone payments and transactions
US9965756B2 (en) Methods and arrangements for smartphone payments
US11563583B2 (en) Systems and methods for content management using contactless cards
US8977234B2 (en) Using low-cost tags to facilitate mobile transactions
US9721237B2 (en) Animated two-dimensional barcode checks
US20180189767A1 (en) Systems and methods for utilizing payment card information with a secure biometric processor on a mobile device
AU2021376233B2 (en) Initiating a device security setting on detection of conditions indicating a fraudulent capture of a machine-readable code
US20140100973A1 (en) Smartphone virtual payment card
US20130290707A1 (en) Information distribution system
US11151562B2 (en) Secure passcode entry using mobile device with augmented reality capability
KR20120116902A (en) A personalized multifunctional access device possessing an individualized form of authenticating and controlling data exchange
KR20170096940A (en) Encrypted electronic gaming ticket
JP2007323249A (en) Settlement system
TWI474271B (en) Electronic payment system
AU2013201574B1 (en) An information distribution system

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20150921

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

RIN1 Information on inventor provided before grant (corrected)

Inventor name: SHARMA, RAVI, K.

Inventor name: DAVIS, BRUCE, L.

Inventor name: FILLER, TOMAS

Inventor name: MACINTOSH, BRIAN, T.

Inventor name: RODRIGUEZ, TONY, F.

DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20160721

RIC1 Information provided on ipc code assigned before grant

Ipc: G06Q 30/06 20120101AFI20160715BHEP

17Q First examination report despatched

Effective date: 20180221

REG Reference to a national code

Ref country code: DE

Ref legal event code: R003

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED

18R Application refused

Effective date: 20200505