AU2016101802A4 - Data mesh platform - Google Patents

Data mesh platform

Info

Publication number
AU2016101802A4
Authority
AU
Australia
Prior art keywords
user
module
activity
data
attribute
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
AU2016101802A
Inventor
Robert Lee
Ryan Melcher
John Tapley
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
eBay Inc
Original Assignee
eBay Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from PCT/US2015/022318 external-priority patent/WO2015148559A1/en
Application filed by eBay Inc filed Critical eBay Inc
Priority to AU2016101802A priority Critical patent/AU2016101802A4/en
Application granted granted Critical
Publication of AU2016101802A4 publication Critical patent/AU2016101802A4/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Links

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

In various example embodiments, a system and method for a data mesh platform are presented. Attribute data associated with a user is accessed from a plurality of attribute sources. An analysis of the attribute data is performed. Based on the analysis, an action corresponding to the user is performed.

Description

DATA MESH PLATFORM
RELATED APPLICATIONS
[0001] This international application claims the benefit of priority to U.S. Provisional Application No. 61/970,263, filed March 25, 2014, U.S. Patent Application No. 14/449,113, filed July 31, 2014, U.S. Patent Application No. 14/449,126, filed July 31, 2014, and to U.S. Patent Application No. 14/498,326, filed September 26, 2014, the entire contents of which are hereby incorporated by reference herein in their entirety.
TECHNICAL FIELD
[0002] Embodiments of the present disclosure relate generally to a data mesh platform.
BACKGROUND
[0003] In recent years, mobile devices, wearable devices, smart devices, and the like have pervaded nearly every aspect of modern life. Such devices are increasingly incorporating sensors to monitor everything from the moisture level of houseplants to the dribbling of a basketball. Network-connected devices like these are capable of providing a near real-time and constant data feed. These trends have provided a vast amount of rich, constantly updated data.
EXAMPLES
[0004] To better illustrate the apparatus and methods disclosed herein, a non-limiting list of examples is provided here:
[0005] Example 1. A system comprising: an attribute module to receive attribute data associated with a user from a plurality of attribute sources; an item module to extract demand indications from the attribute data, the demand indications being indicative of anticipatory demand by the user for a particular item; an analysis module, implemented by a hardware processor of a machine, to identify a pertinent item from the attribute data based on the extracted demand indications; a characteristic module to infer user characteristics pertaining to the user from the attribute data; and an order module to determine transaction parameters for a suggested transaction associated with the pertinent item based, at least in part, on the user characteristics and to facilitate the suggested transaction according to the determined transaction parameters.
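The item-and-analysis portion of the pipeline in Example 1 might be sketched as follows. This is a hedged illustration only: the `AttributeData` class, the `demand_signal` field, and the highest-aggregate-signal rule are assumptions for the sketch, not definitions from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class AttributeData:
    user_id: str
    records: list = field(default_factory=list)  # e.g. sensor readings, purchase history

def extract_demand_indications(data: AttributeData) -> list:
    # Item module: keep only records carrying a demand signal.
    return [r for r in data.records if r.get("demand_signal")]

def identify_pertinent_item(indications: list):
    # Analysis module: pick the item with the strongest aggregate signal.
    totals = {}
    for ind in indications:
        totals[ind["item"]] = totals.get(ind["item"], 0.0) + ind["demand_signal"]
    return max(totals, key=totals.get) if totals else None

data = AttributeData("u1", [
    {"item": "coffee", "demand_signal": 0.8},
    {"item": "coffee", "demand_signal": 0.5},
    {"item": "tea", "demand_signal": 0.3},
])
print(identify_pertinent_item(extract_demand_indications(data)))  # coffee
```

The characteristic and order modules would consume the same `AttributeData` to fill in transaction parameters; they are omitted here for brevity.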
[0006] Example 2. The system of example 1, wherein the at least one order parameter includes at least one of a quantity, a delivery time, a payment time, a delivery method, a delivery destination, a merchant, or a product.
[0007] Example 3. A method comprising: receiving attribute data associated with a user from a plurality of attribute sources; extracting demand indications from the attribute data, the demand indications being indicative of anticipatory demand by the user for a particular item; identifying, using a hardware processor of a machine, a commerce item from the attribute data based on the extracted demand indications; inferring user characteristics pertaining to the user from the attribute data; determining order parameters for a user purchase associated with the commerce item based, at least in part, on the inferred user characteristics; and facilitating the user purchase according to the determined order parameters.
[0008] Example 4. The method of example 3, wherein the at least one order parameter includes at least one of a quantity, a delivery time, a payment time, a delivery method, a delivery destination, a merchant, or a product.
[0009] Example 5. The method of example 3, further comprising: extracting a current inventory level of the commerce item from the attribute data; determining an inventory threshold for the commerce item by modeling usage of the commerce item based on the extracted current inventory level and the inferred user characteristics; identifying a mismatch between the inventory threshold and the current inventory level; and based on the mismatch, automatically performing the user purchase on behalf of the user.
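The inventory logic of Example 5 can be sketched with a simple usage model. The model itself (daily usage rate times restock lead time, plus a safety factor) is an assumption introduced for illustration; the claim only requires that usage be modeled from the inventory level and the inferred user characteristics.

```python
def inventory_threshold(daily_usage_rate: float, restock_lead_days: float,
                        safety_factor: float = 1.5) -> float:
    # Keep enough stock to cover the restock lead time plus a safety margin.
    return daily_usage_rate * restock_lead_days * safety_factor

def should_auto_purchase(current_level: float, daily_usage_rate: float,
                         restock_lead_days: float) -> bool:
    # Mismatch in Example 5: current level has fallen below the threshold.
    return current_level < inventory_threshold(daily_usage_rate, restock_lead_days)

# A user consuming 2 units/day with a 3-day restock lead needs 9 units on hand.
print(should_auto_purchase(current_level=4, daily_usage_rate=2, restock_lead_days=3))  # True
```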
[0010] Example 6. The method of example 3, further comprising: identifying a purchase motive of the user for the commerce item by analyzing the inferred user characteristics, the purchase motive corresponding to a motive time; determining temporal order parameters, included in the order parameters, based on the motive time; and facilitating the user purchase according to the determined temporal order parameters.
[0011] Example 7. The method of example 3, further comprising: identifying similar users, from among a plurality of other users, that are similar to the user based on the inferred user characteristics and respective user characteristics of the plurality of other users; and determining the order parameters based on the user characteristics of the identified similar users.
[0012] Example 8. The method of example 3, further comprising: accessing purchase criteria corresponding to the user; and automatically purchasing the commerce item on behalf of the user according to the purchase criteria.
[0013] Example 9. The method of example 8, wherein the purchase criteria include at least one criterion corresponding to a budget; and wherein the automatically purchasing the commerce item on behalf of the user is based, at least in part, on the budget.
[0014] Example 10. The method of example 8, further comprising: determining an item category for the commerce item, the purchase criteria including criteria corresponding to the item category; and facilitating the user purchase of the commerce item according to the purchase criteria corresponding to the determined item category.
[0015] Example 11. The method of example 3, further comprising: generating a notification that includes an option to make the user purchase, the notification including the determined order parameters; causing presentation of the notification to the user; receiving a user selection of the option to make the user purchase; and responsive to receiving the user selection, performing the user purchase according to the determined order parameters.
[0016] Example 12. The method of example 11, further comprising: identifying presentation parameters for presentation of the notification to the user based on the inferred user characteristics, the presentation parameters including a presentation time and a presentation device; and causing presentation of the notification according to the presentation parameters.
[0017] Example 13. The method of example 11, further comprising: adapting the presentation of the notification to the user based, at least in part, on the inferred user characteristics.
[0018] Example 14. The method of example 11, further comprising: detecting a trigger action of the user based on real-time data included in the attribute data; and based on the detected trigger action, causing presentation of the notification to the user.
[0019] Example 15. The method of example 3, further comprising: calculating a demand metric for the commerce item based on the demand indications corresponding to the commerce item; and facilitating the user purchase associated with the commerce item based, at least in part, on the demand metric.
[0020] Example 16. The method of example 15, further comprising: automatically performing the user purchase on behalf of the user based on the demand metric exceeding a threshold.
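One plausible way to realize the demand metric of Examples 15 and 16 is to aggregate the per-item demand indications with an exponential decay, so that recent indications count more. The decay weighting and the threshold value are illustrative assumptions, not part of the claims.

```python
def demand_metric(indications: list, decay: float = 0.9) -> float:
    # indications: demand-indication strengths ordered oldest -> newest.
    # Newer indications contribute more because earlier ones decay each step.
    metric = 0.0
    for strength in indications:
        metric = metric * decay + strength
    return metric

indications = [0.2, 0.4, 0.9]           # three observations for one commerce item
metric = demand_metric(indications)
print(metric > 1.0)  # True: 1.422 exceeds the (assumed) auto-purchase threshold
```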
[0021] Example 17. The method of example 15, further comprising: based on the demand metric exceeding a threshold, generating a notification providing the user an option to purchase the commerce item, the notification including the determined order parameters; and causing presentation of the notification to the user.
[0022] Example 18. A machine readable medium storing instructions that, when executed by at least one processor of a machine, cause the machine to perform operations comprising: receiving attribute data associated with a user from a plurality of attribute sources; extracting demand indications from the attribute data, the demand indications being indicative of anticipatory demand by the user for a particular item; identifying a commerce item from the attribute data based on the extracted demand indications; inferring user characteristics pertaining to the user from the attribute data; determining order parameters for a user purchase associated with the commerce item based, at least in part, on the inferred user characteristics; and facilitating the user purchase according to the determined order parameters.
[0023] Example 19. The machine-readable medium of example 18, wherein the at least one order parameter includes at least one of a quantity, a delivery time, a payment time, a delivery method, a delivery destination, a merchant, or a product.
[0024] Example 20. The machine-readable medium of example 18, wherein the operations further comprise: extracting a current inventory level of the commerce item from the attribute data; determining an inventory threshold for the commerce item by modeling usage of the commerce item based on the extracted current inventory level and the inferred user characteristics; identifying a mismatch between the inventory threshold and the current inventory level; and based on the mismatch, automatically performing the user purchase on behalf of the user.
[0025] Example 21. A system comprising: an attribute module to receive attribute data associated with a user from a plurality of attribute sources, the attribute data including real-time data; an authentication module, implemented by a hardware processor of a machine, to identify a portion of the real-time data indicative of user identification, and to authenticate the user identification with respect to the real-time data based on an analysis of the identified portion of the real-time data; and an activity module to, in response to the analysis of the portion of the real-time data authenticating the user identification, identify a user objective being advanced by the user based on the real-time data, and to augment an environment of the user based on a user setting to facilitate advancement towards the user objective.
[0026] Example 22. The system of example 21, wherein the authentication module is further to: calculate an identity likelihood metric based on the real-time data, the identity likelihood metric indicating a likelihood of authentication of the user identification with respect to the real-time data; and authenticate the user identification based on the identity likelihood metric exceeding a threshold.
[0027] Example 23. The system of example 22, wherein the authentication module is further to identify, from the real-time data, a portable device that corresponds to the user, wherein the identity likelihood metric is based on the identified portable device.
[0028] Example 24. The system of example 22, wherein the real-time data includes sensor data corresponding to the user and wherein the identity likelihood metric is based in part on the sensor data.
[0029] Example 25. The system of example 22, wherein the authentication module is further to: extract past identification indications from past attribute data that corresponds to the user, wherein the attribute data includes the past attribute data; extract real-time identification indications from the real-time data corresponding to the user; and calculate the identity likelihood metric by correlating the real-time identification indications with the past identification indications.
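Example 25's correlation of real-time identification indications against stored past indications could be sketched as a vector similarity. Using cosine similarity as the "correlation," and the particular feature values and 0.95 threshold, are assumptions made for this sketch.

```python
import math

def identity_likelihood(past: list, current: list) -> float:
    # past, current: equal-length feature vectors of identification
    # indications (e.g. gait cadence, typing rhythm, device proximity).
    dot = sum(p * c for p, c in zip(past, current))
    norm = (math.sqrt(sum(p * p for p in past))
            * math.sqrt(sum(c * c for c in current)))
    return dot / norm if norm else 0.0  # cosine similarity in [0, 1] here

past_profile = [0.8, 0.6, 0.7]    # stored indications for the user
live_sample = [0.75, 0.62, 0.71]  # indications from the real-time feed
authenticated = identity_likelihood(past_profile, live_sample) > 0.95
print(authenticated)  # True: the live sample closely matches the profile
```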
[0030] Example 26. A method comprising: receiving attribute data associated with a user from a plurality of attribute sources, the attribute data including real-time data; identifying a portion of the real-time data indicative of an identity of the user; authenticating, using a hardware processor of a machine, the identity of the user with respect to the real-time data by analyzing the identified portion of the real-time data; based on the authenticating of the identity of the user, identifying a user activity being performed by the user based on the real-time data; and augmenting the user activity according to a user setting.
[0031] Example 27. The method of example 26, further comprising: calculating an identity likelihood metric based on the identified portion of the real-time data, the identity likelihood metric indicating a likelihood of authentication of the identity of the user with respect to the real-time data; and authenticating the identity of the user based on the identity likelihood metric exceeding a threshold.
[0032] Example 28. The method of example 27, further comprising: identifying, from the real-time data, a portable device corresponding to the user, wherein the identity likelihood metric is based on the identified portable device.
[0033] Example 29. The method of example 27, wherein the real-time data includes sensor data corresponding to the user and wherein the identity likelihood metric is based, at least in part, on the sensor data.
[0034] Example 30. The method of example 27, further comprising: extracting past identification indications from past attribute data corresponding to the user, wherein the attribute data includes the past attribute data; extracting real-time identification indications from the real-time data corresponding to the user; and calculating the identity likelihood metric by correlating the real-time identification indications with the past identification indications.
[0035] Example 31. The method of example 27, further comprising: adjusting a security level of an authorization task based on the identity likelihood metric, the user activity including the authorization task.
[0036] Example 32. The method of example 31, wherein the adjusting the security level of the authorization task includes automatically performing the authorization task on behalf of the user.
[0037] Example 33. The method of example 26, further comprising: determining the user setting based on the attribute data and the user activity; and augmenting the user activity according to the determined user setting.
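Examples 31 and 32 describe scaling an authorization task's security level with the identity likelihood metric, up to performing the task automatically. A minimal sketch of such a tiering follows; the tier names and boundary values (0.95, 0.70) are assumptions, not values from the claims.

```python
def security_level(identity_likelihood: float) -> str:
    # Higher confidence in the user's identity means less authorization friction.
    if identity_likelihood >= 0.95:
        return "auto-approve"    # Example 32: perform the task on the user's behalf
    if identity_likelihood >= 0.70:
        return "single-factor"   # e.g. a PIN confirmation
    return "multi-factor"        # full challenge for low-confidence identifications

print(security_level(0.98))  # auto-approve
print(security_level(0.50))  # multi-factor
```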
[0038] Example 34. The method of example 33, further comprising: inferring user characteristics based on an analysis of a portion of the attribute data; determining the user setting based on the inferred user characteristics and the user activity; and augmenting the user activity according to the determined user setting.
[0039] Example 35. The method of example 34, further comprising: identifying similar users that are similar to the user based on the inferred user characteristics and respective user characteristics of the similar users; and determining the user setting based on the user characteristics of the identified similar users.
[0040] Example 36. The method of example 26, further comprising: determining that the user activity includes presentation of a user interface to the user; identifying presentation devices available to the user based on the attribute data, the presentation devices being capable of presenting the user interface to the user; determining an alternate presentation device from among the identified presentation devices based on the user setting; and causing presentation of the user interface to the user on the alternate presentation device.
[0041] Example 37. The method of example 26, further comprising: determining a current location of the user based on the real-time data; accessing device location data included in the attribute data; identifying user devices within an operating distance of the user based on the current location of the user and the device location data; and augmenting operation of the identified user devices according to the user setting.
[0042] Example 38. The method of example 26, further comprising: identifying a user action of the user based on the real-time data, the user action being in response to the augmented user activity; inferring an augmentation result of employing the user setting to augment the user activity; and storing the augmentation result to be used subsequently to determine the user setting.
[0043] Example 39. A machine readable medium having no transitory signals and storing instructions that, when executed by at least one processor of a machine, cause the machine to perform operations comprising: receiving attribute data associated with a user from a plurality of attribute sources, the attribute data including real-time data; identifying a portion of the real-time data indicative of an identity of the user; authenticating the identity of the user with respect to the real-time data by analyzing the identified portion of the real-time data; based on the authenticating of the identity of the user, identifying a user activity being performed by the user based on the real-time data; and modifying the user activity according to a user setting.
[0044] Example 40. The machine-readable medium of example 39, wherein the operations further comprise: calculating an identity likelihood metric based on the identified portion of the real-time data, the identity likelihood metric indicating a likelihood of authentication of the identity of the user with respect to the real-time data; and authenticating the identity of the user based on the identity likelihood metric exceeding a threshold.
[0045] Example 41. A system comprising: an attribute module to receive attribute data associated with a user from a plurality of attribute sources; a characteristic module to infer user characteristics based on an analysis of at least a portion of the attribute data; a visualization module, implemented by a hardware processor of a machine, to generate a visualization based at least in part on the user characteristics, the visualization being representative of the attribute data; and a presentation module to cause presentation of the visualization to the user.
[0046] Example 42. The system of example 41, wherein the presentation module is further to receive user input indicating a change to the visualization, the visualization module further to update the visualization according to the change indicated by the user input, and the characteristic module further to infer subsequent user characteristics based, at least in part, on the user input.
[0047] Example 43. The system of example 42, wherein the user input comprises a user interaction with the visualization.
[0048] Example 44. The system of example 41, further comprising: an analysis module to identify similar users from among a plurality of other users that are similar to the user based on the inferred user characteristics and respective user characteristics of the plurality of other users; and the visualization module further to generate the visualization based on the user characteristics of the identified similar users.
[0049] Example 45. The system of example 41, further comprising an analysis module to: determine satisfaction of reward criteria associated with the attribute data; and provide a reward to the user based on the determined satisfaction of the reward criteria, wherein the reward comprises a visualization feature.
[0050] Example 46. The system of example 45, wherein the analysis module is further to: calculate a completeness metric based on the attribute data, the reward criteria including a criterion based on the completeness metric.
[0051] Example 47. The system of example 46, wherein the completeness metric indicates a quantity of a specified type of data included in the attribute data and wherein the criterion based on the completeness metric is satisfied by the user providing the specified type of data.
[0052] Example 48. The system of example 45, wherein the analysis module is further to: calculate a quality metric based on the attribute data, the reward criteria including a criterion based on the quality metric.
[0053] Example 49. The system of example 48, wherein the quality metric indicates a recentness of the attribute data and wherein the criterion based on the quality metric is satisfied by the user providing recent data.
[0054] Example 50. A method comprising: receiving attribute data associated with a user from a plurality of attribute sources; inferring user characteristics pertaining to the user from the attribute data, the user characteristics including a physical characteristic of the user; generating, using a hardware processor of a machine, an avatar representative of the user based on the inferred user characteristics, the avatar including an avatar trait corresponding to the physical characteristic of the user; and causing presentation of the avatar to the user.
[0055] Example 51. The method of example 50, further comprising: receiving user input indicating a change to the avatar; and updating the avatar according to the change indicated by the user input, subsequent inferred user characteristics being inferred based, at least in part, on the user input.
[0056] Example 52. The method of example 51, wherein the user input comprises a user interaction with the avatar.
[0057] Example 53. The method of example 50, further comprising: identifying similar users, from among a plurality of other users, that are similar to the user based on the inferred user characteristics and respective user characteristics of the plurality of other users; and determining the avatar trait based on the user characteristics of the identified similar users.
[0058] Example 54. The method of example 50, further comprising: determining satisfaction of reward criteria associated with the attribute data; and providing a reward to the user based on the determined satisfaction of the reward criteria, wherein the reward comprises an avatar feature.
[0059] Example 55. The method of example 54, further comprising: calculating a completeness metric based on the attribute data, the reward criteria including a criterion based on the completeness metric.
[0060] Example 56. The method of example 55, wherein the completeness metric indicates a quantity of a specified type of data included in the attribute data and wherein the criterion based on the completeness metric is satisfied by the user providing the specified type of data.
[0061] Example 57. The method of example 54, further comprising: calculating a quality metric based on the attribute data, the reward criteria including a criterion based on the quality metric.
[0062] Example 58. The method of example 57, wherein the quality metric indicates a recentness of the attribute data and wherein the criterion based on the quality metric is satisfied by the user providing recent data.
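The completeness and quality metrics of Examples 55 through 58 can be sketched concretely: completeness as a count of records of a specified type, and quality as an exponentially decaying recentness score. The record layout, field names, and 30-day half-life are illustrative assumptions for this sketch.

```python
import time

def completeness_metric(attribute_data: list, specified_type: str) -> int:
    # Number of records of the specified type the user has provided.
    return sum(1 for r in attribute_data if r["type"] == specified_type)

def quality_metric(attribute_data: list, now: float,
                   half_life_days: float = 30.0) -> float:
    # Recentness score in (0, 1]: 1.0 for brand-new data, halving every
    # half_life_days of age, averaged over all records.
    if not attribute_data:
        return 0.0
    ages_days = [(now - r["timestamp"]) / 86400.0 for r in attribute_data]
    return sum(0.5 ** (age / half_life_days) for age in ages_days) / len(ages_days)

now = time.time()
data = [
    {"type": "fitness", "timestamp": now - 86400},        # 1 day old
    {"type": "location", "timestamp": now - 86400 * 60},  # 60 days old
]
print(completeness_metric(data, "fitness"))  # 1
print(quality_metric(data, now))             # roughly 0.61 for this data
```

A reward criterion in the claims would then be a threshold on either metric, satisfied when the user supplies the specified (or sufficiently recent) data.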
[0063] Example 59. The method of example 54, further comprising: identifying similar users from among a plurality of other users that are similar to the user based on the inferred user characteristics and respective user characteristics of the plurality of other users; and determining satisfaction of the reward criteria based on the attribute data associated with the identified similar users.
[0064] Example 60. A machine readable medium storing instructions that, when executed by at least one processor of a machine, cause the machine to perform operations comprising: receiving attribute data associated with a user from a plurality of attribute sources; inferring user characteristics from the attribute data, the user characteristics pertaining to the user; generating a user interface including an avatar representative of the user based on the inferred user characteristics, the user characteristics including a physical characteristic of the user, the avatar including an avatar trait corresponding to the physical characteristic of the user; and causing presentation of the user interface in a device of the user.
BRIEF DESCRIPTION OF THE DRAWINGS
[0065] Various ones of the appended drawings merely illustrate example embodiments of the present disclosure and should not be considered as limiting its scope.
[0066] FIG. 1 is a block diagram illustrating a networked system, according to some example embodiments.
[0067] FIG. 2 is a block diagram illustrating an example embodiment of a data mesh system, according to some example embodiments.
[0068] FIG. 3 is a block diagram illustrating an example embodiment of an ancillary activity system, according to some example embodiments.
[0069] FIG. 4 illustrates an example of generating ancillary activity for a secondary user device, according to some example embodiments.
[0070] FIG. 5 is a flow diagram illustrating an example method for generating ancillary activity for a secondary user device, according to some example embodiments.
[0071] FIG. 6 is a flow diagram illustrating further operations for inferring a user preference from the attribute data, according to some example embodiments.
[0072] FIGS. 7 and 8 are flow diagrams illustrating further operations for facilitating identifying a secondary user device, according to some example embodiments.
[0073] FIGS. 9 and 10 are flow diagrams illustrating further operations for generating an ancillary activity for a secondary user device, according to some example embodiments.
[0074] FIG. 11 illustrates an example scene showing a presentation of ancillary activity to the user, according to some example embodiments.
[0075] FIGS. 12 and 13 depict example user interfaces that present ancillary activity, according to some example embodiments.
[0076] FIG. 14 is a block diagram illustrating an example embodiment of a user analytics system, according to some example embodiments.
[0077] FIG. 15 is a flow diagram illustrating an example method for identifying an item and facilitating a purchase associated with the identified item, according to some example embodiments.
[0078] FIG. 16 is a flow diagram illustrating further operations for facilitating the purchase based, at least in part, on an evaluation of an inventory level, according to some example embodiments.
[0079] FIG. 17 is a flow diagram illustrating further operations for facilitating the purchase including operations to determine parameters for the purchase, according to some example embodiments.
[0080] FIG. 18 is a flow diagram illustrating further operations for determining order parameters including operations to determine a temporal parameter associated with the purchase, according to some example embodiments.
[0081] FIG. 19 is a flow diagram illustrating further operations to facilitate the purchase based, at least in part, on purchase criteria, according to some example embodiments.
[0082] FIG. 20 is a flow diagram illustrating a further example method for identifying an item and facilitating a purchase, according to some example embodiments.
[0083] FIG. 21 is a flow diagram illustrating an alternative example method for identifying an item and facilitating a purchase, according to some example embodiments.
[0084] FIG. 22 is a flow diagram illustrating further operations to facilitate the purchase based, at least in part, on a demand metric, according to some example embodiments.
[0085] FIG. 23 is a flow diagram illustrating further operations to facilitate the purchase using a notification, according to some example embodiments.
[0086] FIGS. 24 and 25 are flow diagrams illustrating further operations for presenting a notification, according to some example embodiments.
[0087] FIG. 26 is a flow diagram illustrating communication between various devices in relation to presenting a notification to the user, according to some example embodiments.
[0088] FIG. 27 depicts an example user interface to facilitate the purchase, according to some example embodiments.
[0089] FIGS. 28 and 29 illustrate examples of identifying an item and facilitating a purchase associated with the identified item, according to some example embodiments.
[0090] FIG. 30 is a block diagram illustrating an example embodiment of an augmentation system, according to some example embodiments.
[0091] FIG. 31 is a flow diagram illustrating an example method for authenticating a user and augmenting a user activity, according to some example embodiments.
[0092] FIGS. 32 and 33 are flow diagrams illustrating further example operations of the method of FIG. 31, according to some example embodiments.
[0093] FIG. 34 depicts communication between a device of the user and the data mesh system, according to some example embodiments.
[0094] FIGS. 35-38 are flow diagrams illustrating further example operations of the method of FIG. 31, according to some example embodiments.
[0095] FIG. 39 illustrates augmenting an example user activity, according to some example embodiments.
[0096] FIG. 40 depicts an example user interface to facilitate augmentation of a user activity, according to some example embodiments.
[0097] FIG. 41 is a flow diagram illustrating various communications to facilitate the method of FIG. 31, according to some example embodiments.
[0098] FIG. 42 is a block diagram illustrating an example embodiment of a visualization system, according to some example embodiments.
[0099] FIG. 43 is a flow diagram illustrating an example method for generating a visualization, according to some example embodiments.
[00100] FIG. 44 is a flow diagram illustrating further example operations of the method of FIG. 43, according to some example embodiments.
[00101] FIG. 45 is a flow diagram illustrating an example method for determining satisfaction of reward criteria, according to some example embodiments.
[00102] FIG. 46 is a flow diagram illustrating further example operations of the method of FIG. 43, according to some example embodiments.
[00103] FIG. 47 is a flow diagram illustrating various communications to facilitate the method of FIG. 43, according to some example embodiments.

[00104] FIGS. 48, 49, 50A and 50B depict example user interfaces including example visualizations, according to some example embodiments.
[00105] FIGS. 51A and 51B depict example configurations for communicatively coupling attribute sources, according to some example embodiments.

[00106] FIG. 52 depicts various example attribute sources, according to some example embodiments.
[00107] FIG. 53 depicts various components that provide attribute data, according to some example embodiments.
[00108] FIG. 54 is a block diagram of an example data structure for example attribute data associated with a user, according to some example embodiments.
[00109] FIG. 55 is a block diagram of an example data structure for example attribute data associated with a device, according to some example embodiments.
[00110] FIG. 56 is a block diagram illustrating an example of a software architecture that may be installed on a machine, according to some example embodiments.

[00111] FIG. 57 illustrates a diagrammatic representation of a machine in the form of a computer system within which a set of instructions may be executed for causing the machine to perform any one or more of the methodologies discussed herein, according to an example embodiment.
DETAILED DESCRIPTION
[00112] The description that follows includes systems, methods, techniques, instruction sequences, and computing machine program products that embody illustrative embodiments of the disclosure. In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide an understanding of various embodiments of the inventive subject matter. It will be evident, however, to those skilled in the art, that embodiments of the inventive subject matter may be practiced without these specific details.
In general, well-known instruction instances, protocols, structures, and techniques are not necessarily shown in detail.

[00113] Modern users of technology typically have a variety of devices, such as laptop computers, smart televisions, smart phones, and wearable devices, to perform various tasks. Traditionally, a user of such devices operates a single device to perform a task at one time. The discussion below describes systems and methods that, in some embodiments, utilize multiple devices to perform, in real time, an ancillary, complementary, or supplemental activity associated with a particular activity or task. In some embodiments, the systems and methods distribute a portion of a particular activity or task across multiple devices. The ancillary or supplemental activity associated with a particular device activity may be determined based on a variety of factors including an inferred user preference, device functionality, and the device activity. Thus, the ancillary activity is dynamically determined and performed for the user in real time to assist the user with the device activity.

[00114] In various example embodiments, a device activity being performed in real time by a user device of a user is detected. For example, the user may be browsing a website on a mobile device or a laptop computer. Once the device activity is detected, attribute data associated with the user from a plurality of attribute sources is accessed. In various example embodiments, the attribute data is received or accessed from a broad range of attribute sources such as, for example, from mobile devices, smart devices, smart homes, social network services, user profiles, browsing histories, purchase histories, and so forth.
[00115] A user preference is inferred indicating a preference or desire of the user for performing on a secondary user device an ancillary, supplemental, complementary, or companion activity corresponding to the device activity. For example, an analysis of the attribute data can indicate the user would like to view supplemental content corresponding to the device activity. In this example, the ancillary activity comprises presenting the supplemental content to the user on the secondary user device.
[00116] Based on the inferred user preference, the secondary user device can be identified according to a device status of the secondary user device, according to some example embodiments. In various implementations, the device status indicates a device capability to perform the ancillary activity in real time. For example, if the ancillary activity comprises presenting content to the user, the secondary user device may be incapable of performing such a task if the secondary user device is not within a vicinity of the user (e.g., within a presentation distance of the user).
[00117] Once the secondary user device is identified, the ancillary activity, to be performed in real time on the secondary user device, is generated by analyzing the device activity, a device functionality of the secondary user device, and the user preference. For example, if the device activity comprises providing directions to the user, the ancillary activity may comprise providing content associated with providing directions, such as a current heading or a distance-to-destination readout. In various implementations, the ancillary activity is caused to be performed on the secondary user device or transmitted to the secondary user device with an instruction to perform the ancillary activity in real time.
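The secondary-device identification described above reduces to a status check: a candidate device must have the needed capability and be within presentation distance of the user. The following is a minimal sketch; the `Device` record, its field names, and the 10-meter presentation distance are illustrative assumptions, not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class Device:
    name: str
    can_display: bool   # device functionality needed for the ancillary activity
    distance_m: float   # distance from the user, e.g. via beacon ranging

def pick_secondary_device(devices, max_distance_m=10.0):
    """Return the first device whose status allows presenting content:
    it has a display and is within presentation distance of the user."""
    for d in devices:
        if d.can_display and d.distance_m <= max_distance_m:
            return d
    return None

devices = [
    Device("fitness-band", can_display=False, distance_m=0.5),  # no display
    Device("smart-tv", can_display=True, distance_m=40.0),      # too far away
    Device("tablet", can_display=True, distance_m=3.0),
]
print(pick_secondary_device(devices).name)  # tablet
```

In practice the device status would be refreshed in real time, so the selection can change as the user moves between rooms.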
[00118] According to further embodiments, the objective of zero effort shopping is to reduce or eliminate effort by a consumer user to purchase various products. To this end, the systems and methods described herein may, among other functions, access a wealth of attribute data associated with a user, analyze the attribute data to identify items that the user may have demand for, and facilitate a purchase associated with the identified items. For instance, the user may be characterized based on an analysis of the attribute data, and the user characterization may be used as a basis for identifying items and order parameters for a purchase associated with identified items. The collective, aggregated attribute data may be referred to as a “data mesh.”

[00119] In various example embodiments, the attribute data is received or accessed from a broad gamut of attribute sources such as, for example, mobile devices, smart devices, smart homes, social network services, user profiles, browsing histories, purchase histories, and so forth. Demand indications that are indicative of anticipatory demand by the user for a particular item are extracted from the attribute data. For example, purchase histories may indicate prior purchases for coffee products, location data (e.g., as determined by a Global Positioning System (GPS) component of a mobile device, beacon detections, or other location services) may indicate frequent trips to coffee shops, or social media data such as check-ins or user postings may indicate an affinity for coffee. Subsequent to extracting the demand indications, a commerce item may be identified from the attribute data based on the extracted demand indications. Continuing with the example above, the identified commerce item may comprise coffee beans, coffee filters, or other coffee-related items.
[00120] In further example embodiments, user characteristics pertaining to the user are inferred based on an analysis of a portion of the attribute data. The user characteristics include, for instance, a trait, quality, action, activity, attitude, health condition, habit, behavior, and the like. For example, the user characteristics may include a particular medical condition of the user associated with diet restrictions. The systems and methods described herein may facilitate a purchase associated with the commerce item based, at least in part, on the user characteristics. In an example embodiment, a notification, including an option to make the purchase, is presented to the user. In some instances, the notification is personalized to the user based on the user characteristics (e.g., the notification may be presented on a preferred device of the user at a time of day corresponding to user availability, such as after the user is done working). In some example embodiments, the purchase is made automatically on behalf of the user. For instance, a demand metric is calculated based on the demand indications, and if the demand metric exceeds a threshold, the purchase may be performed automatically. Thus, the systems and methods described herein may facilitate commerce on behalf of the user to increase convenience and reduce the time and effort of the user to conduct commerce. In some cases, the systems and methods analyze the attribute data to simulate decisions of the user regarding various shopping or purchasing-related activities.
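The demand-metric logic above can be sketched as a weighted sum over the extracted demand indications, compared against a threshold to decide whether an automatic purchase is warranted. The indication categories, weights, and threshold below are illustrative assumptions, not values from the disclosure.

```python
# Each demand indication type carries an assumed weight; the metric is
# the weighted count of indications extracted from the attribute data.
INDICATION_WEIGHTS = {
    "prior_purchase": 0.5,
    "location_visit": 0.2,   # e.g. GPS/beacon-detected coffee-shop trips
    "social_affinity": 0.3,  # e.g. check-ins or postings about coffee
}

def demand_metric(indications):
    """Aggregate extracted demand indications into a single score."""
    return sum(INDICATION_WEIGHTS.get(kind, 0.0) * count
               for kind, count in indications.items())

def should_auto_purchase(indications, threshold=1.0):
    """Trigger an automatic purchase only when the metric exceeds the threshold."""
    return demand_metric(indications) > threshold

signals = {"prior_purchase": 2, "location_visit": 3, "social_affinity": 1}
print(round(demand_metric(signals), 2))  # 1.9
print(should_auto_purchase(signals))     # True
```

Below the threshold, the same score could instead drive the notification path, so the user confirms the purchase manually.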
[00121] According to still further embodiments, attribute data, including real-time data, may be received from a plurality of attribute sources; a user may be authenticated based on an analysis of the attribute data; a user activity may be identified based on the attribute data; and the user activity may be augmented according to a user setting. The attribute data may include data received from a broad gamut of attribute sources such as, for example, mobile devices, smart devices, smart homes, social network services, user profiles, browsing histories, purchase histories, and so forth.

[00122] Subsequent to receiving the attribute data, a portion of the real-time data indicative of an identity of the user may be identified. For example, the real-time data may include a location of a mobile device of the user, sensor data from a device of the user, and so on. In an example embodiment, the identity of the user may be authenticated based on an analysis of the identified portion of the real-time data. For instance, the location of the user’s mobile device may be indicative of the location of the user, and if the location of the user device is within a distance of an Internet device currently being used, it may be inferred that the user may be currently using the Internet device. Many other indications of the user’s identity may be employed to authenticate the identity of the user.
In further example embodiments, an identity likelihood metric may be calculated based on an analysis of the attribute data. In some example embodiments, the identity of the user may be authenticated when the identity likelihood metric exceeds a threshold.

[00123] In still further example embodiments, based on authenticating the identity of the user, a user activity being performed by the user may be identified based on the real-time data. For example, the user may be using a website, jogging, entering a room (e.g., walking from a living room to a kitchen), and so on. The user activity may be augmented according to the user setting. In a specific example, the user may be using a website that implements a login, and based on the authentication of the user’s identity, the security for the website login may be reduced or the user is automatically logged in. In another specific example, the user may be streaming media in the living room to an Internet-connected device (e.g., a media entertainment system presenting to a display), and the streaming may be continued in the kitchen (e.g., to a smart refrigerator that includes a display) when the user moves from the living room to the kitchen. Many other user activities may be augmented in a variety of ways.
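The identity likelihood metric can be illustrated as a weighted combination of identity signals, with authentication succeeding only when the combined score exceeds a threshold. The signal names, weights, and the 0.7 threshold are assumptions made for this sketch.

```python
def identity_likelihood(signals, weights=None):
    """Combine identity signals (each scaled to [0, 1]) into a weighted score.
    Signal names and weights here are illustrative assumptions."""
    weights = weights or {
        "device_proximity": 0.5,  # mobile device near the Internet device in use
        "sensor_match": 0.3,      # e.g. biometric sensor data consistent with the user
        "usage_pattern": 0.2,     # behavior resembles the user's typical activity
    }
    return sum(weights[k] * signals.get(k, 0.0) for k in weights)

def authenticate(signals, threshold=0.7):
    """Authenticate the user when the likelihood metric exceeds the threshold."""
    return identity_likelihood(signals) > threshold

# Strong agreement across signals: 0.5*1.0 + 0.3*0.9 + 0.2*0.4 = 0.85 > 0.7
print(authenticate({"device_proximity": 1.0, "sensor_match": 0.9,
                    "usage_pattern": 0.4}))               # True
print(authenticate({"device_proximity": 0.2, "sensor_match": 0.1}))  # False
```

A downstream augmentation (such as the reduced-security website login) would then consult this boolean before relaxing its own checks.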
[00124] According to yet further embodiments, a visualization may be generated based, at least in part, on attribute data associated with a user. In an example embodiment, attribute data may be received from a broad gamut of attribute sources. For instance, the attribute data may include data associated with the user received from mobile devices, smart devices, smart homes, social network services, user profiles, browsing histories, or purchase histories. The collective, aggregated attribute data may be referred to as a “data mesh.” Subsequent to receiving the attribute data, a user characteristic may be inferred based on an analysis of at least a portion of the attribute data. In various example embodiments, the user characteristic may be a trait, quality, action, activity, attitude, health condition, habit, behavior, and the like. For example, physical characteristics of the user, such as height, weight, fitness level, and so on, may be inferred or measured directly from the attribute data. In an example embodiment, a visualization may be generated based, at least in part, on the user characteristic. The visualization may be representative of the attribute data. For example, the visualization may be an avatar that includes physical characteristics similar to the user’s. In this example, the visualization may be representative of the user. The visualization may be caused to be presented to the user.
[00125] In further example embodiments, the user may provide user input indicating a change to the visualization. The visualization may be updated according to the change indicated by the user input. Subsequent inferred user characteristics may be based, at least in part, on the user input. For example, if the visualization does not accurately reflect the user or the attribute data, the user may modify the visualization. The modification may then be used as a basis to more accurately generate the visualization or more accurately infer the user characteristics.

[00126] In still further example embodiments, a reward may be provided to the user based on a determined satisfaction of reward criteria. For instance, the reward criteria may include a criterion to complete a physical activity, such as a certain number of steps taken as determined by a pedometer (e.g., an application executing on a mobile device of the user that may determine steps taken). Based on exceeding a threshold number of steps taken, the user may satisfy the reward criteria. In an example embodiment, the reward may include an additional feature for the visualization (e.g., an additional accessory or function for the visualization). In another instance, the reward criteria may be associated with completeness of a profile. In this instance, the more information the user provides or provides permission to access, the closer the user may be to satisfying the reward criteria.
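Both reward criteria mentioned above, a pedometer step-count threshold and profile completeness, reduce to simple threshold or ratio checks. The 10,000-step goal and the particular set of profile fields below are assumptions for illustration only.

```python
def reward_satisfied(steps_taken, step_goal=10_000):
    """Pedometer-style criterion: satisfied once the step threshold is exceeded."""
    return steps_taken > step_goal

def profile_completeness(profile, fields=("name", "email", "preferences", "devices")):
    """Fraction of expected profile fields the user has filled in or
    granted permission to access (field names are hypothetical)."""
    return sum(1 for f in fields if profile.get(f)) / len(fields)

print(reward_satisfied(12_345))  # True
print(profile_completeness({"name": "A", "email": "a@example.com"}))  # 0.5
```

A completeness ratio approaching 1.0 would correspond to the user being "closer to satisfying the reward criteria" in the profile-based variant.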
[00127] With reference to FIG. 1, an example embodiment of a high-level client-server-based network architecture 100 is shown. A networked system 102 provides server-side functionality via a network 104 (e.g., the Internet or a wide area network (WAN)) to a client device 110. A user (e.g., user 106) may interact with the networked system 102 using the client device 110. FIG. 1 illustrates, for example, a web client 112 (e.g., a browser, such as the INTERNET EXPLORER® browser developed by MICROSOFT® Corporation of Redmond, Washington State), client application(s) 114, and a programmatic client 116 executing on the client device 110. The client device 110 can include the web client 112, the client application(s) 114, and the programmatic client 116 alone, together, or in any suitable combination. Although FIG. 1 shows one client device 110, multiple client devices can be included in the network architecture 100.
[00128] The client device 110 can comprise a computing device that includes at least a display and communication capabilities that provide access to the networked system 102 via the network 104. The client device 110 comprises, but is not limited to, a remote device, work station, computer, general purpose computer, Internet appliance, hand-held device, wireless device, portable device, wearable computer, cellular or mobile phone, personal digital assistant (PDA), smart phone, tablet, ultrabook, netbook, laptop, desktop, multiprocessor system, microprocessor-based or programmable consumer electronic, game console, set-top box, network PC, mini-computer, and the like. In further example embodiments, the client device 110 comprises one or more of a touch screen, accelerometer, gyroscope, biometric sensor, camera, microphone, global positioning system (GPS) device, and the like.
[00129] The client device 110 can communicate with the network 104 via a wired or wireless connection. For example, one or more portions of the network 104 can be an ad hoc network, an intranet, an extranet, a Virtual Private Network (VPN), a Local Area Network (LAN), a wireless LAN (WLAN), a Wide Area Network (WAN), a wireless WAN (WWAN), a Metropolitan Area Network (MAN), a portion of the Internet, a portion of the Public Switched Telephone Network (PSTN), a cellular telephone network, a wireless network, a Wireless Fidelity (Wi-Fi®) network, a Worldwide Interoperability for Microwave Access (WiMAX) network, another type of network, or a combination of two or more such networks.
[00130] The client device 110 can include one or more of the applications (also referred to as “apps”) such as, but not limited to, web browsers, book reader apps (operable to read e-books), media apps (operable to present various media forms including audio and video), fitness apps, biometric monitoring apps, messaging apps, electronic mail (email) apps, e-commerce site apps (also referred to as “marketplace apps”), and so on. The client application(s) 114 can include various components operable to present information to the user and communicate with the networked system 102. In some embodiments, if the e-commerce site application is included in the client device 110, then this application can be configured to locally provide the user interface and at least some of the functionalities, with the application configured to communicate with the networked system 102, on an as-needed basis, for data or processing capabilities not locally available (e.g., access to a database of items available for sale, to authenticate a user, to verify a method of payment). Conversely, if the e-commerce site application is not included in the client device 110, the client device 110 can use its web browser to access the e-commerce site (or a variant thereof) hosted on the networked system 102.
[00131] In various example embodiments, the users (e.g., the user 106) can be a person, a machine, or other means of interacting with the client device 110. In some example embodiments, the users may not be part of the network architecture 100, but may interact with the network architecture 100 via the client device 110 or another means. For instance, the users can interact with the client device 110, which is operable to receive input information from (e.g., using touch screen input or alphanumeric input) and present information to (e.g., using graphical presentation on a device display) the users. In this instance, the users may, for example, provide input information to the client device 110 to be communicated to the networked system 102 via the network 104. The networked system 102, in response to the received input information, communicates information to the client device 110 via the network 104 to be presented to the users. In this way, the user may interact with the networked system 102 using the client device 110.

[00132] An Application Program Interface (API) server 120 and a web server 122 may be coupled to, and provide programmatic and web interfaces respectively to, one or more application server(s) 140. In various implementations, the application server(s) 140 hosts one or more publication system(s) 142, payment system(s) 144, and a data mesh system 150, each of which comprises one or more modules or applications and each of which is embodied as hardware, software, firmware, or any combination thereof. The application server(s) 140 are, in turn, shown to be coupled to one or more database server(s) 124 that facilitate access to one or more information storage repositories or database(s) 126.
In an example embodiment, the database(s) 126 are storage devices that store information to be posted (e.g., publications or listings) to the publication system(s) 142. The database(s) 126 stores digital goods information in accordance with some example embodiments.
[00133] Additionally, a third party application 132, executing on third party server(s) 130, is shown as having programmatic access to the networked system 102 via the programmatic interface provided by the API server 120. For example, the third party application 132, utilizing information retrieved from the networked system 102, supports one or more features or functions on a website hosted by the third party. The third party website, for example, provides one or more promotional, marketplace, or payment functions that are supported by the relevant applications of the networked system 102.
[00134] The publication system(s) 142 provides a number of publication functions and services to the users that access the networked system 102. The payment system(s) 144 likewise provide a number of functions to perform or facilitate payments and transactions. While the publication system(s) 142 and payment system(s) 144 are shown in FIG. 1 to both form part of the networked system 102, it will be appreciated that, in alternative embodiments, each system 142 and 144 may form part of a payment service that is separate and distinct from the networked system 102. In some example embodiments, the payment system(s) 144 may form part of the publication system(s) 142.
[00135] The data mesh system 150 provides functionality to receive, retrieve, or store a broad spectrum of data associated with the users, according to various embodiments. It will be noted that the collective, aggregated attribute data is sometimes referred to as a “data mesh.” The data mesh system 150 stores, for example, received data in storage devices such as the database(s) 126.
In some example embodiments, the data mesh system 150 communicates with the client device 110, the third party server(s) 130, the publication system(s) 142 (e.g., retrieving listings), and the payment system(s) 144 (e.g., purchasing a listing). In an alternative example embodiment, the data mesh system 150 can be a part of the publication system(s) 142.
[00136] Further, while the client-server-based network architecture 100 shown in FIG. 1 employs a client-server architecture, the present inventive subject matter is, of course, not limited to such an architecture, and can equally well find application in a distributed, or peer-to-peer, architecture system, for example. The various systems of the application server(s) 140 (e.g., the publication system(s) 142 and the payment system(s) 144) can also be implemented as standalone software programs, which do not necessarily have networking capabilities.

[00137] The web client 112 can access the various systems of the networked system 102 (e.g., the publication system(s) 142) via the web interface supported by the web server 122. Similarly, the programmatic client 116 and client application(s) 114 can access the various services and functions provided by the networked system 102 via the programmatic interface provided by the API server 120. The programmatic client 116 can, for example, be a seller application (e.g., the Turbo Lister application developed by EBAY® Inc., of San Jose, California) to enable sellers to author and manage listings on the networked system 102 in an off-line manner, and to perform batch-mode communications between the programmatic client 116 and the networked system 102.

[00138] FIG. 2 is a block diagram of the data mesh system 150, which can provide functionality to receive, retrieve, or access attribute data from attribute sources, analyze the attribute data, manage the attribute data, and so forth. In an example embodiment, the data mesh system 150 can include a presentation module 210, a communication module 220, an attribute module 230, a characteristic module 240, a management module 250, an ancillary activity system 260, a user analytics system 270, an augmentation system 280, and a visualization system 290. All, or some, of the modules 210-290 of FIG. 2 can communicate with each other, for example, via a network coupling, shared memory, and the like. It will be appreciated that each module of modules 210-290 can be implemented as a single module, combined into other modules, or further subdivided into multiple modules. Other modules not pertinent to example embodiments can also be included, but are not shown.
[00139] Referring to FIG. 2, the presentation module 210 provides various presentation and user interface functionality operable to interactively present and receive information from users. For example, the presentation module 210 causes presentation of various notifications or user interfaces that provide the user an option to make a purchase associated with the identified items. The presentation module 210 presents or causes presentation of information using a variety of means, including visually displaying information and using other device outputs (e.g., acoustic, haptic). Interactively presenting is intended to include the exchange of information between a device and a user. The user can provide input to interact with the user interface in a variety of ways, including alphanumeric input, cursor input, tactile input, or other input (e.g., one or more of touch screen, camera, tactile sensors, light sensors, infrared sensors, biometric sensors, microphone, gyroscope, accelerometer, or other sensors). It will be appreciated that the presentation module 210 provides many other user interfaces to facilitate functionality described herein. Further, it will be appreciated that “presenting” as used herein is intended to include communicating information to another device with functionality operable to perform presentation using the communicated information.
[00140] The communication module 220 provides various communications functionality and web services. For example, the communication module 220 provides network communication such as communicating with the networked system 102, the client device 110, and the third party server(s) 130. In various example embodiments, the network communication operates over wired or wireless modalities. Web services are intended to include retrieving information from the third party server(s) 130, the database(s) 126, and the application server(s) 140. Information retrieved by the communication module 220 comprises data associated with the user (e.g., user profile information from an online account, social network service data associated with the user), data associated with one or more items listed on an e-commerce website (e.g., images of the item, reviews of the item, item price), other data to facilitate the functionality described herein, and so on.

[00141] The attribute module 230 can receive, access, or retrieve a wide variety of attribute data from many different attribute sources. For example, the attribute module 230 receives, retrieves, or accesses the attribute data from user devices or machines (e.g., the client device 110), social network services, the third party server(s) 130, the publication system(s) 142, the payment system(s) 144, other application servers, or other attribute sources. The attribute data, as used herein, is intended to include raw data such as sensor data, profile data, social network content, and so on.
[00142] In some example embodiments, the attribute module 230 extracts the attribute data from various sources. For instance, a payment history log of the user can include a tremendous amount of extraneous data. The attribute module 230 can extract purchase information such as item purchased, time, purchase price, seller, location, brand, and so forth from the payment history log of the user.

[00143] In further example embodiments, the attribute module 230 performs various functions to prepare or condition the attribute data for analysis. For instance, the attribute module 230 standardizes the attribute data to facilitate analysis of the attribute data (e.g., determine a normal form for the data to allow for comparison and other mathematical analysis). The attribute module 230 performs many other functions to prepare the attribute data for analysis.
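One common standardization consistent with the normal-form conditioning described above is min-max rescaling of raw attribute values into [0, 1], so values drawn from different attribute sources can be compared on a common scale. This sketch is illustrative only and not a method prescribed by the disclosure.

```python
def min_max_normalize(values):
    """Rescale raw attribute values into [0, 1] so data from different
    sources can be compared and combined mathematically."""
    lo, hi = min(values), max(values)
    if hi == lo:
        # All values identical: no spread to normalize over.
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

print(min_max_normalize([10, 20, 30]))  # [0.0, 0.5, 1.0]
```

Other conditioning steps (deduplication, unit conversion, timestamp alignment) would typically precede such rescaling.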
[00144] In various example embodiments, the attribute module 230 stores the attribute data in association with the user for subsequent analysis. For example, the attribute module 230 stores the attribute data in the database(s) 126. The attribute data can be stored in conjunction with a user identifier such that the attribute module 230 subsequently uses the user identifier to access the attribute data corresponding to a particular user. The attribute module 230 accesses the stored attribute data using other schemes. For instance, the attribute module 230 accesses a portion of the attribute data associated with a time, an item, a user, a type of user, a particular attribute source, and so forth. In this way, the attribute module 230 accesses a portion of attribute data according to various parameters from among a large quantity of the attribute data to access, identify, or find pertinent and relevant data.
[00145] The characteristic module 240 infers a user characteristic or multiple user characteristics corresponding to the user based on an analysis of at least a portion of the attribute data. Many schemes and techniques can be employed to infer the characteristics from the attribute data. For example, a particular user characteristic can be a work location of the user. The attribute data can include a plurality of locations (e.g., as determined by a GPS component of a user device used by the user) that include time stamps. The work location of the user can be inferred based on the consistency and timing of the locations included in the attribute data (e.g., during normal working hours, the user is typically at a particular office building). Many different portions of attribute data and combinations of portions of attribute data can be analyzed to infer a wide variety of characteristics.

[00146] In various example embodiments, characteristics (e.g., the user characteristics), as used herein, are intended to include traits, qualities, actions, activities, attitudes, habits, behaviors, and the like pertaining to a person or people. Inasmuch as the attribute data may not necessarily pertain to a person (e.g., raw data such as coordinates of a particular location), a characteristic (e.g., current location of the user, disliking spicy food, having young children, being a Star Trek fanatic) can be distinct from the attribute data.
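The work-location example above can be made concrete: restrict the time-stamped location fixes to weekday working hours and take the most frequent (coarsely rounded) coordinate. The rounding precision and the 9-to-5 weekday window are assumptions chosen for the sketch, not parameters from the disclosure.

```python
from collections import Counter
from datetime import datetime

def infer_work_location(stamped_locations, start_hour=9, end_hour=17):
    """Infer a work location as the most frequent (rounded) coordinate
    observed during normal working hours on weekdays."""
    counts = Counter()
    for ts, (lat, lon) in stamped_locations:
        if ts.weekday() < 5 and start_hour <= ts.hour < end_hour:
            # Round to ~100 m so nearby fixes at the same building cluster.
            counts[(round(lat, 3), round(lon, 3))] += 1
    if not counts:
        return None
    return counts.most_common(1)[0][0]

fixes = [
    (datetime(2015, 3, 23, 10, 0), (37.3382, -121.8863)),  # Monday, office
    (datetime(2015, 3, 24, 14, 0), (37.3382, -121.8863)),  # Tuesday, office
    (datetime(2015, 3, 28, 11, 0), (37.7749, -122.4194)),  # Saturday, ignored
]
print(infer_work_location(fixes))  # (37.338, -121.886)
```

A production characteristic module would likely use proper spatial clustering rather than coordinate rounding, but the consistency-and-timing principle is the same.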
[00147] The management module 250 provides management functions associated with the attribute data. For example, the management module 250 provides the user with functionality to edit, modify, update, or otherwise control the attribute data. For instance, the user removes undesired attribute data via the functionality provided by the management module 250. In a further instance, the user specifies permissions for portions of the attribute data using the functionality provided by the management module 250. The permissions allow or prohibit certain access or uses for the attribute data (e.g., the permission prohibits access to the attribute data by third parties). Various levels of access and abilities can be granted. In some example embodiments, the permissions persist for a period of time, and after expiration of the time period, the permissions are revoked.

[00148] In further example embodiments, the management module 250 requests consent from the user to access portions of the attribute data or to request permission for certain uses of the attribute data. For example, the management module 250 requests consent from the user to allow third parties to access portions of the attribute data. The management module 250 requests a variety of other consents associated with various actions corresponding to the attribute data.
[00149] In still further example embodiments, the management module 250 provides functionality to allow third parties to access the attribute data or user characteristics. For example, the management module 250 provides a set of APIs that can be invoked by third parties to access the attribute data or user characteristics. As discussed above, in some example embodiments, permission or consent of the user is determined prior to providing access to the attribute data.
[00150] FIG. 3 is a block diagram of the ancillary activity system 260, which can provide functionality to generate ancillary activities based on various triggers and information. The ancillary activity system 260 includes an activity module 310, a preference module 320, and a device module 330, according to an example embodiment.

[00151] The activity module 310 in the ancillary activity system 260 provides functionality to generate the ancillary, supplemental, complementary, or companion activity corresponding to the device activity. For example, the user can be browsing a website and the activity module 310 generates the ancillary activity comprising providing the user interactive content such as an option to share, like, or post (e.g., tweet) the website. The activity module 310 generates many other ancillary activities corresponding to the device activity based on various triggers and information.
[00152] The preference module 320 provides functionality to infer a user preference, from the attribute data, indicating a preference of the user associated with performing on the secondary user device the ancillary activity corresponding to the device activity. For example, the preference module 320 infers that the user desires or prefers a particular type of content associated with a particular device activity. The preference module 320 also identifies other users that are similar to the user and infers the user preference based on the identified similar users. The preference module 320 employs a variety of schemes and techniques using a wide range of data to infer the user preference.
[00153] The device module 330 provides functionality to identify the secondary user device according to a device status of the secondary user device. In various implementations, the device status indicates a device capability to perform the ancillary activity in real time. The device module 330 retrieves, derives, determines, or otherwise obtains a variety of information associated with the user device and the secondary user device to facilitate the functionality herein. For example, the device module 330 determines available functionality of the secondary user device.

[00154] FIG. 4 illustrates a diagram 400 showing an example of generating ancillary activity for a secondary user device, according to some example embodiments. User 410 may be using wearable computing devices such as wearable device 420 (e.g., GOOGLE GLASS®) or wearable device 430 (e.g., a smart watch). In this example, the wearable device 430 is communicatively coupled to user device 450 via wireless signals such as signal 440. In various implementations, the user device 450 is communicatively coupled, via coupling 460, to the network 104, which is in turn communicatively coupled to the networked system 102 including the data mesh system 150 (discussed above in connection with FIG. 2) and the ancillary activity system 260 (discussed above in connection with FIG. 3).
[00155] The user 410 may be operating or using the user device 450. The terms “operating,” “using,” “in service,” or “in use” as used herein are intended to include a particular user physically interacting with a particular device, being capable of operating the particular device within a short time period such as a dormant device or a device in standby (e.g., a particular user carrying a mobile device on their person without presently physically interacting with the device is included in the terms “operating,” “using,” or “in use”), or otherwise utilizing the particular device (e.g., a smart refrigerator, that is not within a vicinity of the user, configured to track inventory levels and provide inventory data).

[00156] In the example diagram 400, the user 410 is carrying the user device 450 that is communicatively coupled to the ancillary activity system 260. The activity module 310 detects the device activity of the user device 450. For instance, the user 410 can be browsing a web page, receiving directions to a particular location, monitoring fitness activity such as a number of steps taken, and so forth using the user device 450.
[00157] Once the activity module 310 detects the device activity, the preference module 320 infers the user preference from the attribute data accessed by the attribute module 230. The user preference indicates a preference of the user or desire of the user for performing, on a particular user device, a particular ancillary activity corresponding to the device activity. For example, the preference module 320 infers that the user 410 would like an option, presented on a particular wearable device, to favorite, share, or post (e.g., tweet) a webpage the user 410 is browsing on the user device 450.

[00158] Based on the inferred user preference, the device module 330 identifies the secondary user device according to the device status of the secondary user device. The device status indicates that the device is capable of performing the ancillary activity. For example, the device module 330 identifies the wearable device 430 as the secondary user device based on the device status corresponding to the wearable device 430. In this example, the device module 330 determines the device status of the wearable device 430 as being in use by the user 410 (e.g., the user 410 wearing the wearable device 430). Thus, the wearable device 430 is operable to perform the ancillary activity since providing the option for the user 410 to select can be effectuated by the wearable device 430 as the user 410 is within an operating distance of the wearable device 430.
[00159] After the device module 330 identifies the secondary user device, the activity module 310 generates the ancillary activity to be performed in real time on the secondary user device. The activity module 310 generates the ancillary activity by analyzing the device activity, a device functionality of the secondary user device, the user preference, and other data. The device functionality indicates options available for input and output on the secondary user device. For example, the identified secondary user device is the wearable device 430 that has a small screen area to display user interfaces or has reduced options for output (e.g., no speakers). In a specific instance, the activity module 310 generates the ancillary activity to include abridged activity content according to the device functionality (e.g., small display size). After the activity module 310 generates the ancillary activity, the activity module 310 transmits or otherwise communicates to the secondary user device to perform the ancillary activity in real time.
[00160] FIG. 5 is a flow diagram illustrating an example method 500 for generating ancillary activity for the secondary user device, according to some example embodiments. At operation 510, the activity module 310 detects the device activity being performed in real time by the user device of a user. The term “real-time data,” as used herein, is intended to include data associated with an event currently happening. For example, the device activity being performed in real time includes a particular device activity detected at the activity module 310 after a delay interval (e.g., due to transmission delay or other delays such as being temporarily stored at an intermediate device) between the instant of the particular device activity occurring and the activity module 310 detecting the particular device activity. Thus, in some instances, the device activity being performed in real time is intended to include activities that have occurred a short time in the past. This discussion of real time applies equally throughout the specification in relation to other uses of the term “real time.”

[00161] In various embodiments, the device activity includes a wide variety of activities such as browsing a webpage, monitoring fitness activity (e.g., steps the user has taken), heart rate monitoring, inventory level monitoring (e.g., a smart refrigerator that monitors inventory), and so on. In some implementations, the activity module 310 detects the device activity being performed by monitoring devices of the user. For instance, a smart appliance of the user provides a continuous or periodic stream of data indicating various device activities.
[00162] At operation 520, the attribute module 230 accesses the attribute data associated with the user from the plurality of attribute sources. In various example embodiments, at least a portion of the attribute data includes real-time data or near real-time data. For example, the real-time data includes user input data or sensor data communicated to the attribute module 230 after a delay interval (e.g., due to transmission delay or other delays such as being temporarily stored at an intermediate device) between capturing the data and the attribute module 230 receiving the data.
[00163] As will be discussed in connection with FIGS. 52 and 53, the attribute data is received from a broad spectrum of attribute sources (e.g., devices, sensors, servers, databases, and other sources). Additionally, the attribute module 230 receives or accesses the attribute data via many pathways resulting from an assortment of configurations of the attribute sources as further discussed in connection with FIGS. 51A and 51B. In an example embodiment, the attribute module 230 receives the attribute data directly from the attribute sources. In other example embodiments, the attribute module 230 receives the attribute data from a central device that receives attribute data from a plurality of user devices. In still other example embodiments, various user devices are communicatively coupled in a decentralized device-to-device mesh, and the attribute module 230 receives the attribute data corresponding to a particular device in the mesh from any of the devices in the mesh. The attribute module 230 receives the attribute data from the attribute sources in many other configurations including various suitable combinations of configurations.
[00164] In various example embodiments, the attribute module 230 stores the attribute data in association with the user (e.g., indexed based on a user identifier) for subsequent analysis. The attribute module 230 stores the attribute data in a storage device such as the database(s) 126, for example. The attribute module 230 accesses the stored attribute data using a variety of search or find schemes. For instance, the attribute data associated with a particular user is accessed using a user identifier that corresponds to the particular user. It will be noted that the collective, aggregated attribute data is sometimes referred to as a “data mesh.”

[00165] At operation 530, the preference module 320 infers the user preference or desired user setting, from the attribute data, indicating a preference of the user for performing on the secondary user device the ancillary activity corresponding to the device activity. For example, the attribute data includes engagement data of the user (e.g., particular websites the user visits, taps, clicks, or other interactions with various notifications) indicative of the types of information the user is interested in. In a specific example, the preference module 320 infers the user preference for receiving notifications associated with a particular sporting event based on the engagement data.
[00166] In this specific example, the activity module 310 detects from the user device (e.g., a smart television) that the device activity includes watching a sporting event. In continuing with this example, the preference module 320 infers the user preference to indicate the user has an affinity for notifications associated with watching a sporting event on a particular secondary device based on past notifications the user showed interest in that are included in the attribute data.
[00167] Referring now to FIG. 6, a flow diagram is shown illustrating further operations for inferring a user preference from the attribute data, according to some example embodiments. As discussed above, subsequent to the operation 520, at the operation 530, the preference module 320 infers the user preference from the attribute data.
[00168] At operation 610, the characteristic module 240 infers or directly measures user characteristics pertaining to the user from the attribute data. In some example embodiments, the characteristic module 240 stores the inferred user characteristics for subsequent analysis, for example, in a storage device such as database(s) 126. The characteristic module 240 infers a vast spectrum of the user characteristics from the attribute data. A few specific examples of user characteristics include demographic data (e.g., age, gender, marital status, number of children), user preferences (e.g., being a morning person, favorite locations, enjoying spicy food), idiosyncrasy (e.g., being forgetful, such as draining the battery on a mobile device; or being impatient, such as a line breaker that will leave a store if the line is too long), qualities (e.g., being athletic, being tall, having a large vocabulary), personality traits (e.g., being a risk taker), actions, activities (e.g., working for a non-profit), attitudes, habits (e.g., being a coffee drinker), behaviors, beliefs, biases, demeanor, and physical characteristics of the user (e.g., height, weight, garment sizes, eye color, hair color). The specificity of the characteristics ranges from very narrow (e.g., drinks a particular brand of soda) to very broad (e.g., being generally philanthropic). To illustrate inferring the user characteristic from the attribute data in an example, the attribute data includes user location data that indicates frequent visits to a local school, local soccer fields, and the like. In this example, the characteristic module 240 infers that the user has children based on the types of locations the user may be frequently visiting.
[00169] In some instances, the characteristic module 240 performs varying degrees of inferential analysis of the attribute data to derive the user characteristics. For example, the characteristic module 240 infers the user’s wake-up time based on user device activity or other activity (e.g., connected alarm clock settings, logins to accounts, and various other user activities that indicate a wake-up time). In this example, the characteristic module 240 infers a particular user characteristic that can be of a larger inferential jump such as the user being a morning person or liking to sleep in. The degree of inferential jump can be configurable. In some example embodiments, the characteristic module 240 employs various techniques to minimize or otherwise control incorrect inferences (e.g., machine learning, other learning algorithms).
[00170] In further example embodiments, the characteristic module 240 learns, adapts, or evolves as more of the attribute data is received (e.g., via machine learning techniques or other learning algorithms). For example, the attribute data includes location data of the user. The characteristic module 240 infers a favorite location of the user based on a pattern (e.g., frequently visited locations) in the location data. However, the characteristic module 240 subsequently receives employment data of the user that indicates a current employer including an employer location. The characteristic module 240 learns, updates, or otherwise adapts to account for the new attribute data. Thus, in this example, the characteristic module 240 may not infer a favorite location of the user if the location is a work location of the user. In some instances, the user may provide input directly (e.g., via a user interface configured to receive inferential guidance from the user) to facilitate the characteristic module 240 in inferring characteristics from the attribute data (e.g., user input indicating that a particular inferred characteristic is incorrect or providing input to be used as a basis for future inferences).
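The adaptation described above, where a favorite-location inference is revised once employment data arrives, can be sketched as follows. This is an illustrative sketch only; the function name and the frequency-count approach are assumptions.

```python
from collections import Counter

def infer_favorite_location(visits, work_location=None):
    """Infer a favorite location as the most frequently visited place,
    excluding a known work location once later attribute data (e.g.,
    employment data) supplies one. Illustrative sketch only.
    """
    counts = Counter(v for v in visits if v != work_location)
    if not counts:
        return None
    return counts.most_common(1)[0][0]

visits = ["office", "office", "office", "cafe", "cafe", "gym"]
print(infer_favorite_location(visits))                          # office
print(infer_favorite_location(visits, work_location="office"))  # cafe
```

Before employment data arrives, the most-visited place wins; once the work location is known, the inference adapts and excludes it.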
[00171] In other instances, the characteristic module 240 performs very little or no analysis to derive the user characteristic from the attribute data. For example, the attribute data includes an alarm time setting from a connected alarm clock (e.g., a smart phone with an alarm clock app). The alarm time setting can directly indicate a wake-up time. Since the attribute data directly relates to a particular user characteristic, the characteristic module 240 need not perform analysis to derive the user characteristic.
[00172] In some example embodiments, the user characteristic comprises predefined characteristics or dynamically determined characteristics. For instance, a particular set of characteristics is predefined (e.g., work location, home location, marital status, socio-economic level). In this instance, the characteristic module 240 determines that particular predefined characteristics are associated with the user based on an analysis of the attribute data. In other instances, the characteristic module 240 dynamically determines characteristics based on the attribute data. For example, the attribute data indicates that the user owns a particular exotic pet. Although there may not be a predefined characteristic associated with the particular exotic pet, the characteristic module 240 determines the user characteristic of owning an exotic pet from the attribute data.
[00173] At operation 620, the preference module 320 identifies similar users that are similar to the user based on the inferred user characteristics and respective user characteristics of a plurality of other users. The preference module 320 identifies the similar users based on a variety of factors. In some example embodiments, the preference module 320 accesses the attribute data or stored user characteristics corresponding to the plurality of other users. For example, the preference module 320 identifies the similar users from among the plurality of other users that are similar to the user based on the inferred user characteristics of the user and respective user characteristics of the plurality of other users. The preference module 320 correlates, matches, or otherwise compares the inferred user characteristics with respective user characteristics of the plurality of other users to identify the similar users. In various example embodiments, the preference module 320 identifies the similar users based on same or similar demographic data (e.g., same or similar age, marital status, gender, geographic location, etc.), same or similar user characteristics (e.g., same or similar brand purchases), same or similar attribute data, and so on.
[00174] At operation 630, the preference module 320 infers the user preference or desired user setting based on the identified similar users. For example, the preference module 320 analyzes the user characteristics of the identified similar users to determine the user preference. In a specific example, if the user characteristics of the identified similar users indicate a preference for viewing particular content on a particular secondary user device, the preference module 320 infers that the user also has the same or a similar preference for viewing the particular content on the particular secondary user device. In this way, the preference module 320 infers the user preference based on the identified similar users.
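Operations 620 and 630 can be sketched as a characteristic-overlap match followed by a vote among the matched users' observed preferences. The Jaccard-similarity measure, the 0.5 threshold, and all names below are illustrative assumptions; the specification leaves the comparison scheme open.

```python
from collections import Counter

def characteristic_overlap(a, b):
    """Jaccard similarity between two users' characteristic sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def infer_preference(user_chars, other_users, threshold=0.5):
    """Infer a preference from users whose characteristics overlap
    sufficiently. `other_users` maps a user id to a dict holding
    'characteristics' and an observed 'preference'. Illustrative only.
    """
    votes = Counter()
    for info in other_users.values():
        if characteristic_overlap(user_chars, info["characteristics"]) >= threshold:
            votes[info["preference"]] += 1
    return votes.most_common(1)[0][0] if votes else None

user = {"age_30s", "parent", "coffee_drinker"}
others = {
    "u1": {"characteristics": {"age_30s", "parent", "runner"},
           "preference": "watch_notifications"},
    "u2": {"characteristics": {"age_30s", "parent", "coffee_drinker"},
           "preference": "watch_notifications"},
    "u3": {"characteristics": {"teen", "gamer"},
           "preference": "phone_notifications"},
}
print(infer_preference(user, others))  # watch_notifications
```

Here the two sufficiently similar users both prefer receiving content on a watch, so that preference is attributed to the user as well.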
[00175] Referring back to FIG. 5, at operation 540, the device module 330 identifies the secondary user device according to the device status of the secondary user device. The device status indicates a device capability to perform the ancillary activity in real time. The device capability includes various metrics and features associated with the secondary user device. In various implementations, the device module 330 identifies the secondary user device based on the device status that includes various combinations of factors discussed below. For example, the device status can be based on the device capability including a distance of the secondary user device to the user as further discussed in connection with FIG. 7, below. For instance, the secondary user device is capable of presenting information or notifications to the user if the secondary user device is within a vicinity of the user.
[00176] In another example, the device status is based on a particular functionality of the device. The device module 330 queries the secondary user device to determine the functionality of the secondary user device. In other instances, the device module 330 determines the functionality of the secondary user device via a lookup of functionality for the same or similar device as the secondary user device. For instance, if the secondary user device is a particular type of wearable device, the device module 330 can look up available functions for the particular type of wearable device from the third party server(s) 130.
For instance, if the secondary user device has an audio output, then the device module 330 identifies the secondary user device as capable of performing a particular ancillary activity that includes audio output (e.g., voice directions for hands-free navigation). Thus, in some cases, the device module 330 determines the device status in conjunction with a particular ancillary activity.
[00177] In still another example, the device status is based on whether the secondary user device is active as further discussed in connection with FIG. 8, below. For example, if the secondary user device is not currently being used (e.g., as determined via a sensor on the device such as an accelerometer that indicates the device is perfectly still), then the device module 330 determines that the device is not active and does not identify the inactive secondary user device as being capable of performing the ancillary activity.
[00178] Referring now to FIG. 7, a flow diagram illustrates further operations for identifying a secondary user device, according to some example embodiments. Specifically, FIG. 7 is directed to the device module 330 identifying the secondary user device by determining that the device status of the secondary user device includes the secondary user device being within a distance of the user. Subsequent to the operation 530, the device module 330 receives sensor data from the secondary user device at operation 710. The sensor data includes, for example, data received from any of the sensors discussed in connection with FIG. 53, below. In a specific example, the sensor data includes location data as determined by a GPS component of the secondary user device. The sensor data represents a real-time physical environment of the secondary user device. As discussed below, the device module 330 determines the device status of the secondary user device based on the sensor data received from the secondary user device.
[00179] At operation 720, the characteristic module 240 or the device module 330 infers the current user location based on location data received from the user device. For example, the attribute data includes real-time data, from the user device, corresponding to the location of the user (e.g., location data as determined by a GPS component of a mobile device corresponding to the user).

[00180] At operation 730, the device module 330 extracts a current device location from the sensor data received from the secondary user device. For instance, the secondary user device can be equipped with a GPS component that provides location data. In other instances, the device module 330 extracts the current device location based on WI-FI® triangulation, NFC beacon detection, or other location services.

[00181] At operation 740, the device module 330 compares the current user location and the current device location to determine that the secondary user device is within a distance (e.g., an operating distance) of the current user location. The distance may be a short distance such as a reasonable distance to allow the user to physically operate the secondary user device (e.g., an arm’s length).
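The comparison at operation 740 can be sketched as a great-circle distance check between the inferred user location and the extracted device location. The haversine formula, the (lat, lon) representation, and the one-meter default standing in for "arm's length" are illustrative assumptions.

```python
import math

def within_operating_distance(user_loc, device_loc, max_meters=1.0):
    """Determine whether the device location extracted from sensor data
    is within an operating distance of the inferred user location.

    Locations are (latitude, longitude) pairs in degrees; the distance
    is computed with the haversine formula on a spherical Earth. The
    1 m default stands in for "arm's length" and is an assumption.
    """
    lat1, lon1, lat2, lon2 = map(math.radians, (*user_loc, *device_loc))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    meters = 2 * 6371000 * math.asin(math.sqrt(a))  # Earth radius ≈ 6371 km
    return meters <= max_meters

print(within_operating_distance((37.0, -122.0), (37.0, -122.0)))    # True
print(within_operating_distance((37.77, -122.42), (40.71, -74.0)))  # False
```

GPS accuracy is typically coarser than one meter, so a practical threshold would fold in the reported position uncertainty.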
[00182] Although FIG. 7 is directed to determining that the secondary user device is within a distance of the user location based on an inferred location of the user and the secondary user device, the device module 330 employs other schemes to determine whether the user is within an operating distance of the secondary user device. For instance, if the secondary user device is equipped with biometric identification sensors and the device module 330 receives biometric sensor data from the secondary user device indicating the identity of the user, the device module 330 can, in that situation, infer that the user is within an operating distance of the secondary user device.

[00183] Referring now to FIG. 8, a flow diagram illustrates further operations for identifying the secondary user device, according to some example embodiments. Subsequent to the operation 530, the device module 330 receives sensor data from the secondary user device at operation 810, similar to the operation 710 described above. The sensor data represents a real-time physical environment of the secondary user device. For instance, the sensor data includes thermal data (e.g., data that indicates a current temperature), motion data (e.g., as determined by an accelerometer component), position data (e.g., as determined by a GPS component), biometric data (e.g., heart rate data or fingerprint identifications), communication data (e.g., NFC beacon detections or BLUETOOTH® device detections), and other sensor data (see FIG. 53 below for additional sensors and data).
[00184] At operation 820, the device module 330 calculates an active metric from the sensor data. In various implementations, the active metric indicates the secondary user device is in use by the user. In some implementations, the active metric comprises a probability or likelihood that the secondary user device is active. In these implementations, a higher value for the active metric is associated with a higher probability that the secondary user device is active. In a specific instance, if the secondary user device is a wearable device such as a smart watch, the sensor data indicates that a particular user is wearing the wearable device based on a heart rate sensor (when the user is not wearing the wearable device, the heart rate sensor indicates no heart rate).
[00185] Similarly, if the secondary user device is a smart phone that provides ambient temperature data, the device module 330 calculates the active metric based on the temperature data. For example, if the temperature data is fluctuating or is above an expected ambient temperature, this indicates that the user is carrying the smart phone in their pocket. Conversely, if the temperature is near the expected ambient temperature and there is little fluctuation in the temperature, the device module 330 calculates a low probability, corresponding to a lower active metric, indicating that the particular device is not active or in use by the user.
[00186] At operation 830, the device module 330 determines that the secondary user device is active based on the active metric. For example, if the calculated active metric exceeds a threshold value, the device module 330 determines that the secondary user device is active. The threshold value can be predetermined or dynamically determined by the device module 330. For example, the device module 330 employs various statistical models based on historical values of the active metric to determine whether the current active metric is anomalous. In a simple non-limiting example, the device module 330 determines the threshold value to be an average of historical values for the active metric, and the device module 330 determines that the secondary user device is active if the active metric exceeds the average of the historical values of the active metric. The device module 330 employs many other schemes and techniques to determine that the device is active based on the active metric.

[00187] At operation 840, the device module 330 identifies the secondary user device as being capable of performing the ancillary activity in real time based on the secondary user device being active. That is to say, the device module 330 identifies a particular secondary user device based on the particular secondary user device being active. The reasoning is that if the secondary user device is inactive, the secondary user device is not capable of presenting information to or receiving input from the user and therefore is not capable of performing the ancillary activity, in some cases.
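Operations 820 and 830 can be sketched as a weighted combination of sensor signals compared against a threshold. The sensor names, the weights, and the fixed 0.5 threshold are illustrative assumptions; as noted above, the threshold could instead be derived from historical metric values.

```python
def active_metric(sensor_data):
    """Combine sensor readings into a likelihood that the device is in use.

    The sensor keys and weights below are illustrative assumptions,
    following the heart-rate and temperature examples in the text.
    """
    score = 0.0
    if sensor_data.get("heart_rate", 0) > 0:           # wearable is on the wrist
        score += 0.5
    if sensor_data.get("accel_variance", 0.0) > 0.01:  # device is moving
        score += 0.3
    temp = sensor_data.get("temperature")
    if temp is not None and temp > 25.0:               # above expected ambient: in a pocket
        score += 0.2
    return score

def is_active(sensor_data, threshold=0.5):
    """The device counts as active when the metric reaches a threshold."""
    return active_metric(sensor_data) >= threshold

print(is_active({"heart_rate": 72, "accel_variance": 0.2}))       # True
print(is_active({"temperature": 21.0, "accel_variance": 0.0}))    # False
```

A production version would calibrate the weights and threshold against observed usage rather than fixing them by hand.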
[00188] Referring again to FIG. 5, at operation 550, the activity module 310 generates the ancillary activity, to be performed in real time on the secondary user device, by analyzing the device activity, a device functionality of the secondary user device, the user preference, and other factors and data. The ancillary activity can include a wide range of tasks, content, and functionality. For example, the ancillary activity comprises a notification that includes notification content associated with the device activity. For instance, if the device activity is viewing a sporting event on a smart TV, the ancillary activity comprises a notification including notification content associated with the sporting event (e.g., information about the players, the current score, sporting statistics).

[00189] In some implementations, the ancillary activity includes a portion of the device activity. For example, if the device activity comprises browsing a particular website, the activity module 310 generates the ancillary activity to include content that is a portion of the website. Thus, in these implementations, the activity module 310 is distributing a portion of the device activity to the secondary user device.
[00190] FIG. 9 is a flow diagram illustrating further operations for generating the ancillary activity of the secondary user device, according to some example embodiments. Subsequent to the operation 540, at the operation 550, the activity module 310 generates the ancillary activity. The operation 550 can further include operations 910, 920, and 930.
[00191] At operation 910, the device module 330 determines a display size corresponding to the secondary user device. For example, the device module 330 queries the secondary user device directly to retrieve data associated with the display size. In another example, the device module 330 queries a particular third party server (e.g., the third party server(s) 130) using an identifier associated with the secondary user device to determine the display size of the secondary user device. In a specific example, the device module 330 determines a particular device model for the secondary user device and performs a lookup of the display size for the particular device model.
[00192] At operation 920, the device module 330 determines that the display size corresponding to the secondary user device is below a threshold size. For example, the display size may be too small for particular content of the ancillary activity. In this situation, the device module 330 determines that the display size is below the threshold size.
[00193] At operation 930, the activity module 310 generates the ancillary activity to include abridged activity content according to the display size corresponding to the secondary user device. For instance, the device module 330 shortens or reduces content or functionality of the ancillary activity to accommodate the display size of the secondary user device.
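By way of non-limiting illustration, operations 920 and 930 may be sketched as a threshold test followed by truncation; the threshold value and truncation length below are assumptions for illustration only:

```python
# Hypothetical sketch of operations 920-930: when the secondary
# device's display is below a threshold size, abridge the ancillary
# content to fit. The threshold and maximum length are assumptions.
THRESHOLD_INCHES = 4.0

def abridge_for_display(content, display_size_inches, max_chars=40):
    """Shorten content for displays below the threshold size."""
    if display_size_inches < THRESHOLD_INCHES and len(content) > max_chars:
        return content[:max_chars].rstrip() + "..."
    return content

full = "Player statistics, current score, and a play-by-play summary"
print(abridge_for_display(full, 1.6))   # abridged for a smart watch
print(abridge_for_display(full, 10.1))  # unchanged on a tablet
```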
[00194] FIG. 10 is a flow diagram illustrating further operations for generating the ancillary activity of the secondary user device, according to some example embodiments. Subsequent to the operation 540, at the operation 550, the activity module 310 generates the ancillary activity. The operation 550 further includes operations 1010 and 1020. [00195] At operation 1010, the device module 330 compares the device functionality of the secondary user device with a device functionality of the user device to identify non-mutual functionality of the secondary user device that is not available on the user device. For example, the user device comprises a mobile computer that the user is browsing a website on. The secondary user device can be a wearable device affixed to the user. In this example, the device module 330 identifies haptic output, for instance, of the secondary user device as being non-mutual functionality, as the mobile computer may not have haptic output functionality. In another example, a particular wearable device may not include a GPS component while a smart phone device of the user may include a GPS component to determine current location. In this example, the device module 330 identifies use of the GPS component as non-mutual functionality. [00196] At operation 1020, the activity module 310 generates the ancillary activity to include an activity component that utilizes the non-mutual functionality of the secondary user device. In the examples discussed above, if the device module 330 determines that the secondary user device includes a GPS component while the user device does not include a GPS component, the activity module 310 utilizes location data from the secondary user device in generating the ancillary activity.
In a specific example, if the user is browsing a particular webpage associated with a particular merchant or location, the activity module 310 generates the ancillary activity to include mapping directions to the particular location using the GPS component of the secondary user device to access the user’s current location. Thus, the activity module 310 receives the particular location from the user device and communicates, to the secondary user device, the location and instructions to map directions to that location from the current location.
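By way of non-limiting illustration, the comparison of operation 1010 may be sketched as a set difference over capability names; the capability names below are assumptions for illustration only:

```python
# A minimal sketch of operation 1010: non-mutual functionality is the
# set of capabilities the secondary device has that the user device
# lacks. Capability names are illustrative assumptions.
def non_mutual_functionality(secondary_caps, primary_caps):
    """Capabilities of the secondary device unavailable on the user device."""
    return set(secondary_caps) - set(primary_caps)

# Wearable with haptics and GPS vs. a mobile computer without either:
caps = non_mutual_functionality({"haptic", "gps", "display"},
                                {"display", "audio", "keyboard"})
print(sorted(caps))  # ['gps', 'haptic']
```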
[00197] Turning again to FIG. 5, at operation 560, the activity module 310 transmits or otherwise communicates, to the secondary user device, instructions to perform in real time the ancillary activity. For example, the activity module 310 communicates instructions to the secondary user device to present a user interface that facilitates or effectuates the ancillary activity (e.g., a user interface including functionality to carry out the ancillary activity). In a specific example, if the ancillary activity comprises a notification including notification content, the activity module 310 transmits the notification including the notification content to the secondary user device to be presented to the user. Thus, in this example, the activity module 310 causes presentation of the notification on the secondary user device. [00198] Although the operations 540-560 are directed to performing a single ancillary activity on a single secondary user device, other embodiments include identifying multiple secondary devices and generating multiple ancillary activities to be performed in real time on one or more of the identified secondary user devices. For example, if the user is carrying a smart watch and wearing smart glasses, the generated ancillary activities can be distributed among the identified secondary user devices. For instance, haptic-based notifications can be directed to the user’s smart watch while visual notifications are directed to the user’s smart glasses. In this instance, the activity module 310 distributes the multiple ancillary activities among the multiple secondary user devices based on respective device functionalities of the multiple secondary user devices. The activity module 310 employs other factors to distribute the multiple ancillary activities, such as the user preference.
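By way of non-limiting illustration, distributing multiple ancillary activities by device functionality may be sketched as a matching over required capabilities; the device and activity names below are assumptions for illustration only:

```python
# Hypothetical sketch of distributing multiple ancillary activities
# among secondary devices by matching each activity's required
# functionality to a device that provides it. Names are illustrative.
def distribute_activities(activities, devices):
    """Map each activity to the first device supporting its requirement."""
    assignment = {}
    for activity, required in activities.items():
        for device, capabilities in devices.items():
            if required in capabilities:
                assignment[activity] = device
                break
    return assignment

devices = {"smart_watch": {"haptic"}, "smart_glasses": {"visual"}}
activities = {"vibration_alert": "haptic", "score_overlay": "visual"}
print(distribute_activities(activities, devices))
# {'vibration_alert': 'smart_watch', 'score_overlay': 'smart_glasses'}
```

This mirrors the smart watch/smart glasses example: haptic notifications go to the watch, visual notifications to the glasses.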
[00199] To help illustrate the concepts described above, FIG. 11 illustrates non-limiting examples of generating the ancillary activity for the secondary user device, according to some example embodiments. FIG. 11 includes a scene 1100 that depicts a living room attached to an open kitchen. The scene 1100 includes a smart television (TV) 1110, a media entertainment device 1120, a lamp 1130, a mobile computer 1140, a wearable device 1150, a user 1160, a mobile device 1170, a smart refrigerator 1180, and a kitchen display 1190. Each of the devices of FIG. 11 can be attribute sources coupled to a network (e.g., the network 104) and operable to communicate with the data mesh system 150. In various example embodiments, the user 1160 may be carrying or wearing a smart device such as the wearable device 1150 (e.g., a mobile device, a wearable device, an NFC-enabled smart ring) on their person that provides real-time data corresponding to the user 1160. For instance, the user 1160 may be carrying a mobile device that provides real-time location data (e.g., as determined by a GPS component, beacon location detection, or other location services).
[00200] In an example embodiment, the user 1160 may be browsing a webpage using the mobile computer 1140. The mobile computer 1140 can be coupled to the network 104, and the activity module 310 detects the device activity of browsing a particular webpage. The preference module 320 infers a user preference associated with the device activity of browsing a particular webpage. For instance, the preference module 320 infers that the user has a preference for viewing supplemental content associated with the particular webpage the user is browsing or supplemental functionality associated with the particular webpage (e.g., liking, favoriting, or sharing the particular webpage).
[00201] Based on the inferred user preference, the device module 330 identifies the secondary user device, such as the wearable device 1150, that is capable of performing the ancillary activity. The device module 330 identifies the wearable device 1150 based on the wearable device 1150 being within an operating distance of the user 1160 (the reasoning being that if the wearable device 1150 is within a short distance of the user 1160, then the wearable device 1150 is capable of providing information to or receiving input from the user 1160). For example, the device module 330 determines the user 1160 is wearing the wearable device 1150 based on various sensor data received from the wearable device 1150 (e.g., biometric data that indicates a particular user is wearing the device, accelerometer data that indicates the device is in use, BLUETOOTH® device detections that indicate the wearable device 1150 is within short-range communication range of another user device that the device module 330 has inferred the user is using).
[00202] Once the device module 330 identifies the wearable device 1150, the activity module 310 generates the ancillary activity based on the device functionality of the identified secondary user device, the device activity, the user preference, and other data. For example, the wearable device 1150 may not have an audio output; in that case, content including an audio component can be altered or modified to accommodate the functionality of the wearable device 1150. Continuing with the example above, the activity module 310 generates the ancillary activity to include options to favorite, like, or share (e.g., tweet) the particular webpage the user 1160 is browsing. In some instances, the activity module 310 generates the ancillary activity based on the device activity. For example, if the device activity of browsing a particular webpage already includes options to favorite or share the webpage, the activity module 310 generates the ancillary activity to include functionality not already available at the user device (the mobile computer 1140 in this case). After the activity module 310 generates the ancillary activity, the activity module 310 transmits or otherwise communicates instructions to the wearable device 1150 to perform the ancillary activity. In this example, the activity module 310 transmits instructions to the wearable device 1150 to present a user interface including the options to favorite, like, or share the particular webpage the user 1160 is browsing on the mobile computer 1140.
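By way of non-limiting illustration, the filtering described above, keeping only functionality not already available at the user device, may be sketched as follows; the option names are assumptions for illustration only:

```python
# A minimal sketch: generate the ancillary activity from candidate
# options, keeping only functionality not already available at the
# user device. Option names are illustrative assumptions.
def ancillary_options(candidates, available_on_user_device):
    return [o for o in candidates if o not in available_on_user_device]

# The webpage already offers "favorite" and "share", so only "like"
# remains for the wearable device:
print(ancillary_options(["favorite", "like", "share"], ["favorite", "share"]))
# ['like']
```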
[00203] FIGS. 12 and 13 depict example user interfaces for interactively presenting information to the user. Although FIGS. 12 and 13 depict specific example user interfaces and user interface elements, these are merely non-limiting examples, and many other alternate user interfaces and user interface elements can be generated by the presentation module 210 and presented to the user. It will be noted that alternate presentations of the displays of FIGS. 12 and 13 can include additional information, graphics, options, and so forth; other presentations include less information, or provide abridged information for easy use by the user.
[00204] FIG. 12 depicts an example device 1200 (e.g., smart watch) displaying an example user interface 1210 that includes ancillary or supplemental functionality. For example, the user interface 1210 includes user interface elements 1220, 1230, and 1240 that provide the user with options to like (e.g., like a particular webpage or a particular location), share (e.g., tweet a link to a particular location or a webpage), or return to home (e.g., navigate a particular webpage on the user device using the secondary user device), respectively. For example, activating the user interface element 1220 causes the secondary user device to “like” an item associated with the device activity (e.g., a particular webpage) on a social network service the user is a member of. The user interface 1210 can include a wide variety of other functionality.
[00205] In some cases, the ancillary activity comprises a notification. In these cases, the activity module 310 causes presentation of the notification to the user. For instance, the activity module 310 communicates, to the device 1200, instructions to present the notification. In some instances, the instructions include notification content, generated by the activity module 310, such as a message (e.g., pertinent information) to be presented to the user. In example embodiments, the notification comprises a text message, such as Short Message Service (SMS) messages, Multimedia Messaging Service (MMS) messages, Enhanced Messaging Service (EMS) messages, and so forth. In other example embodiments, the notification comprises a push notification or another similar type of notification. In further example embodiments, the notification comprises interactive user interface elements.
[00206] FIG. 13 depicts an example device 1300 displaying an example user interface 1310 that includes ancillary or supplemental content. For example, the user interface 1310 includes supplemental content associated with directions to a particular location. For instance, if the user is attempting to physically locate a particular place using a smart phone, the ancillary activity includes providing additional or supplemental content on a wearable device the user may be wearing. In the example of FIG. 13, the user interface 1310 includes a heading determined via sensors included in the wearable device or from data received from the smart phone. The supplemental content can also include pertinent information such as a distance to the specified location. In this way, the wearable device performs the ancillary activity associated with the device activity being performed in real time by the user device of the user.
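By way of non-limiting illustration, assembling a notification payload of the kinds named in paragraph [00205] (SMS, MMS, EMS, push) may be sketched as follows; the channel names and payload fields are assumptions for illustration only:

```python
# Hypothetical sketch of a notification payload; field names are
# illustrative assumptions, not a real messaging API.
def build_notification(message, channel="push"):
    if channel not in {"sms", "mms", "ems", "push"}:
        raise ValueError("unsupported channel")
    return {
        "channel": channel,
        "content": message,
        # In this sketch, only push notifications carry interactive
        # user interface elements.
        "interactive": channel == "push",
    }

note = build_notification("Current score: 3-1")
print(note["channel"], note["interactive"])  # push True
```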
[00207] Turning now to another example embodiment, FIG. 14 is a block diagram of the user analytics system 270, which may provide functionality to identify items that the user has demand for and facilitate a purchase associated with the identified items. The user analytics system 270 may include an item module 1410, an analysis module 1420, and an order module 1430.
[00208] The item module 1410 in the user analytics system 270 may provide functionality to facilitate identifying items from the attribute data. For example, the item module 1410 extracts demand indications, from the attribute data, that may indicate anticipatory demand by the user for a particular item. In a specific example, the demand indications may indicate use of a particular item by the user, a user supply of a particular item, a user activity indicative of a particular item (e.g., frequent visits to a beach may be indicative of demand for sunscreen products and other beach-related products), and other indications of demand for various items. The item module 1410 extracts the demand indications from many different portions of attribute data and combinations of portions of attribute data using a variety of schemes and techniques.
[00209] The analysis module 1420 provides functionality to identify items from the attribute data. For example, the analysis module 1420 identifies the commerce item or a pertinent item based on the demand indications, the user characteristics, the attribute data, or any suitable combination thereof. In further example embodiments, the analysis module 1420 calculates a demand metric based on the demand indications. In some implementations, the user analytics system 270 performs a variety of tasks and functions based on the demand metric, such as various aspects of facilitating the purchase associated with the commerce item.
[00210] The order module 1430 provides functionality to facilitate the user purchase associated with the commerce item. For example, the order module 1430 determines order parameters or transaction parameters for the user purchase based on the user characteristics, the attribute data, the demand indications, or other data. In some example embodiments, the order module 1430 automatically (e.g., without intervention or action of the user) performs the user purchase on behalf of the user based on various triggers or analyses. [00211] FIG. 15 is a flow diagram illustrating an example method 1500 for identifying the commerce item from the attribute data and facilitating the user purchase associated with the commerce item. At operation 1510, the attribute module 230 receives attribute data associated with the user from a plurality of attribute sources. In various example embodiments, at least a portion of the attribute data includes real-time data or near real-time data. The term “real-time data,” as used herein, is intended to include data associated with an event currently happening. For example, the real-time data may include user input data or sensor data communicated to the attribute module 230 after a delay interval (e.g., due to transmission delay or other delays such as being temporarily stored at an intermediate device) between capturing the data and the attribute module 230 receiving the data.
[00212] As will be discussed in connection with FIGS. 52 and 53, the attribute data is received from a broad spectrum of attribute sources (e.g., devices, sensors, servers, databases, and other sources). Additionally, the attribute module 230 may receive the attribute data via many pathways resulting from an assortment of configurations of the attribute sources, as further discussed in connection with FIGS. 51A and 51B. In an example embodiment, the attribute module 230 receives the attribute data directly from the attribute sources. In other example embodiments, the attribute module 230 receives the attribute data from a central device that receives attribute data from a plurality of user devices. In still other example embodiments, various user devices are communicatively coupled in a decentralized device-to-device mesh, and the attribute module 230 receives the attribute data corresponding to a particular device in the mesh from any of the devices in the mesh. In still other examples, the attribute module 230 receives the attribute data from the attribute sources in many other configurations, including various suitable combinations of configurations.
[00213] In various example embodiments, the attribute module 230 stores the attribute data in association with the user (e.g., indexed based on a user identifier) for subsequent analysis. The attribute module 230 may store the attribute data in a storage device such as the database(s) 126, for example. In some implementations, the attribute module 230 accesses the stored attribute data using a variety of search or find schemes. For instance, the attribute data associated with a particular user is accessed using a user identifier that corresponds to the particular user. It will be noted that the collective, aggregated attribute data may be referred to as a “data mesh.” [00214] At operation 1520, the item module 1410 extracts demand indications from the attribute data. In some example embodiments, the demand indications are indicative of anticipatory demand by the user for a particular item. For instance, a particular demand indication may indicate that the user may want, desire, or have an affinity for a particular product or commerce item.
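By way of non-limiting illustration, the per-user indexing described in paragraph [00213] may be sketched as follows; the class and field names are assumptions for illustration only:

```python
from collections import defaultdict

# Hypothetical sketch of storing attribute data indexed by a user
# identifier, so later analyses can retrieve a user's aggregated
# "data mesh". Class and record fields are illustrative assumptions.
class AttributeStore:
    def __init__(self):
        self._by_user = defaultdict(list)

    def store(self, user_id, record):
        self._by_user[user_id].append(record)

    def lookup(self, user_id):
        """All attribute data aggregated for one user."""
        return list(self._by_user[user_id])

store = AttributeStore()
store.store("user-1", {"source": "gps", "place": "coffee shop"})
store.store("user-1", {"source": "fridge", "item": "milk", "level": 0.2})
print(len(store.lookup("user-1")))  # 2
```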
It is noted that the terms “item,” “product,” “commerce item,” and the like are intended to include a wide variety of products (e.g., items corresponding to item listings published on an e-commerce website) and services (e.g., a particular activity such as going to a restaurant). It is also noted that the terms “anticipatory” and “predictive” as used herein are intended to pertain to future events or activity, including events that are in the immediate future (e.g., events within a short time period, such as minutes or seconds, of the present) as well as events further in the future (e.g., months or years from the present).
[00215] The item module 1410 extracts the demand indications from a wide variety of data included in the attribute data, such as purchase histories, location data (e.g., as determined by a GPS component of a mobile device, beacon detections, or other location services), social media data (e.g., check-ins or postings by the user), as well as other data included in the attribute data as discussed herein. The demand indications include, for example, inventory level indications (e.g., a food supply of the user as indicated by a smart refrigerator), item usage indications (e.g., user purchase history may indicate certain item usage patterns), item activity indications (e.g., activity related to an item, such as the user spending time on the ski slopes, may indicate a demand for ski equipment), user engagement data (e.g., the user clicking on particular links associated with various products or activities), and so forth. For instance, location data included in the attribute data may indicate frequent trips to coffee shops. In this instance, the item module 1410 extracts the location data from the attribute data since it may be indicative of demand for coffee or coffee-related products by the user. In another example, social media data, such as check-ins to a gym or postings about fitness activities, may indicate demand for fitness-related items or activities. [00216] In a specific example, the user may currently possess a sufficient supply of bottled water, but based on indications of consumption rates of bottled water by the user, the user may have future demand for the bottled water. Thus, in this example, the item module 1410 extracts supply indications for the bottled water (e.g., purchase history data of the user) or consumption indications for the bottled water (e.g., inventory activity data as retrieved or accessed from a smart refrigerator) from the attribute data.
That is to say, the demand indications, extracted by the item module 1410, may include the supply indications, inventory level indications, or inventory activity indications for the bottled water.
[00217] At operation 1530, the analysis module 1420 identifies the commerce item, product, or pertinent item from the attribute data based on the extracted demand indications. For instance, the analysis module 1420 determines that there may be a high likelihood that the user is interested in or has demand for a particular item. In other words, the analysis module 1420 identifies the commerce item or multiple commerce items from among a plurality of commerce items associated with the demand indications based on user demand for respective commerce items included in the plurality of commerce items.
[00218] The analysis module 1420 identifies the commerce item based on the demand indications using a variety of schemes and techniques. For example, the analysis module 1420 may calculate the demand metric based on the demand indications. The demand metric may indicate a likelihood that the user has a demand for a particular item. For instance, the demand metric may be based on an occurrence count of demand indications that correspond to a particular item (e.g., a particular item with multiple corresponding demand indications may be associated with a higher demand metric than a particular item with a single corresponding demand indication). In some example embodiments, the analysis module 1420 may identify the commerce item based on the calculated demand metric exceeding a threshold (e.g., a predefined or dynamically determined value). [00219] In another example embodiment, the analysis module 1420 ranks, sorts, or otherwise orders at least a portion of the plurality of commerce items associated with the demand indications based on the demand metric. In this example embodiment, the analysis module 1420 identifies the first, a predefined number, or a dynamically determined number of the highest-ranking commerce items included in the plurality of commerce items associated with the demand indications, either alone or in any suitable combination. For instance, the analysis module 1420 identifies the commerce item from among the plurality of commerce items associated with the demand indications based on a statistical analysis such as a percentage (e.g., top ten percent of the ranked plurality of commerce items), analysis based on standard deviations away from a mean, or other statistical methods.
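By way of non-limiting illustration, the occurrence-count demand metric and threshold test of paragraph [00218] may be sketched as follows; the indication values and threshold are assumptions for illustration only:

```python
from collections import Counter

# A minimal sketch of the occurrence-count demand metric: count demand
# indications per item and keep items whose count meets a threshold.
# The indication values are illustrative assumptions.
def identify_commerce_items(demand_indications, threshold=2):
    metric = Counter(demand_indications)
    # most_common() also yields the ranking described for [00219].
    return [item for item, count in metric.most_common() if count >= threshold]

indications = ["coffee", "coffee", "coffee", "sunscreen", "skis", "skis"]
print(identify_commerce_items(indications))  # ['coffee', 'skis']
```

An item with three indications (coffee) outranks an item with two (skis), and a single indication (sunscreen) falls below the threshold, matching the single-versus-multiple-indication example in the text.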
[00220] In further example embodiments, the demand indications are weighted such that higher-weighted demand indications may be more influential in the analysis module 1420 identifying the commerce item based on the demand indications. The weighting can be predefined or dynamically determined based on user feedback data (e.g., data that indicates whether the user actually had demand for the commerce item identified by the analysis module 1420). In some implementations, the feedback data is included in the attribute data subsequent to the analysis module 1420 identifying the commerce item. In this way, the analysis module 1420 can adapt, learn, or evolve as more of the attribute data is received. In some example embodiments, the analysis module 1420 employs various machine-learning techniques to enhance identifying the commerce item based on the demand indications. Similar techniques may be applied by the item module 1410 to extract the demand indications in the previous operation.
[00221] At operation 1540, the characteristic module 240 infers or directly measures user characteristics pertaining to the user from the attribute data. As discussed above in connection with operation 610, the characteristic module 240 can infer a variety of user characteristics from the attribute data using a variety of data. It will be understood that the discussion of operation 610 regarding the characteristic module 240 applies equally to operation 1540.
[00222] At operation 1550, the order module 1430 facilitates the user purchase, or suggested transaction for the user, associated with the commerce item based, at least in part, on the user characteristics. Facilitating a particular purchase is intended to include actions such as automatically (e.g., without intervention or action of the user) performing the particular purchase on behalf of the user, causing presentation of a notification that includes the option to make the particular purchase, or other actions associated with facilitating the particular purchase (e.g., causing presentation of an advertisement, adjusting item listing search results of the user to emphasize item listings associated with a particular item). In further example embodiments, the order module 1430 determines various parameters associated with the user purchase (e.g., order parameters or transaction parameters) based on the attribute data, the user characteristics, the demand indications, or other data. In the discussion below, additional aspects of facilitating the user purchase are described.
[00223] FIG. 16 is a flow diagram illustrating further operations for facilitating the user purchase based, at least in part, on an evaluation of an inventory level, according to some example embodiments. At operation 1610, the item module 1410 extracts a current inventory level of the commerce item from the attribute data. For example, the item module 1410 extracts a quantity of the commerce item from the attribute data that includes inventory data received from a smart refrigerator. In other examples, the attribute data includes inventory indications from sensors that directly monitor or measure the commerce item. In a specific example, brake pads in an automobile of the user can be monitored via a sensor operable to indicate a condition of the brake pads (e.g., needing replacement). In another example, another user that may be associated with the user may purchase a particular item for which the user has demand. In this example, the user may no longer have demand for the particular item based on the purchase by the other user (e.g., a shared shopping list between family members).
[00224] At operation 1620, the analysis module 1420 determines an inventory threshold for the commerce item by modeling usage of the commerce item based on the extracted current inventory level and the inferred user characteristics. In an example embodiment, the analysis module 1420 determines the inventory threshold such that when the current inventory level may be below the threshold, the inventory may need to be reordered to avoid depletion. For instance, the analysis module 1420 calculates a usage rate corresponding to the commerce item based on the attribute data, the user characteristics, or other data and applies the usage rate to determine the inventory threshold for the commerce item to avoid depletion of a supply of the commerce item. For example, the analysis module 1420 determines the usage rate for a particular item based on historical purchase history data and infers the usage rate based on a frequency of purchase of the particular item. In a specific example, the commerce item may be coffee beans, and the analysis module 1420 determines that based on the user’s current supply of coffee beans and the user’s rate of consumption (e.g., the usage rate) corresponding to coffee beans, the user may run out in fourteen days. In this specific example, the analysis module 1420 determines the inventory threshold to be a value, such as a quantity of the coffee beans, that may be the current inventory level several days prior to depletion of the supply of coffee beans.
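By way of non-limiting illustration, the usage-rate modeling of operation 1620 may be sketched with the coffee-bean example; the quantities, units, and lead time below are assumptions for illustration only:

```python
# A minimal sketch of operation 1620: a usage rate projects the days
# until depletion, and the inventory threshold is set a few days of
# usage above empty. Quantities and units are illustrative assumptions.
def days_until_depletion(current_level, usage_rate_per_day):
    return current_level / usage_rate_per_day

def inventory_threshold(usage_rate_per_day, reorder_lead_days=3):
    """Level at which reordering avoids depletion of the supply."""
    return usage_rate_per_day * reorder_lead_days

# Coffee-bean example: 700 g on hand at 50 g/day lasts 14 days, and
# the threshold of 150 g sits three days of usage before depletion.
print(days_until_depletion(700, 50))  # 14.0
print(inventory_threshold(50))        # 150
```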
[00225] At operation 1630, the analysis module 1420 identifies a mismatch between the inventory threshold and the current inventory level. For example, if the analysis module 1420 determines that the current inventory level is below the inventory threshold, the analysis module 1420 identifies the mismatch on that basis.
[00226] At operation 1640, the order module 1430 automatically (e.g., without intervention or action of the user) performs the user purchase on behalf of the user based on the mismatch, according to some implementations. In some example embodiments, the order module 1430 accounts for shipping delays and other delays so as to avoid the current inventory level of the commerce item falling below the inventory threshold. For instance, the analysis module 1420 increases the inventory threshold to account for delays in receiving an order for a particular item.
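By way of non-limiting illustration, the mismatch test of operations 1630 and 1640, with the threshold padded for shipping delay as described above, may be sketched as follows; the parameter names and values are assumptions for illustration only:

```python
# Hypothetical sketch of operations 1630-1640: a mismatch (current
# level below the threshold) triggers an automatic reorder, with the
# threshold increased by expected consumption during shipping.
def should_reorder(current_level, threshold,
                   shipping_delay_days=0, usage_rate_per_day=0):
    padded_threshold = threshold + shipping_delay_days * usage_rate_per_day
    return current_level < padded_threshold

print(should_reorder(140, 150))  # True: plain mismatch
print(should_reorder(160, 150, shipping_delay_days=2,
                     usage_rate_per_day=10))  # True: mismatch after padding
print(should_reorder(200, 150, shipping_delay_days=2,
                     usage_rate_per_day=10))  # False: ample supply
```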
[00227] FIG. 17 is a flow diagram illustrating further operations for facilitating the purchase, including operations to determine parameters for the purchase, according to some example embodiments. At operation 1710, the order module 1430 determines at least one order parameter based, at least in part, on the user characteristics. The order parameter may include at least one of a quantity, a delivery time, a payment time, a delivery method, a delivery destination, a merchant, a brand, a price, an item color, an item style, and so on. For example, the user characteristics may indicate that the user may wear a certain garment size. In this example, the order module 1430 specifies order parameters for clothing or apparel according to the garment size. Similarly, the order module 1430 may specify a brand for the user purchase based on the user characteristics (e.g., historical brand purchases by the user or an analysis of the user’s style and the brands that conform to that style). In another example, the user characteristics may indicate that the user may be minimizing cost above other considerations. In this example, the order module 1430 identifies a lower-cost option as opposed to a speedier option (e.g., wait for a sale for a particular item or use the cheapest shipping). In another example, the user characteristics may indicate that the shipping speed may be important for certain items (e.g., a trendy new mobile device that the user may want right away). In this example, the order module 1430 determines the delivery method parameter based on how fast the order may be delivered. In another example embodiment, the order module 1430 specifies the delivery location for the user purchase based on the user characteristics (e.g., if the user purchase relates to items for the user’s work, the delivery location may be the user’s work location rather than a home address).
The order module 1430 may determine many other order parameters based on the user characteristics.
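By way of non-limiting illustration, the parameter selection of operation 1710 may be sketched as follows; the characteristic keys and parameter values are assumptions for illustration only:

```python
# A minimal sketch of operation 1710: choosing order parameters from
# inferred user characteristics. Keys and values are illustrative
# assumptions, not an exhaustive parameter set.
def determine_order_parameters(characteristics):
    params = {}
    if "garment_size" in characteristics:
        params["size"] = characteristics["garment_size"]
    # Cost-sensitive users get the cheapest shipping; others the fastest.
    params["delivery_method"] = (
        "economy" if characteristics.get("cost_sensitive") else "express")
    # Work-related purchases ship to the work address instead of home.
    params["delivery_destination"] = (
        "work" if characteristics.get("work_related") else "home")
    return params

print(determine_order_parameters({"garment_size": "M", "cost_sensitive": True}))
# {'size': 'M', 'delivery_method': 'economy', 'delivery_destination': 'home'}
```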
[00228] At operation 1720, the order module 1430 facilitates the purchase according to the determined order parameters. For example, the order module 1430 may recommend the user purchase to the user, with the recommendation including the determined order parameters (e.g., providing the user with a notification including the option to make the user purchase). In another example, the order module 1430 automatically makes the user purchase on behalf of the user. For instance, if the order module 1430 determines that the user purchase may be urgent (e.g., to avoid depletion of a supply of a certain item), routine (e.g., purchasing bottled water), or specified by the user in advance as a permissible automatic purchase, the user purchase may be made automatically to avoid burdening the user with the decision to make the user purchase.
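The automatic-purchase decision just described can be sketched as a simple gate over the three cases named above (urgent, routine, or pre-authorized); the function name and dictionary fields are assumed for illustration only:

```python
# Illustrative sketch: deciding whether a purchase may be made
# automatically on the user's behalf. Field names are hypothetical.
def may_auto_purchase(purchase: dict, preauthorized: set) -> bool:
    if purchase.get("urgent"):    # e.g., supply about to be depleted
        return True
    if purchase.get("routine"):   # e.g., regularly purchased bottled water
        return True
    # Otherwise, only if the user authorized this category in advance.
    return purchase.get("category") in preauthorized

auto = may_auto_purchase({"category": "groceries", "routine": True}, set())
```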
[00229] FIG. 18 is a flow diagram illustrating further operations for determining order parameters, including operations to determine a temporal parameter associated with the purchase, according to some example embodiments. At operation 1810, the analysis module 1420 identifies a purchase motive of the user for the commerce item by analyzing the user characteristics. In various example embodiments, the purchase motive corresponds to a motive time. For example, the user may be planning a vacation that includes a beach destination. The vacation may be indicative of the purchase motive of the user for items associated with the vacation (e.g., sunscreen for a beach vacation, snacks for a road trip, Broadway tickets for a New York City trip). In this example, the motive time corresponds to the beginning of the vacation (e.g., as determined by user calendar information included in the attribute data or purchase history data such as plane ticket information).
[00230] At operation 1820, the order module 1430 determines temporal order parameters based on the motive time. In the example above, where the user may be planning a vacation, the order module 1430 determines the temporal order parameters so that the items corresponding to the user purchase arrive prior to the vacation. In another example, if the purchase motive is associated with an event, such as a graduation party, the order module 1430 may determine the temporal order parameters such that the items associated with the user purchase are delivered prior to the graduation party, since the user may no longer have demand for the particular item after a certain time.
[00231] At operation 1830, the order module 1430 facilitates the purchase according to the determined temporal order parameters. For example, the order module 1430 schedules the items corresponding to the user purchase to arrive at a certain time.
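Operations 1810 to 1830 can be sketched as scheduling a delivery deadline shortly before the motive time, so items arrive before the vacation or event. The buffer of two days is an assumed default, not a disclosed value:

```python
# Illustrative sketch: choosing a delivery deadline from a motive time
# (e.g., the start of a vacation) so items arrive before the event.
from datetime import date, timedelta

def temporal_order_parameters(motive_time: date, buffer_days: int = 2) -> dict:
    # Schedule delivery with a small buffer before the motive time.
    return {"deliver_by": motive_time - timedelta(days=buffer_days)}

# Example: a vacation beginning July 10 yields a July 8 delivery deadline.
params = temporal_order_parameters(date(2015, 7, 10))
```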
[00232] FIG. 19 is a flow diagram illustrating further operations to facilitate the purchase based, at least in part, on purchase criteria, according to some example embodiments. At operation 1910, the analysis module 1420 accesses purchase criteria corresponding to the user. The purchase criteria, for example, include predefined criteria, user-specified criteria, or dynamically determined criteria, either alone or in any suitable combination. For example, the purchase criteria may include temporal-based criteria (e.g., criteria that specify making the user purchase within certain time periods), budget criteria (e.g., spending limits associated with particular items or categories of items), and context-based criteria (e.g., adjusting the budget criteria based on the user's current location), among other purchase criteria. In a specific example, the user specifies a budget for a particular category or good (e.g., transportation, food, utilities, housing, entertainment, travel, health), a total budget, or a monthly budget. In addition, the user can specify rules-based criteria such as a particular time to make certain purchases (e.g., after a paycheck is deposited).
[00233] In further example embodiments, the analysis module 1420 identifies, and includes in the purchase criteria, patterns in the user's purchasing habits, objectives, goals, or dynamically generated criteria. In a specific example, the analysis module 1420 determines that the user may be afflicted with a medical condition, such as a peanut allergy. In this scenario, the analysis module 1420 includes a criterion in the purchase criteria to avoid items that contain peanuts. In another example, the analysis module 1420 determines that the user may be attempting to maintain a vegan diet, and the analysis module 1420 may avoid food items that are contrary to the goal of maintaining a vegan diet.
[00234] At operation 1920, the order module 1430 automatically purchases the commerce item on behalf of the user according to the purchase criteria. In an example embodiment, the order module 1430 determines satisfaction of the purchase criteria prior to facilitating the user purchase for the user. For example, the order module 1430 determines that a particular budget criterion included in the purchase criteria has been exceeded, and the order module 1430 may not perform the user purchase on that basis. In other words, the order module 1430 facilitates the user purchase based on the determined satisfaction of the purchase criteria.
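The budget check of operation 1920 can be sketched as a criteria-satisfaction test run before any automatic purchase. The structure of the criteria dictionary is an assumption for illustration:

```python
# Illustrative sketch: checking purchase criteria (here, a per-category
# budget) before an automatic purchase. Field names are hypothetical.
def satisfies_criteria(purchase: dict, criteria: dict, spent: dict) -> bool:
    category = purchase["category"]
    budget = criteria.get("budgets", {}).get(category)
    if budget is None:
        return True  # no budget criterion applies to this category
    # Reject the purchase if it would exceed the category budget.
    return spent.get(category, 0) + purchase["price"] <= budget

# Example: $150 already spent against a $200 food budget.
ok = satisfies_criteria(
    {"category": "food", "price": 30},
    {"budgets": {"food": 200}},
    {"food": 150},
)
```

A fuller embodiment would also evaluate temporal and context-based criteria; the same pattern of testing each criterion before purchasing applies.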
[00235] Turning now to FIG. 20, the data mesh system 150 may implement various suitable combinations of the operations discussed above to identify the commerce item and facilitate the user purchase. This applies equally to the operations discussed above as well as the operations in the following discussion. FIG. 20 is a flow diagram illustrating an example method 2000 of one such combination of the operations, although many other suitable combinations may be employed. At the operation 1510, the attribute module 230 receives the attribute data associated with the user. At the operation 1520, the item module 1410 extracts the demand indications from the attribute data. At the operation 1530, the analysis module 1420 identifies the commerce item from the attribute data based on the demand indications. At the operation 1540, the characteristic module 240 infers the user characteristics from the attribute data. As shown in FIG. 20, various combinations of the above operations may be employed to facilitate the user purchase at the operation 1550.
[00236] In the example method 2000, to facilitate the user purchase, the item module 1410 extracts the current inventory level of the commerce item at the operation 1610. As described above, at the operation 1620, the analysis module 1420 determines the inventory threshold for the commerce item. Subsequently, at the operation 1630, the analysis module 1420 identifies the mismatch between the inventory threshold and the current inventory level. At the operation 1640, if the analysis module 1420 identifies the mismatch, the user analytics system 270 may proceed to the operation 1710. Alternatively, if the analysis module 1420 does not identify the mismatch, no subsequent operation may be performed.
[00237] Subsequent to determining the mismatch, at the operation 1710, the order module 1430 determines the order parameters of the user purchase based on the user characteristics, in an example embodiment. In some example embodiments, this may involve the operations 1810, 1820, and 1830, respectively, to determine the temporal order parameters that may be included in the order parameters. At the operation 1720, the order module 1430 facilitates the user purchase according to the order parameters.
[00238] Finally, at the operation 1910, the analysis module 1420 accesses the purchase criteria, and at the operation 1920, the order module 1430 facilitates the user purchase according to the purchase criteria. Thus, FIG. 20 shows an example embodiment where various ones of the above operations may be employed in conjunction with each other to facilitate the user purchase.
[00239] FIG. 21 is a flow diagram illustrating an alternative example method 2100 for identifying the commerce item and facilitating the user purchase, according to some example embodiments. The example method 2100 may involve operations similar to those described above. At operation 2110, similar to the operation 1510, the attribute module 230 receives or accesses the attribute data associated with the user. At operation 2120, similar to the operation 1540, the characteristic module 240 infers the user characteristics pertaining to the user from the attribute data.
[00240] At operation 2130, the analysis module 1420 identifies similar users that are similar to the user based on the inferred user characteristics and respective user characteristics of a plurality of other users. The analysis module 1420 identifies the similar users based on a variety of factors. In some example embodiments, the analysis module 1420 accesses the attribute data or stored user characteristics corresponding to the plurality of other users. For example, the analysis module 1420 identifies the similar users from among the plurality of other users based on the inferred user characteristics of the user and the respective user characteristics of the plurality of other users. The analysis module 1420 may correlate, match, or otherwise compare the inferred user characteristics with the respective user characteristics of the plurality of other users to identify the similar users. In various example embodiments, the analysis module 1420 identifies the similar users based on the same or similar demographic data (e.g., same or similar age, marital status, gender, geographic location, etc.), same or similar user characteristics (e.g., same or similar brand purchases), same or similar attribute data, and so on.
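The comparison of operation 2130 can be sketched as a similarity score over shared characteristics. The overlap measure below is a toy assumption; an actual embodiment might weight characteristics or use a learned similarity measure:

```python
# Illustrative sketch: identifying similar users by the fraction of
# matching characteristics. The threshold and measure are assumptions.
def similarity(a: dict, b: dict) -> float:
    keys = set(a) | set(b)
    if not keys:
        return 0.0
    matches = sum(1 for k in keys if a.get(k) == b.get(k))
    return matches / len(keys)

def similar_users(user: dict, others: dict, threshold: float = 0.5) -> list:
    # Keep the identifiers of users whose characteristics overlap enough.
    return [uid for uid, chars in others.items()
            if similarity(user, chars) >= threshold]

# Example: u1 shares age and location with the user; u2 shares nothing.
peers = similar_users(
    {"age": 30, "location": "SF", "brand": "X"},
    {"u1": {"age": 30, "location": "SF", "brand": "Y"},
     "u2": {"age": 55, "location": "NY", "brand": "Z"}},
)
```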
[00241] At operation 2140, the analysis module 1420 identifies the commerce item from the attribute data based on the user characteristics of the similar users and the demand indications. For example, the demand indications may include a particular item that may not appear particularly significant based on the demand indications alone (e.g., the demand metric may be particularly low for the particular item). However, the analysis module 1420 may identify this particular item based on the user characteristics of the similar users indicating that the particular item may be of significance. In other words, although the demand indications did not show a strong demand for the particular item, the user characteristics of the similar users indicated that the user may have strong demand for the particular item.
[00242] In a specific example, the demand indications may indicate the user has demand for a pair of sunglasses. The demand indications in this example may further indicate the user may be interested in brands X, Y, and Z, with a particular emphasis on brand X. The user characteristics of the similar users (e.g., users of the same or similar age, location, gender, other demographic information, or similar purchasing preferences) may indicate that brand Z may be in high demand among the users similar to the user. On that basis, the analysis module 1420 may identify brand Z sunglasses as the commerce item.
[00243] At operation 2150, the order module 1430 determines the order parameters or transaction parameters based on the user characteristics of the similar users. For example, the delivery method may be determined based on the user characteristics of the similar users. For instance, if the similar users frequently choose a speedy delivery method for a particular item (e.g., new electronics), the order module 1430 may determine a speedy delivery method corresponding to the user purchase for the commerce item that may be the same as or similar to the particular item.
[00244] Similarly, the purchase criteria may include dynamically determined criteria based on the user characteristics of the similar users. That is to say, the analysis module 1420 may dynamically generate a portion of the purchase criteria based on the similar users. For example, a default budget for particular categories of items may be determined based on an analysis of the user characteristics of the similar users (e.g., other users with similar demographic information to the user may, on average, spend a certain amount per category of good).
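The dynamically generated default budget just described can be sketched as an average over the spend of similar users; the averaging heuristic and field names are assumptions:

```python
# Illustrative sketch: deriving a default category budget as the average
# spend of similar users in that category. Purely an assumed heuristic.
def default_budget(category: str, similar_user_spend: list) -> float:
    amounts = [s[category] for s in similar_user_spend if category in s]
    if not amounts:
        return 0.0  # no similar-user data for this category
    return sum(amounts) / len(amounts)

# Example: two similar users spent 100 and 300 on travel.
budget = default_budget(
    "travel", [{"travel": 100}, {"travel": 300}, {"food": 50}])
```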
[00245] At operation 2160, the order module 1430 facilitates the user purchase associated with the commerce item according to the determined order parameters. As discussed above, the order module 1430 facilitates the user purchase in a variety of manners, including automatically performing the user purchase on behalf of the user or causing presentation of a notification to the user that includes the option to make the user purchase according to the order parameters.
[00246] FIG. 22 is a flow diagram illustrating further operations to facilitate the purchase based, at least in part, on the demand metric, according to some example embodiments. At operation 2210, the analysis module 1420 calculates the demand metric of the identified item based on the demand indications corresponding to the identified item.
[00247] At operation 2220, the order module 1430 facilitates the user purchase based on the demand metric. For example, if the order module 1430 determines that the demand metric is high (e.g., exceeds a predefined or dynamically determined threshold), then the order module 1430 may facilitate the user purchase with more urgency than for a lower demand metric. For instance, the order module 1430 automatically performs the user purchase for the user based on a high demand metric, or more frequently causes presentation to the user of the notification that includes the option to make the user purchase (or with more emphasis, such as a more conspicuous notification or a larger user interface presentation to the user). In some instances, the order module 1430 determines the order parameters based on the demand metric. For instance, if the order module 1430 determines that the demand metric is high, then the order module 1430 may subsequently determine a speedier delivery option for the commerce item.
[00248] FIG.
23 is a flow diagram illustrating further operations to facilitate the purchase using a notification, according to some example embodiments. At operation 2310, the presentation module 210 generates the notification that includes an option to make the user purchase. The notification may include a user interface, a text message (Short Message Service (SMS), Multimedia Messaging Service (MMS), Enhanced Messaging Service (EMS), or other messaging modalities), and so on. In some example embodiments, the presentation module 210 may generate the content of the notification based on the commerce item, the user characteristics, the user characteristics of the similar users, the attribute data, and so forth. In various example embodiments, the notification may include the order parameters for the user purchase.
[00249] At operation 2320, the presentation module 210 causes presentation of the generated notification to the user. For example, the presentation module 210 may communicate instructions to present a user interface that includes the notification to a device of the user. In some example embodiments, the presentation module 210 may determine the device of the user on which to present the notification based on the user characteristics. For example, if the user has a preference for a particular device (e.g., a mobile device of the user), the presentation module 210 may cause presentation of the notification to that device. In further example embodiments, the notification may provide the user the option to specify or modify the order parameters of the user purchase.
[00250] At operation 2330, the presentation module 210 receives a user selection of the option to make the purchase. For example, if the user chooses to make the user purchase, the presentation module 210 may receive the user selection of the option to make the purchase and communicate the selection to the order module 1430 to perform the user purchase.
[00251] At operation 2340, the order module 1430 performs the purchase, according to some example embodiments. For example, the order module 1430 may make the user purchase according to the order parameters on behalf of the user.
[00252] FIG. 24 is a flow diagram illustrating further operations for presenting a notification, according to some example embodiments. At operation 2410, the presentation module 210 identifies presentation parameters for presentation of the notification. For example, the presentation module 210 may identify the presentation parameters based on the user characteristics, the user characteristics of the similar users, the attribute data, the demand indications, or other data. The presentation parameters may include a preferred device of the user on which to present the notification, a preferred time of day to present the notification, content preferences (e.g., do not present notifications regarding particular item categories), and so on. In a specific example, the user characteristics may indicate a work time period for the user. In this example, the notification may not be presented to the user during the work time period, as the user may not respond. In another example, the analysis module 1420 may identify a device status of a particular user device and, based on the device status, the presentation module 210 may route the notification to another device.
For instance, if the device status indicates that the device is inactive (e.g., being charged), the presentation module 210 may cause presentation of the notification to another device (e.g., an active device as determined by device sensors).
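The routing behavior of operations 2410 and 2420 can be sketched as a rule that suppresses notifications during the work period and prefers an active device. The device fields and routing rule below are illustrative assumptions:

```python
# Illustrative sketch: routing a notification to an active device outside
# the user's work period. Field names and rules are hypothetical.
def route_notification(devices: list, work_hours: range, hour: int):
    if hour in work_hours:
        return None  # suppress notifications during the work time period
    # Consider only devices currently active (e.g., per device sensors).
    active = [d for d in devices if d["active"]]
    if not active:
        return None
    # Prefer the user's preferred device, but only if it is active.
    preferred = [d for d in active if d.get("preferred")]
    return preferred[0] if preferred else active[0]

# Example: the preferred phone is inactive (charging), so the
# notification is routed to the active kitchen display at 7 p.m.
target = route_notification(
    [{"id": "phone", "active": False, "preferred": True},
     {"id": "kitchen_display", "active": True}],
    work_hours=range(9, 17), hour=19,
)
```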
[00253] At operation 2420, the presentation module 210 causes presentation of the notification according to the presentation parameters. For instance, the presentation module 210 may cause presentation of the notification to a preferred device of the user at a time of day at which the user is likely to respond to the notification, as determined based on an analysis of the user characteristics.
[00254] FIG. 25 is a flow diagram illustrating further operations for presenting a notification, according to some example embodiments. At operation 2510, the analysis module 1420 detects a trigger action of the user based on the real-time data included in the attribute data. For example, the analysis module 1420 may determine that the user may be moving into the kitchen (e.g., as determined by a BLUETOOTH® handshake between a mobile device the user may be wearing and a smart appliance located in the user's kitchen), which may be a good time to notify the user regarding food supplies.
[00255] At operation 2520, the presentation module 210 causes presentation of the notification to the user in response to detecting the trigger action. In other words, based on the analysis module 1420 detecting the trigger action of the user, the presentation module 210 may cause presentation of the notification to the user.
[00256] FIG. 26 is a flow diagram illustrating an example method 2600 showing communication between various devices in relation to presenting a notification to the user, according to some example embodiments. At operation 2610, attribute source 2602 communicates the attribute data to the data mesh system 150. As described above, at the operation 1510, the attribute module 230 may receive the attribute data associated with the user. At the operation 1520, the item module 1410 extracts the demand indications from the attribute data. At the operation 1530, the analysis module 1420 may identify the commerce item from the attribute data based on the demand indications. At the operation 1540, the characteristic module 240 infers the user characteristics from the attribute data.
[00257] In the example method 2600, the order module 1430 facilitates the user purchase by generating the notification at the operation 2620. The presentation module 210 communicates the notification from the data mesh system 150 to the user device 2606. At operation 2630, the user device 2606 may receive the notification and present the notification to the user. Subsequently, the user may select an option to make the user purchase. The user device 2606 may communicate an indication of the user selection to make the user purchase at the operation 2640. At operation 2650, the data mesh system 150 may receive the user selection to make the user purchase. Finally, the order module 1430 may perform the user purchase in response to receiving the user selection to make the user purchase at operation 2660.
[00258] FIG. 27 depicts an example user interface 2700 to facilitate the purchase, according to some example embodiments. It will be noted that alternate presentations of the displays of FIG. 27 may include additional information, graphics, options, and so forth; other presentations may include less information, or may provide abridged information for easy use by the user. The notification 2710 may be a text message, such as a Short Message Service (SMS) message, a Multimedia Messaging Service (MMS) message, an Enhanced Messaging Service (EMS) message, or another messaging modality, which may be provided to notify the user of the user purchase including the order parameters. In other example embodiments, the notification 2710 may be a push notification or similar type of notification. Some notifications may be interactive, enabling the user to make a selection through the SMS system, a mobile application, or another method. For instance, the user may interact with the notification 2710 using user interface element 2720.
[00259] To help illustrate the concepts described above, FIGS.
28 and 29 illustrate non-limiting examples of identifying the commerce item and facilitating the user purchase associated with the commerce item, according to some example embodiments. Referring now to FIG. 28, a scene 2800 depicts a living room attached to an open kitchen. In the example of FIG. 28, the scene 2800 includes a media entertainment device 2810, a smart television (TV) 2820, a lamp 2830, a mobile computer 2840, a mobile device 2850, a user 2860, a smart refrigerator 2870, and a kitchen display 2880. Each of the devices 2810-2850, 2870, and 2880 may be an attribute source coupled to a network (e.g., the network 104) and operable to communicate with the data mesh system 150. In various example embodiments, the user 2860 is carrying a smart device (e.g., a mobile device, a wearable device, a near field communication (NFC) enabled smart ring) on their person that may provide real-time data corresponding to the user 2860. For instance, the user 2860 may be carrying a mobile device that may provide real-time location data (e.g., as determined by a GPS component, beacon location detection, or other location services). In this way, the analysis module 1420 tracks, monitors, or otherwise observes the location of the user 2860 via a particular device the user is wearing, or the location of the user 2860 may be derived from various real-time data associated with the user's location included in the attribute data (e.g., BLUETOOTH® handshakes between a device the user is wearing and another device with a known or fixed location).
[00260] In an example embodiment, the lamp 2830 is a smart lamp operable to communicate various operating data to the data mesh system 150, or is connected to a smart outlet operable to monitor the functionality of the lamp 2830. In this example embodiment, the item module 1410 extracts demand indications from portions of the attribute data corresponding to the lamp 2830. For example, the demand indications may indicate use of the lamp 2830 in a particular way (e.g., the user 2860 may use a low brightness setting on the lamp 2830) or that the light bulb of the lamp 2830 has burned out. The analysis module 1420 identifies the commerce item as a light bulb that needs to be replaced based on the demand indications (e.g., detected via sensors of the lamp 2830 or derived via data from a smart outlet, such as reduced power consumption indicating a burnt-out light bulb). Subsequently, the order module 1430 may notify the user of the burnt-out light bulb with an option to reorder a particular light bulb based on the user characteristics (e.g., purchase history of the user). In some instances, the order module 1430 may automatically reorder the light bulb without notifying the user. In various example embodiments, the order module 1430 automatically performs the user purchase based on the purchase criteria (e.g., for this particular category of goods, simply place an automatic order).
[00261] In another example embodiment, the smart refrigerator 2870 communicates inventory data to the data mesh system 150. For example, the smart refrigerator 2870 may communicate food supply data. In this example embodiment, the item module 1410 extracts demand indications from the food supply data. Subsequently, the analysis module 1420 identifies the commerce item based on the demand indications. For instance, the analysis module 1420 may identify milk as the commerce item based on a low inventory level of milk.
The order module 1430 determines a quantity of milk to order based on the user characteristics (e.g., historical purchase data for milk during the current season of the year). The order module 1430 may then generate a notification that includes an option to purchase milk. The order module 1430 causes presentation of the notification based on the user characteristics. For instance, the real-time data included in the attribute data may indicate the user 2860 is currently in the kitchen, which may be a good time to provide the user 2860 the option to reorder milk (the reasoning being that the user 2860 may be able to inspect the food supply first). In a further instance, the order module 1430 may determine that the status of the mobile device 2850 is inactive (e.g., turned off, or not in use based on a lack of movement detected by device accelerometers). In this scenario, the order module 1430 may cause presentation of the notification to the user on another device, such as the display 2880.
[00262] FIG. 29 illustrates an example of identifying an item and facilitating a purchase associated with the identified item, according to some example embodiments. The scene 2900 depicts a city including the user 2910 driving in a car. In this example, the item module 1410 extracts demand indications such as a location of the user 2910 or a route of the user 2910 that may be indicative of a destination 2950 and, thus, the commerce item. Continuing with this example, the analysis module 1420 determines the destination 2950 of the user 2910 from the demand indications or based on, for example, a route 2930 the user 2910 is taking, a time of day, and a day of the year.
In other words, the analysis module 1420 may determine the destination 2950 of the user 2910 based on the demand indications and the user characteristics or real-time context data included in the attribute data (e.g., location as determined by a GPS component of a mobile device). In some example embodiments, the analysis module 1420 or the characteristic module 240 determines the real-time location of the user 2910 based on BLUETOOTH® or other close-range communication detections within a radius such as radius 2920. For instance, the analysis module 1420 determines that the user 2910 may be at the destination 2950 if the user 2910 is within the radius 2940 of the destination 2950. In this scenario, the destination 2950 may be a coffee shop and the commerce item may be a cup of coffee. In some example embodiments, the order module 1430 automatically places an order for the cup of coffee, or presents a notification with the option to place the order for the coffee, while the user 2910 may be en route. In further example embodiments, the order module 1430 determines the order parameters based on the user characteristics, such as past orders for coffee included in a purchase history of the user 2910. In still further example embodiments, the presentation module 210 determines the presentation parameters based on the user 2910 being in the car (e.g., present an audio alert for the option to place the order and receive a vocal command from the user 2910 to place the order).
[00263] Turning now to another example embodiment, FIG. 30 is a block diagram of the augmentation system 280, which may provide functionality to authenticate the identity of the user, identify the user activity, and augment the user activity. The augmentation system 280 may include an authentication module 3010, an activity module 3020, and a settings module 3030.
[00264] The authentication module 3010 in the augmentation system 280 may provide functionality to facilitate authenticating a user's identity. For example, the authentication module 3010 may identify a portion of the attribute data indicative of the identity of the user. Subsequently, the authentication module 3010 may authenticate the identity of the user by analyzing the identified portion of the attribute data. In further example embodiments, the authentication module 3010 may calculate an identity likelihood metric based on real-time data included in the attribute data. The identity likelihood metric may indicate the likelihood of authenticating the identity of the user (e.g., a higher identity likelihood metric may indicate a strong probability that the user's identity may be authenticated). The authentication module 3010 may analyze a variety of portions of the attribute data using many different schemes and techniques to authenticate the identity of the user.
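One way to picture the identity likelihood metric is as a weighted combination of identity signals drawn from the real-time data. The signals, weights, and normalized formula below are illustrative assumptions; the embodiments do not prescribe a particular formula:

```python
# Illustrative sketch: combining weighted identity signals (each scored
# in [0, 1]) into an identity likelihood metric. Signal names, weights,
# and the threshold are hypothetical.
def identity_likelihood(signals: dict, weights: dict) -> float:
    total = sum(weights.values())
    if total == 0:
        return 0.0
    score = sum(weights[k] * signals.get(k, 0.0) for k in weights)
    return score / total  # normalized to [0, 1]

def authenticated(signals: dict, weights: dict, threshold: float = 0.8) -> bool:
    # Authenticate when the combined metric clears the threshold.
    return identity_likelihood(signals, weights) >= threshold

# Example: strong device and location matches, weaker gait match.
metric = identity_likelihood(
    {"device_match": 1.0, "location_match": 1.0, "gait_match": 0.5},
    {"device_match": 0.5, "location_match": 0.3, "gait_match": 0.2},
)
```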
[00265] The activity module 3020 may provide functionality associated with user activities. For example, the activity module 3020 may identify a user activity being performed by the user based on the real-time data included in the attribute data. In a further example, the activity module 3020 may facilitate augmentation of the identified user activity according to a user setting or user preference. For instance, the user setting may be associated with presenting information to the user (e.g., the user may have a desire to use a larger screen for presenting information, if available). In this instance, the activity module 3020 may augment presentation of information to the user based on the user setting.
[00266] The settings module 3030 may provide functionality to access or determine one or more user settings. For example, the settings module 3030 may determine the user setting based on the attribute data and the identified user activity. In some example embodiments, the settings module 3030 may access the user setting from a storage device (e.g., the database(s) 126). In further example embodiments, the settings module 3030 may determine the user setting based on an analysis of the user characteristics, similar users, an augmentation result from augmenting the user activity, and so forth.
[00267] FIG. 31 is a flow diagram illustrating an example method 3100 for authenticating the user and augmenting the user activity, according to some example embodiments. The operations of the method 3100 may be performed by components of the data mesh system 150 and the augmentation system 280. At operation 3110, the attribute module 230 may receive the attribute data associated with the user from a plurality of attribute sources. In various example embodiments, at least a portion of the attribute data may include real-time data or near real-time data. The term "real-time data," as used herein, is intended to include data associated with an event currently happening. For example, the real-time data may include user input data or sensor data communicated to the attribute module 230 after a delay interval (e.g., due to transmission delay or other delays, such as being temporarily stored at an intermediate device) between capturing the data and the attribute module 230 receiving the data.
[00268] As will be discussed in connection with FIGS. 52 and 53, the attribute data may be received from a broad spectrum of attribute sources (e.g., devices, sensors, servers, databases, and other sources). Additionally, the attribute module 230 may receive the attribute data via many pathways resulting from an assortment of configurations of the attribute sources, as further discussed in connection with FIGS. 51A and 51B. In an example embodiment, the attribute module 230 may receive the attribute data directly from the attribute sources. In other example embodiments, the attribute module 230 may receive the attribute data from a central device that receives attribute data from a plurality of user devices.
In still other example embodiments, various user devices may be communicatively coupled in a decentralized device-to-device mesh, and the attribute module 230 may receive the attribute data corresponding to a particular device in the mesh from any of the devices in the mesh. The attribute module 230 may receive the attribute data from the attribute sources in many other configurations, including various suitable combinations of configurations.
[00269] In various example embodiments, the attribute module 230 may store the attribute data in association with the user (e.g., indexed based on a user identifier) for subsequent analysis. The attribute module 230 may store the attribute data in a storage device such as the database(s) 126, for example. The attribute module 230 may access the stored attribute data using a variety of search or find schemes. For instance, the attribute data associated with a particular user may be accessed using a user identifier that corresponds to the particular user. It will be noted that the collective, aggregated attribute data may be referred to as a "data mesh."

[00270] At operation 3120, the authentication module 3010 may identify a portion of the real-time data indicative of an identity of the user. The attribute data including the real-time data may comprise a vast amount of data associated with the user. All or various portions (e.g., segments or pieces) of the real-time data may be indicative of the identity of the user. The authentication module 3010 may identify, extract, parse, or otherwise obtain data from the real-time data that is pertinent, relevant, or otherwise useful in authenticating the identity of the user.
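The storing-and-indexing scheme described in paragraph [00269] can be sketched as a minimal in-memory store keyed by user identifier. The specification does not define any code; the `AttributeStore` class, method names, and record fields below are illustrative assumptions only.

```python
# Illustrative sketch: an in-memory "data mesh" store that indexes attribute
# records by a user identifier for subsequent analysis. A production system
# would instead use a persistent storage device such as the database(s) 126.
from collections import defaultdict

class AttributeStore:
    def __init__(self):
        # user_id -> list of attribute records (each a plain dict)
        self._mesh = defaultdict(list)

    def store(self, user_id, record):
        """Store an attribute record in association with the user."""
        self._mesh[user_id].append(record)

    def access(self, user_id):
        """Access the aggregated attribute data for a particular user."""
        return list(self._mesh[user_id])

store = AttributeStore()
store.store("user-1", {"source": "smart_watch", "heart_rate": 72})
store.store("user-1", {"source": "mobile", "location": (47.6, -122.3)})
print(len(store.access("user-1")))  # → 2
```

Accessing by user identifier mirrors the lookup scheme the paragraph describes; any real deployment would add persistence and access control.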
[00271] In an example embodiment, various devices that provide the real-time data to the attribute module 230 may correspond to the user. For instance, one or more user devices (e.g., mobile device, wearable device) may provide at least a portion of the real-time data. The user devices and the real-time data provided by the user devices may be identified via a device identifier such as an Internet Protocol (IP) address, a Media Access Control (MAC) address, other unique identifiers, an International Mobile Station Equipment Identity (IMEI), a
Mobile Equipment Identifier (MEID), and so forth. In various example embodiments, the authentication module 3010 may identify portions of the real-time data indicative of the identity of the user by matching device identifiers corresponding to the user with respective device identifiers associated with the real-time data. In a specific example, location data (e.g., as determined by a GPS component of a mobile device) may be identified by the authentication module 3010 as indicative of the identity of the user if the location data originated from a device having a device identifier corresponding to the user (e.g., the location data originated from the user's mobile device). To be clear, data that originates from a device corresponding to the user may merely be indicative of the identity of the user rather than conclusively identify the user, since another person may be operating the device. The identity of the user may be authenticated with respect to the real-time data in subsequent operations discussed below.
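The device-identifier matching just described can be sketched as a simple filter over the incoming records. This is an assumed illustration; the record shape and field names are not drawn from the specification.

```python
# Illustrative sketch: flag portions of real-time data as indicative of the
# user's identity when the originating device identifier (e.g., a MAC address
# or IMEI) matches a device identifier registered to the user.
def identify_user_portions(real_time_data, user_device_ids):
    """Return only the records whose device identifier belongs to the user."""
    return [rec for rec in real_time_data
            if rec.get("device_id") in user_device_ids]

user_devices = {"AA:BB:CC:01", "IMEI-3566"}
stream = [
    {"device_id": "AA:BB:CC:01", "gps": (40.7, -74.0)},   # user's mobile
    {"device_id": "FF:00:11:99", "gps": (34.0, -118.2)},  # unknown device
]
matched = identify_user_portions(stream, user_devices)
print(len(matched))  # → 1
```

As the paragraph cautions, a match only makes the data indicative of identity; authentication still happens in later operations.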
[00272] In various example embodiments, the attribute data and the real-time data may include sensor data. In an example embodiment, the authentication module 3010 may identify portions of the sensor data that may be indicative of the identity of the user. For instance, the sensor data may include biometric data such as a fingerprint scan, vocal sample, electroencephalogram, or retinal scan (refer to FIG. 53 for additional sensor data). In this instance, the authentication module 3010 may identify the biometric data as being indicative of the identity of the user (e.g., matching fingerprints included in the sensor data with fingerprints of the user or matching another sensor signature with a sensor signature of the user). In these particular example embodiments and the following example embodiments, the real-time data need not necessarily have originated from a device corresponding to the user.
[00273] In further example embodiments, the authentication module 3010 may identify portions of the attribute data indicative of the identity of the user based on various analyses or patterns. For example, a particular device may provide location data (e.g., as determined by a GPS component of a mobile device). The authentication module 3010 may determine that the location data may be indicative of the user identity based on past location data of the user. In a specific example, if the user has, in the past, exhibited a particular pattern of travel or has frequently visited a particular location (e.g., a particular route home, or spent a particular amount of time in certain locations), then the authentication module 3010 may identify the real-time data corresponding to the particular device providing the location data as indicative of the user identity.
[00274] In still further example embodiments, the authentication module 3010 may employ many other analyses to identify the attribute data indicative of the identity of the user. For example, the user may be a member of various websites (e.g., e-commerce websites, social websites). If the user logs into a particular website using a particular device, the authentication module 3010 may identify the particular device and the attribute data received from the particular device as indicative of the user identity.

[00275] At operation 3130, the authentication module 3010 may authenticate the identity of the user with respect to the real-time data by analyzing the identified portion of the real-time data and the attribute data. In some example embodiments, authenticating the identity of the user with respect to the real-time data may establish that the real-time data resulted from an action of the user. For instance, if the authentication module 3010 authenticates the identity of the user with respect to location data included in the real-time data, the location data may indicate the current location of the user. The authentication module 3010 may analyze many different portions of the attribute data using a variety of analysis schemes and techniques to authenticate the identity of the user. The following discussion merely provides non-limiting examples of the authentication module 3010 authenticating the identity of the user based on the attribute data.
[00276] In an example embodiment, the authentication module 3010 may identify a portable device (e.g., mobile device, wearable device) corresponding to the user from the real-time data, and use the identified portable device as a basis for authenticating the identity of the user with respect to the real-time data. For example, the authentication module 3010 may calculate the identity likelihood metric based on the identified portable device. The identity likelihood metric may indicate the likelihood that the identified portion of the attribute data identifies the user. In a specific example, the user may be operating a computer, carrying a mobile device, and wearing a smart watch. In this example, the attribute module 230 may receive the real-time data from each of the devices. If the authentication module 3010 determines that the mobile device and the smart watch correspond to the user (e.g., by matching respective device identifiers with device identifiers corresponding to the user), then the authentication module 3010 may authenticate the identity of the user with respect to the real-time data received from those devices, the reasoning being that if a person is determined to be carrying one or more devices belonging to the user, the person may be the user (e.g., the identified portable devices implicate or suggest the person may be the user). The more devices of the user that the person may be carrying, the stronger the basis may be that the person is the user.
[00277] In continuing with the above example, the authentication module 3010 may authenticate the identity of the user with respect to the real-time data corresponding to the computer based on the authentication with respect to the mobile device and wearable device. The authentication module 3010 may perform this authentication, for example, based on the location of the computer relative to the location of the portable devices of the user. For example, if the location of the computer is the same or within a short distance (e.g., an arm's length) of the location of the portable devices of the user, the authentication module 3010 may infer that the user is using the computer. The location of the computer may be established based on sensor data included in the real-time data such as near field, BLUETOOTH®, or other interactions between the portable devices and the computer. In other words, the location of a particular portable device may be known based on a GPS component of the particular portable device, and the location of a device in communication with the particular portable device may be inferred based on short-range communications that operate over short distances. Thus, in this example, the authentication module 3010 may authenticate the identity of the user with respect to the real-time data received from the computer. The above is merely a non-limiting example, and many other techniques may be employed by the authentication module 3010 to authenticate the identity of the user with respect to the real-time data of various devices.

[00278] In further example embodiments, the authentication module 3010 may use other indications of the identity of the user to authenticate the identity of the user with respect to the real-time data. For example, the authentication module 3010 may use sensor data, at least in part, to authenticate the identity of the user.
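The two reasoning steps of paragraphs [00276]-[00277] — confidence that grows with each matched portable device, plus a proximity inference for a co-located computer — can be sketched as follows. The scoring formula, the distance limit, and all names are assumptions made for illustration, not values from the specification.

```python
# Illustrative sketch: the more of the user's portable devices a person is
# carrying, the stronger the basis that the person is the user; a nearby
# computer can then be inferred to be in use by that same person.
import math

def carried_device_score(observed_ids, user_device_ids):
    """Crude confidence that grows with each matched portable device."""
    matches = len(set(observed_ids) & set(user_device_ids))
    return 1.0 - 0.5 ** matches  # 0 devices -> 0.0; more devices -> nearer 1.0

def within_arms_length(loc_a, loc_b, limit_m=1.0):
    """Treat two co-located devices as usable by the same person."""
    return math.dist(loc_a, loc_b) <= limit_m

score = carried_device_score(["watch-1", "phone-1"], {"watch-1", "phone-1"})
print(score)  # → 0.75
print(within_arms_length((0.0, 0.0), (0.5, 0.5)))  # → True
```

The halving formula is one arbitrary way to make each additional matched device strengthen, but never saturate, the basis for authentication.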
For instance, the sensor data may include biometric data that the authentication module 3010 may use to authenticate the identity of the user. In a specific example, the biometric data may include biometric identification data such as fingerprint scans, vocal samples, retinal scans, facial scans, or electroencephalogram data (see FIG. 53 for additional biometric identification data). The authentication module 3010 may match, correlate, or otherwise determine that the biometric data corresponds to the user to authenticate the identity of the user with respect to the real-time data.
[00279] In still further example embodiments, the authentication module 3010 may use location data to authenticate the identity of the user with respect to the real-time data. For example, the location data (e.g., as determined by a GPS component of a mobile device) may indicate a location pattern that the authentication module 3010 may use to authenticate the identity of the user. In this example, the location pattern may comprise being at a particular location at a particular time, or following a particular route at a particular time. In a specific example, the location data may indicate a location that may be the home of the user. Given that the location may be the home of the user, it may be probable that the real-time data provided by the mobile device corresponds to the user. Thus, in some example embodiments, the authentication module 3010 may authenticate the identity of the user with respect to the real-time data based on a location corresponding to the real-time data that may be indicative of the user (e.g., a location that may be typically access-restricted and for which the user has access, such as a home or office).
[00280] In continuing with the discussion of the operation 3130, FIG. 32 is a flow diagram illustrating further example operations of the method 3100 of FIG. 31, according to some example embodiments. As described above, subsequent to the operation 3120, at the operation 3130, the authentication module 3010 may authenticate the identity of the user by analyzing the identified portion of the real-time data. In addition, at operation 3210, the authentication module 3010 may calculate the identity likelihood metric based on the identified portion of the attribute data. The identity likelihood metric may indicate the likelihood that the identified portion of the attribute data identifies the user (e.g., the probability that the identified portion of the real-time data identifies the user).
[00281] The authentication module 3010 may calculate the identity likelihood metric using a variety of schemes and techniques. In an example embodiment, the authentication module 3010 may weight various portions of the real-time data indicative of the identity of the user. For example, the authentication module 3010 may more heavily weight the real-time data that strongly indicates the identity of the user (e.g., a fingerprint scan that matches a fingerprint scan of the user). Conversely, the authentication module 3010 may weight less heavily the real-time data that does not strongly indicate the identity of the user (e.g., real-time data indicating a user device in a particular location at a particular time may be indicative of the identity of the user, but may not implicate the identity of the user as strongly as biometric identification data). In some example embodiments, the authentication module 3010 may use a combination of the real-time data to calculate the identity likelihood metric.
[00282] Subsequent to calculating the identity likelihood metric, at decision 3220, the authentication module 3010 may determine whether the identity likelihood metric exceeds an authentication threshold. In an example embodiment, when the identity likelihood metric exceeds the authentication threshold, the authentication module 3010 may authenticate the identity of the user. Conversely, if the identity likelihood metric does not exceed the threshold, the authentication module 3010 may not authenticate the identity of the user, and no further operations may be performed in the method 3100.

[00283] At operation 3230, the authentication module 3010 may authenticate the identity of the user. As discussed above, the authentication module 3010 may use the identity likelihood metric exceeding the authentication threshold as a factor in authenticating the identity of the user with respect to the real-time data. The authentication module 3010 may use the identity likelihood metric alone or in conjunction with other factors to authenticate the identity of the user. Once the identity of the user is authenticated with respect to the real-time data, subsequent operations of the method 3100 may be performed.
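The weighting scheme of paragraph [00281] and the threshold decision of [00282] can be sketched together. The specification gives no weights or threshold values; every number and signal name below is an assumption chosen for illustration.

```python
# Illustrative sketch: combine weighted identity signals into an identity
# likelihood metric and authenticate only when it exceeds a threshold.
SIGNAL_WEIGHTS = {
    "fingerprint_match": 0.9,       # strongly indicates identity
    "device_location_match": 0.3,   # weaker indication than biometrics
}

def identity_likelihood(signals):
    """Weighted sum of observed identity signals, clamped to [0, 1]."""
    score = sum(SIGNAL_WEIGHTS.get(name, 0.0) for name in signals)
    return min(score, 1.0)

def authenticate(signals, threshold=0.8):
    """Authenticate only if the metric exceeds the authentication threshold."""
    return identity_likelihood(signals) > threshold

print(authenticate({"fingerprint_match", "device_location_match"}))  # → True
print(authenticate({"device_location_match"}))                       # → False
```

A weaker signal alone (location only) falls below the threshold, while combining it with a biometric match authenticates the user, matching the relative weighting the description prescribes.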
[00284] In further discussing the operation 3130, FIG. 33 is a flow diagram illustrating a further embodiment for authenticating the identity of the user, according to some example embodiments. Subsequent to the operation 3120, at the operation 3130, the authentication module 3010 may authenticate the identity of the user by analyzing the identified portion of the attribute data. In addition, at operation 3310, the authentication module 3010 may derive, extract, or otherwise obtain past identification indications from past attribute data. For example, if the past attribute data includes location data, the authentication module 3010 may extract favorite or frequent locations corresponding to the user.
[00285] At operation 3320, the authentication module 3010 may derive, extract, or otherwise obtain real-time identification indications from the real-time data. For instance, the real-time data may include location data. In some example embodiments, the real-time identification indications may be derived. For example, the authentication module 3010 may derive location information based on short-range communications with devices of a known or fixed location.

[00286] At operation 3330, the authentication module 3010 may calculate the identity likelihood metric by correlating, matching, or otherwise comparing the real-time identification indications with the past identification indications.
For example, if the real-time data indicates a particular location, the authentication module 3010 may match the particular location with a frequent 25 location of the user to authenticate the identity of the user. Although the discussion of FIG. 33 is mostly directed to location data, the authentication module 3010 may, in a similar manner, use many other types of data included in the attribute data to calculate the identity likelihood metric.
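The correlation of operations 3310-3330 — matching a real-time location fix against locations extracted from past attribute data — can be sketched as follows. The proximity radius, coordinates, and function names are illustrative assumptions.

```python
# Illustrative sketch: compare a real-time location indication with the
# user's past frequent locations (operations 3310-3330).
import math

def near(a, b, radius=0.01):
    """Rough proximity test on (lat, lon) pairs; the radius is an assumption."""
    return math.dist(a, b) <= radius

def location_indication(real_time_fix, frequent_locations):
    """1.0 if the fix matches a frequent location of the user, else 0.0."""
    return 1.0 if any(near(real_time_fix, loc) for loc in frequent_locations) else 0.0

past_favorites = [(47.6097, -122.3331), (47.6205, -122.3493)]  # e.g., home, office
print(location_indication((47.6099, -122.3330), past_favorites))  # → 1.0
print(location_indication((40.7128, -74.0060), past_favorites))   # → 0.0
```

The returned indication would then be one weighted input to the identity likelihood metric; as the text notes, other data types in the attribute data could be correlated the same way.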
[00287] Referring back to FIG. 31, at operation 3140, the activity module 3020 may identify or infer a user activity being performed by the user based on the real-time data. In other words, the activity module 3020 may identify a user objective (e.g., logging into a website) being advanced by the user based on the real-time data. The user activity may include a wide variety of activities such as, for example, operating a computer (e.g., logging into a website), jogging in the park, walking toward the refrigerator, and streaming video or other media content. In an example embodiment, the activity module 3020 may identify the user activity based on sensor data and status data received from one or more user devices. The status data may indicate activity associated with a particular device. For example, the user devices may include a mobile device that provides location data and a variety of other user devices that may provide the status data (e.g., a smart TV that indicates current operating status such as streaming a particular piece of media). In this example, the activity module 3020 may infer the user activity by analyzing the location data in conjunction with the status data (e.g., the user may be proximate to the smart TV based on the location data, and the smart TV may indicate that it is streaming a video).
[00288] In another example, the activity module 3020 may infer, extract, or derive the status data for a particular device based on an analysis of sensor data corresponding to the particular device. For example, a mobile device may be equipped with accelerometers that measure motion and provide motion data. If the motion data indicates that the device is not moving, the activity module 3020 may infer that a person may not be carrying the particular device. In this scenario, the activity module 3020 may not infer the user activity based on a device that is not currently being carried by the user. The above examples are merely non-limiting examples of the activity module 3020 identifying or inferring the user activity based on the real-time data. The activity module 3020 may use many other portions of the real-time data in a variety of schemes to identify or infer the user activity.
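The activity inference of paragraphs [00287]-[00288] — combining device status data with location data, while discarding devices whose motion data suggests they are not being carried — can be sketched as below. The record fields and the activity label are assumptions for illustration.

```python
# Illustrative sketch of operation 3140: infer the user activity from status
# data of nearby devices, ignoring devices whose accelerometers report no
# motion (i.e., devices likely not being carried by the user).
def infer_activity(records):
    carried = [r for r in records if r.get("moving", True)]
    user_near_tv = any(r.get("near") == "smart_tv" for r in carried)
    tv_streaming = any(r.get("status") == "streaming" for r in records)
    if user_near_tv and tv_streaming:
        return "watching_streamed_video"
    return "unknown"

records = [
    {"device": "mobile", "moving": True, "near": "smart_tv"},
    {"device": "tablet", "moving": False, "near": "smart_tv"},  # not carried: ignored
    {"device": "smart_tv", "status": "streaming"},
]
print(infer_activity(records))  # → watching_streamed_video
```

Note how the stationary tablet is excluded before inferring proximity, mirroring the text's caution against inferring activity from a device the user is not carrying.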
[00289] At operation 3150, the activity module 3020 may augment, adapt, or otherwise modify the user activity according to a user setting or user preference. In other words, the activity module 3020 may augment an environment of the user based on a user preference to facilitate the user's advancement towards the user objective. The environment of the user is intended to include, for example, user devices within the vicinity of the user. For example, if the user activity includes an authorization task (e.g., logging into a website), the activity module 3020 may augment the user activity by adjusting a security level of the authorization task. In some example embodiments, the activity module 3020 may adjust the security level of the authorization task based on the identity likelihood metric. For example, if the identity likelihood metric indicates strongly that the identity of the user is authenticated, then the security level may be reduced more than if the identity likelihood metric did not strongly indicate that the identity of the user is authenticated. In some example embodiments, adjusting the security level of the authorization task may include automatically performing the authorization task on behalf of the user. For example, if the authentication module 3010 has authenticated the identity of the user and the user activity includes accessing a particular website, then the activity module 3020 may automatically log the user into the particular website.
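The security-level adjustment just described can be sketched as a mapping from the identity likelihood metric to a required credential level. The tier names and thresholds are assumptions; the specification only states that a stronger metric permits a greater reduction in security, down to performing the task automatically.

```python
# Illustrative sketch of operation 3150 for an authorization task: choose a
# security level from the identity likelihood metric, performing the login
# automatically when identity is strongly authenticated.
def security_level(likelihood):
    if likelihood >= 0.9:
        return "auto_login"        # perform the task on the user's behalf
    if likelihood >= 0.7:
        return "password_only"     # reduced credentials
    return "full_credentials"      # e.g., password plus a second factor

print(security_level(0.95))  # → auto_login
print(security_level(0.75))  # → password_only
print(security_level(0.40))  # → full_credentials
```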
[00290] In a further specific example, the identified user activity may comprise making a payment (e.g., an electronic payment to an e-commerce website corresponding to an item listing listed on the e-commerce website, or an electronic payment to a merchant at a physical store). The activity module 3020 may augment the user activity associated with making a particular payment by facilitating the payment between the user and the payee. For instance, based on the authentication of the identity of the user, the user may not need to provide a security credential, or may provide fewer security credentials, to make the payment.
[00291] To assist in illustrating the above discussion, FIG. 34 depicts communication between a device of the user and the data mesh system 150, according to some example embodiments. In the illustration of FIG. 34, a user 3410 may be wearing one or more smart devices such as a smart watch 3420.
The smart watch 3420 may be communicatively coupled to the network 104 via various modalities. For example, the smart watch 3420 may be communicatively coupled to a network interface 3440, which, in turn, is communicatively coupled to the network 104. For instance, the smart watch 3420 may send signals 3430 that are received at the network interface 3440. In another example, the smart watch 3420 may be communicatively coupled to the network 104 without the network interface 3440. In addition, the networked system 102, including the data mesh system 150 and the augmentation system 280, may be communicatively coupled to the network 104.

[00292] Thus, the smart watch 3420 of the user 3410 may be communicatively coupled to the data mesh system 150 including the augmentation system 280. The data mesh system 150, including the augmentation system 280, may receive or access the attribute data corresponding to the smart watch 3420 via the network 104. Similarly, the data mesh system 150, including the augmentation system 280, may communicate or exchange data with the smart watch 3420 to facilitate augmentation of the user activity, such as communicating instructions to present a user interface. Although the example of FIG. 34 depicts an example smart watch 3420, it will be appreciated that a wide variety of other devices may similarly be configured to interact with the data mesh system 150.

[00293] FIGS. 35-38 are flow diagrams illustrating further example operations of the method 3100 of FIG. 31, according to some example embodiments. Subsequent to the operation 3140, at the operation 3150, the activity module 3020 may augment the user activity according to the user setting. Each of the flow diagrams of FIGS. 35-38 illustrates additional operations of the operation 3150.
The additional operations of the operation 3150 include various example embodiments of augmenting the user activity according to the user setting. The following discussion merely describes non-limiting examples, and many other schemes and techniques may be employed by the augmentation system 280 to augment the user activity using the user setting.
[00294] In the flow diagram of FIG. 35, subsequent to the operation 3140, the settings module 3030 may determine the user setting based on the attribute data and the user activity at operation 3510. For example, the settings module 3030 may store a plurality of user settings in a storage device such as the database(s) 126. Subsequent to determining the user activity, the settings module 3030 may determine the user setting associated with or pertaining to the user activity. For instance, the user activity may include the user streaming a movie to a particular user device. Based on the activity module 3020 identifying the activity of streaming the movie, the settings module 3030 may determine the user setting associated with augmenting the user activity of streaming the movie, such as, for example, automatically pausing the streaming when the user leaves a vicinity of the particular user device presenting the movie to the user.
[00295] At operation 3520, the activity module 3020 may augment the user activity according to the determined user setting. In continuing with the example above, the user activity may include streaming the movie. The activity module 3020 may augment the user activity according to the determined user setting.
For example, the activity module 3020 may automatically pause or otherwise halt the movie based on a trigger (e.g., the user leaving the vicinity or answering a phone call), present the streaming movie on another display based on the user setting and a location of the user, and so on. Although the above discussion is directed to streaming the movie, the activity module 3020 may augment user activities comprising many other activities, based on many different types or kinds of user settings.
[00296] In the flow diagram of FIG. 36, subsequent to the operation 3140, the characteristic module 240 may infer or directly measure user characteristics based on an analysis of at least a portion of the attribute data at operation 3610. As discussed above in connection with operation 610, the characteristic module 240 can infer a variety of user characteristics from the attribute data using a variety of data. It will be understood that the discussion of operation 610 regarding the characteristic module 240 applies equally to operation 3610.

[00297] At operation 3620, the settings module 3030 may determine the user setting based on the inferred user characteristics and the user activity. For example, the inferred user characteristics may indicate the user has an affinity for viewing user interfaces on the largest screen available. In this example, the settings module 3030 may determine that the user setting includes presenting user interfaces on the largest screen available.
[00298] In further example embodiments, the settings module 3030 may identify similar users to the user based on the attribute data or the user characteristics. For instance, the settings module 3030 may identify users associated with the same or similar demographic data as that of the user. In an example embodiment, the settings module 3030 may determine the user setting based on attribute data or characteristics corresponding to the similar users. For example, the settings module 3030 may access the user characteristics corresponding to other users and correlate, match, or otherwise compare the user characteristics of the user with the user characteristics of other users to identify the similar users.
[00299] At operation 3630, the activity module 3020 may augment the user activity according to the determined user setting. For example, if the user is viewing a particular user interface on a mobile device and the user is within a distance of a larger screen, such as a computer or smart TV, the activity module 3020 may augment the user activity by presenting the particular user interface on the larger screen.

[00300] In the flow diagram of FIG. 37, subsequent to the operation 3140, the activity module 3020 may determine that the user activity includes a presentation of a user interface to the user at operation 3710. For example, the presentation of the user interface may include the user viewing a movie, using a website, or reading an email. The activity module 3020 may determine that the user activity includes the presentation of the user interface based on the real-time data. For instance, the status data of a particular device of the user may indicate that the particular device may be presenting the user interface to the user.
[00301] At operation 3720, the activity module 3020 may identify presentation devices available to the user, based on the attribute data, that are capable of presenting the user interface to the user. For instance, the user may be in a room that includes several user devices such as a smart TV, a laptop computer, and a mobile device. The activity module 3020 may identify these devices based on the real-time data. For example, if the smart TV is active and connected to a network, the activity module 3020 may query the smart TV to determine whether it may be capable of presenting the user interface.
[00302] At operation 3730, the activity module 3020 may determine an alternate presentation device from among the identified presentation devices based on the user setting. The activity module 3020 may determine the alternate presentation device based on a number of factors. For example, the user setting may indicate the user has an affinity for viewing a larger screen when available. Based on this user setting, the activity module 3020 may identify a presentation device with a larger display. In another example, the activity module 3020 may determine that the alternate presentation device should be within a close vicinity of the user. For example, a particular presentation device that is out of view of the user may not be the best choice for the alternate presentation device. The activity module 3020 may make this determination based on location data included in the real-time data. In still another example, the activity module 3020 may identify a presentation device that is portable. For example, the activity module 3020 may determine that the user is viewing a movie and that the user is leaving the vicinity, and may determine that a portable device to continue playing the movie is a desirable choice for the alternate presentation device.

[00303] At operation 3740, the activity module 3020 may cause presentation of the user interface to the user with the alternate presentation device. In a specific example, the user may be viewing a live sporting event on a smart TV. When the user leaves the vicinity of the smart TV, the activity module 3020 may cause presentation of the live sporting event on the alternate presentation device, such as another smart TV that is viewable to the user or a portable device of the user, to allow the user to continue viewing the live sporting event despite being out of view of the initial presentation device.
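The selection logic of operations 3720-3740 — filter to capable devices near the user, then prefer the largest screen per the user setting — can be sketched as follows. The device records, distance limit, and field names are assumptions for illustration only.

```python
# Illustrative sketch of operations 3720-3740: among capable presentation
# devices within a vicinity of the user, prefer the largest screen, per a
# user setting favoring larger displays.
import math

def pick_alternate_display(devices, user_loc, max_distance=5.0):
    candidates = [
        d for d in devices
        if d["capable"] and math.dist(d["location"], user_loc) <= max_distance
    ]
    if not candidates:
        return None
    return max(candidates, key=lambda d: d["screen_inches"])["name"]

devices = [
    {"name": "mobile", "capable": True, "screen_inches": 6, "location": (0.0, 0.0)},
    {"name": "smart_tv", "capable": True, "screen_inches": 55, "location": (3.0, 0.0)},
    {"name": "bedroom_tv", "capable": True, "screen_inches": 40, "location": (12.0, 0.0)},
]
print(pick_alternate_display(devices, (0.0, 0.0)))  # → smart_tv
```

The out-of-range bedroom TV is excluded even though it has a large screen, mirroring the text's point that a device out of view is a poor choice for the alternate presentation device.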
[00304] In the flow diagram of FIG. 38, subsequent to the operation 3140, the activity module 3020 or the characteristic module 240 may determine a current location of the user based on the real-time data at operation 3810. For example, the real-time data may include location data as determined by a GPS component of a mobile device, near-field beacon detections, and other location services.
[00305] At operation 3820, the activity module 3020 or the characteristic module 240 may access device location data included in the attribute data that includes the real-time data. Similar to the operation 3810, the activity module 3020 may access, receive, or otherwise obtain the device location data based on GPS, near-field beacon detections, and other location services. Although FIG. 38 illustrates performing the operation 3810 prior to operation 3820, in alternative example embodiments, the operation 3810 may be performed concurrently with or after the operation 3820. For instance, the current location of the user and the device location data may be received, accessed, retrieved, derived, or otherwise obtained by the activity module 3020 at the same time or in any order, and the subsequent operations illustrated in FIG. 38 performed.
[00306] At operation 3830, the activity module 3020 may identify user devices within an operation distance of the user based on the current location of the user and the device location data. The operation distance may be configurable or dynamically determined. In some example embodiments, the operation distance may vary from device to device. For example, the operation distance corresponding to a smart TV may be a reasonable distance such that the user may view the smart TV. In other examples, the operation distance corresponding to a mobile device may be a reasonable distance (e.g., arm's length) such that the user may touch the mobile device.
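A minimal sketch of operation 3830 follows, assuming per-device-type operation distances in meters and planar coordinates; both are invented for the example and not part of the disclosure.

```python
import math

# Illustrative operation distances per device type (assumed values, in meters).
OPERATION_DISTANCE = {"smart_tv": 5.0, "mobile": 0.8}

def devices_in_range(user_pos, device_locations):
    """Return devices whose distance to the user is within that
    device type's operation distance."""
    in_range = []
    for dev, (dtype, pos) in device_locations.items():
        dist = math.dist(user_pos, pos)
        if dist <= OPERATION_DISTANCE[dtype]:
            in_range.append(dev)
    return in_range

locations = {
    "living-room TV": ("smart_tv", (3.0, 0.0)),
    "phone": ("mobile", (0.5, 0.0)),
    "bedroom TV": ("smart_tv", (12.0, 4.0)),
}
print(devices_in_range((0.0, 0.0), locations))  # ['living-room TV', 'phone']
```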
[00307] At operation 3840, the activity module 3020 may augment the operation of the identified user devices according to the user setting. In some example embodiments, the activity module 3020 may augment the operation of the identified user device according to the user setting and the user activity. For example, if the user activity includes the user moving into a smart kitchen, the activity module 3020 may augment the operation of various user devices according to the user setting based on moving into the smart kitchen. For instance, relevant notifications may be pushed to a mobile device of the user based on the user moving into or being in the smart kitchen (e.g., a notification regarding the status of a smart kitchen appliance). In another instance, the activity module 3020 may cause the smart appliance of the smart kitchen to perform a task on behalf of the user automatically based on the user moving into the smart kitchen (e.g., automatically brew a cup of coffee).
[00308] To help illustrate the concepts described above, FIG. 39 illustrates non-limiting examples of augmenting the user activity, according to some example embodiments. A scene 3900 depicts a living room attached to an open kitchen. The scene 3900 may include a media entertainment device 3910, a smart TV 3920, a lamp 3930, a mobile computer 3940, a mobile device 3950, a user 3960, a smart refrigerator 3970, and a kitchen display 3980. Each of the devices 3910-3950, 3970, and 3980 may be attribute sources coupled to a network (e.g., the network 104) and operable to communicate with the data mesh system 150. In various example embodiments, the real-time data may include location data corresponding to the user. For instance, the user 3960 may be carrying a mobile device or another smart device (e.g., smart watch, NFC-enabled smart ring) that may provide real-time location data (e.g., as determined by a GPS component, beacon location detection, or other location services). In this way, the location of the user 3960 may be tracked, monitored, or observed via a particular device the user 3960 is wearing, or the location of the user 3960 may be derived from various real-time data associated with the user's location included in the attribute data (e.g., BLUETOOTH® handshakes between a device the user 3960 is wearing and another device with a known or fixed location). The activity module 3020 may infer the user activity based on the real-time data and augment the user activity based on the user setting.
[00309] In an example embodiment, the activity module 3020 may determine that the user 3960 may be streaming media content to the smart TV 3920 and may be moving away from the smart TV 3920 towards the kitchen. In this example, the activity module 3020 may identify the mobile computer 3940 and the display 3980 as alternate presentation devices. The activity module 3020 may further determine that the kitchen display 3980 may be a desirable alternate presentation device as it may be within viewing distance of the user 3960. The activity module 3020 may then augment the activity of the user 3960 by presenting the streaming media content on the kitchen display 3980 and halting the presentation to the smart TV 3920. In further example embodiments, the activity module 3020 may further determine that the user 3960 has opened the smart refrigerator 3970 and may pause the streaming of the media content while the user 3960 may be using the smart refrigerator 3970.
[00310] In another example embodiment, the authentication module 3010 may authenticate the identity of the user 3960 by detecting a mobile device or wearable device the user 3960 may be carrying in proximity to the kitchen display 3980 (e.g., detected from the attribute data that may include a data feed from the mobile device, the wearable device, and the kitchen display 3980).
Subsequently, the activity module 3020 may detect that the user 3960 is walking within an operating distance (e.g., a reasonable distance such as a distance that would place the user 3960 within the same room as a particular device) of a particular user device such as the kitchen display 3980. The activity module 3020 may detect the user 3960 within the operating distance of the kitchen display 3980 via a GPS component of a mobile device the user 3960 may be carrying, short range communication detections (e.g., BLUETOOTH® or near-field communication handshakes between the mobile device of the user 3960 and the kitchen display 3980), and so on.

[00311] The activity module 3020 may then present a personalized message to the user 3960 on the kitchen display 3980 that may be pertinent or relevant to the user 3960 with respect to the context or environment of the user 3960 as determined by the settings module 3030 via an analysis of the attribute data (e.g., the attribute data may include calendar information corresponding to the user 3960 or food inventory information as provided by the smart refrigerator 3970).
For instance, the personalized message may be a reminder regarding an upcoming appointment or a reminder to purchase a particular product the user 3960 may be running low on. For a particular user other than the user 3960, the activity module 3020 may present a different personalized message or perform a different augmentation of the user activity. Thus, in this example, the augmentation system 280 has received the attribute data associated with the user (e.g., the user 3960), authenticated the identity of the user with respect to the real-time data (e.g., by detecting a portable device corresponding to the user 3960), identified the user activity (e.g., walking near the kitchen display 3980), and augmented the activity of the user according to the user setting (e.g., presenting the personalized message to the user 3960 on the kitchen display 3980).
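The proximity-based authentication described above might be sketched as follows; the registered-device table and device identifiers are assumptions for illustration and are not part of the disclosed system.

```python
# Hypothetical sketch: a user is authenticated when a device registered to
# that user is detected in proximity to the display. IDs are invented.

REGISTERED_DEVICES = {"watch-17": "user-3960", "ring-08": "user-3960"}

def authenticate_by_proximity(detected_device_ids):
    """Return the user ID if any detected device is registered, else None."""
    for device_id in detected_device_ids:
        if device_id in REGISTERED_DEVICES:
            return REGISTERED_DEVICES[device_id]
    return None

print(authenticate_by_proximity(["ring-08", "unknown-99"]))  # user-3960
print(authenticate_by_proximity(["unknown-99"]))             # None
```

The returned user ID could then key the lookup of the personalized message for that particular user, so a different user near the same display would receive a different message.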
[00312] In yet another example embodiment, the activity module 3020 may detect the user may be in a vicinity of the lamp 3930 (e.g., the user 3960 is in the same room as the lamp 3930) via location tracking of the user 3960 (e.g., a wearable device on the user's person). In response to the activity module 3020 detecting the user 3960 near the lamp 3930, the activity module 3020 may augment the environment of the user 3960 by switching on the lamp 3930 or changing a brightness of the lamp 3930 (e.g., the lamp 3930 may be a smart lamp operable to execute various commands, or the lamp 3930 may be coupled to a smart outlet operable to control various functions of the lamp 3930). The activity module 3020 may adjust the operation of the lamp 3930 according to the user setting corresponding to the lamp 3930 as determined by the settings module 3030 (e.g., adjusting the brightness of the lamp 3930 according to historical brightness of the lamp 3930 corresponding to the user 3960).
[00313] FIG. 40 depicts an example user interface to facilitate augmentation of a user activity, according to another embodiment. In example embodiments, the activity module 3020 may identify the user activity, such as the user moving into a particular location (e.g., the kitchen). In some embodiments, the activity module 3020 may augment the activity of the user by presenting a user interface to the user. An example mobile device 4000 displaying an example notification 4010 is shown. The activity module 3020 may present notifications, such as the notification 4010, to the user that are relevant within the context of the user activity. For instance, if the user has just entered the kitchen, it may be a good time to provide the user with information relevant to the kitchen, such as kitchen supplies. In an example embodiment, the attribute data may include data received from a smart refrigerator (e.g., the smart refrigerator 3970) indicating a food supply. The activity module 3020 may augment the activity of the user by presenting a notification 4010 to the user regarding the food supply. In this example embodiment, the user may interact with the notification 4010 using user interface elements 4020 (e.g., place an order for the item or dismiss the notification).

[00314] FIG. 41 is a flow diagram 4100 illustrating various communications to facilitate the method of FIG. 31, according to example embodiments. At operation 4110, the attribute source 4102 may communicate attribute data to the data mesh system 150. As described above with reference to FIG. 31, the data mesh system 150 may receive the attribute data at operation 3110, authenticate the identity of the user at the operation 3130, identify the user activity at operation 3140, and augment the user activity at the operation 3150.
[00315] The data mesh system 150 may augment the user activity by communicating with a user device 4106. At operation 4120, the user device 4106 may facilitate augmentation of the user activity. For instance, the data mesh system 150 may communicate instructions to the user device 4106 to present a particular user interface (e.g., a notification), and the user device 4106 may present the user interface in response.
[00316] At operation 4130, the user device 4106 may communicate data associated with a user action responsive to the user activity. The data associated with the user action responsive to the augmentation of the user activity may indicate whether the augmentation is desired by the user. For instance, if the augmentation included presenting a notification to the user and the user dismisses the notification, then the settings module 3030 may use that as a basis for determining the user setting in subsequent analysis.

[00317] Similarly, at operation 4140, the attribute source 4102 may communicate attribute data associated with the user action responsive to the augmentation of the user activity. For instance, the data mesh system 150 may request, retrieve, or otherwise obtain the attribute data associated with the user action from the attribute source 4102 responsive to the augmentation of the user activity. The attribute source 4102 may communicate the attribute data that indicates whether the augmentation was desired by the user.
[00318] At operation 4150, the settings module 3030 may infer an augmentation result from the data received from the user device 4106 and the attribute source 4102. The settings module 3030 or the characteristic module 240 may identify a user action of the user based on the real-time data, the user action being in response to the augmented user activity or environment. For instance, the user action may include dismissing a notification or interacting with a user interface. The augmentation result may indicate whether the augmentation was desired by the user. For instance, the attribute data may include engagement data that indicates the user's engagement in a particular activity, such as clicking on user interface elements. The settings module 3030 may infer the augmentation result based on the engagement data or other data. In further example embodiments, the activity module 3020 may store the augmentation result in a storage device such as the databases 126 to be used subsequently to determine the user setting. Thus, the settings module 3030 may evolve the user settings to better fit the user over time as more data associated with augmentations is received.
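One possible reading of this feedback loop is that each augmentation result (engagement versus dismissal) nudges a stored preference score, so the user settings evolve over time. The preference scale and learning rate below are invented for this sketch and are not stated in the disclosure.

```python
# Hypothetical sketch of evolving a user setting from augmentation results.
# The 0..1 preference scale and the rate of 0.2 are illustrative assumptions.

def update_setting(settings, key, engaged, rate=0.2):
    """Move a preference score toward 1.0 on engagement, 0.0 on dismissal."""
    current = settings.get(key, 0.5)
    target = 1.0 if engaged else 0.0
    settings[key] = current + rate * (target - current)
    return settings

prefs = {"kitchen_notifications": 0.5}
update_setting(prefs, "kitchen_notifications", engaged=False)  # user dismissed
update_setting(prefs, "kitchen_notifications", engaged=False)  # dismissed again
print(round(prefs["kitchen_notifications"], 2))  # 0.32
```

Repeated dismissals drive the score down, which a system like this could use to suppress future notifications of that kind.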
[00319] Turning now to another example embodiment, FIG. 42 is a block diagram of the visualization system 290, which may provide functionality to analyze the attribute data and generate a visualization based on the attribute data. The visualization system 290 may include an analysis module 4210, a commerce module 4220, and a visualization module 4230.
[00320] The analysis module 4210 in the visualization system 290 may perform various analyses to facilitate the functionality described herein. For example, the analysis module 4210 may determine satisfaction of a reward criteria associated with the attribute data. In this example, the reward criteria may include a fitness goal and the analysis module 4210 may determine whether the user has satisfied the fitness goal based on an analysis of the attribute data. Many other rewards and analyses may be performed by the analysis module 4210.
[00321] The commerce module 4220 may identify items from an e-commerce platform (e.g., the publication system 142). The items (e.g., item listings on an e-commerce website) are intended to include products, services, activities, and the like. The commerce module 4220 may also retrieve item data associated with the identified items such as item price, seller, item location, seller location, item images, item description, and so on. In some example embodiments, the commerce module 4220 may facilitate the purchase, by the user, of the identified items.

[00322] The visualization module 4230 may generate a visualization based, at least in part, on the attribute data. The visualization may be representative of the attribute data. For example, the visualization module 4230 may generate an avatar that is representative of the attribute data. For example, the attribute data may indicate demographic data corresponding to the user such as gender, age, height, and so on. The visualization module 4230 may generate the avatar based on the demographic data such as an avatar of the same gender and similar age, height, and so forth. The presentation module 210 may subsequently cause presentation of the generated visualization to the user.

[00323] FIG. 43 is a flow diagram illustrating an example method 4300 for generating the visualization, according to some example embodiments. The operations of the method 4300 may be performed by components of the data mesh system 150 and the visualization system 290. At operation 4310, the attribute module 230 may receive the attribute data associated with the user from a plurality of attribute sources. As will be discussed in connection with FIGS. 52 and 53, the attribute data may be received from a broad spectrum of attribute sources (e.g., devices, sensors, servers, databases, and other sources).
Additionally, the attribute module 230 may receive the attribute data via many pathways resulting from an assortment of configurations of the attribute sources as further discussed in connection with FIGS. 51A and 51B. In an example embodiment, the attribute module 230 may receive the attribute data directly from the attribute sources. In other example embodiments, the attribute module 230 may receive the attribute data from a central device that receives attribute data from a plurality of user devices. In still other example embodiments, various user devices may be communicatively coupled in a decentralized device-to-device mesh, and the attribute module 230 may receive the attribute data corresponding to a particular device in the mesh from any of the devices in the mesh. The attribute module 230 may receive the attribute data from the attribute sources in many other configurations including various suitable combinations of configurations.
[00324] In various example embodiments, the attribute module 230 may store the attribute data in association with the user (e.g., indexed based on a user identifier) for subsequent analysis. The attribute module 230 may store the attribute data in a storage device such as the database(s) 126, for example. The attribute module 230 may access the stored attribute data using a variety of search or find schemes. For instance, the attribute data associated with a particular user may be accessed using a user identifier that corresponds to the particular user. It will be noted that the collective, aggregated attribute data may be referred to as a "data mesh."

[00325] In various example embodiments, at least a portion of the attribute data may include real-time data or near real-time data. The term "real-time data," as used herein, is intended to include data associated with an event currently happening. For example, the real-time data may include user input data or sensor data communicated to the attribute module 230 after a delay interval (e.g., due to transmission delay or other delays such as being temporarily stored at an intermediate device) between capturing the data and the attribute module 230 receiving the data.
[00326] At operation 4320, the characteristic module 240 may infer or measure directly a user characteristic or multiple user characteristics based on an analysis of at least a portion of the attribute data. As discussed above in connection with operation 610, the characteristic module 240 can infer a variety of user characteristics from the attribute data using a variety of data. It will be understood that the discussion of operation 610 regarding the characteristic module 240 applies equally to operation 4320.

[00327] In a specific example, the characteristic module 240 may infer the user's physical dimensions based on the attribute data that may include purchase history. For instance, the characteristic module 240 may use demographic information such as age, gender, or location to filter clothing purchases included in the purchase history (e.g., filtering to identify clothing purchases intended for the user). Based on the filtered clothing purchase history, the characteristic module 240 may identify the user's physical dimensions based on the garment sizes of the clothing purchases. In another specific example, the characteristic module 240 may infer the user's fitness level based on fitness tracking software included in a mobile device of the user. Thus, in these specific examples, the characteristic module 240 may infer various physical characteristics or traits of the user based on the attribute data.
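The purchase-history inference at operation 4320 might look like the following sketch: filter clothing purchases by demographic fit, then take the most frequent garment size. The record fields and the ten-year age window are assumptions for the example.

```python
from collections import Counter

# Hypothetical sketch of inferring a garment size from a filtered
# purchase history. Record fields and filter window are invented.

def infer_garment_size(purchases, gender, age):
    """Most frequent size among clothing purchases matching the
    user's demographic profile."""
    relevant = [p["size"] for p in purchases
                if p["category"] == "clothing"
                and p["target_gender"] == gender
                and abs(p["target_age"] - age) <= 10]
    return Counter(relevant).most_common(1)[0][0] if relevant else None

history = [
    {"category": "clothing", "target_gender": "F", "target_age": 25, "size": "M"},
    {"category": "clothing", "target_gender": "F", "target_age": 30, "size": "M"},
    {"category": "clothing", "target_gender": "M", "target_age": 28, "size": "L"},
]
print(infer_garment_size(history, gender="F", age=22))  # M
```

Purchases intended for other household members fall out of the filter, which is the point of filtering before inferring.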
[00328] At operation 4330, the visualization module 4230 may generate a visualization based, at least in part, on the user characteristic. In some cases, the term "visualization" as used herein is intended to include visual and non-visual components of a presentation (e.g., an animation including audio cued to the animation). The term "visualization" is also intended to include static images, animations, and other forms of visual presentation.

[00329] In an example embodiment, the visualization may comprise a chart or graph that may indicate a metric associated with the attribute data. For instance, the metric associated with the attribute data may be a completeness metric that indicates completeness of the attribute data associated with the user. That is to say, the completeness metric may indicate a quantity of attribute data versus a goal quantity of attribute data or attainable quantity of attribute data (e.g., the completeness metric may indicate a quantity of the attribute data associated with the user is sixty percent of the way to a goal quantity of attribute data).
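The completeness metric described above can be illustrated with a simple set comparison; the attribute-source names below are invented for the example.

```python
# Hypothetical sketch of a completeness metric: quantity of attribute
# data on hand relative to a goal quantity. Source names are illustrative.

def completeness_metric(available_sources, goal_sources):
    """Fraction of the goal attribute sources the user has made available."""
    return len(set(available_sources) & set(goal_sources)) / len(goal_sources)

goal = ["mobile_sensors", "purchase_history", "social", "calendar", "fitness"]
have = ["mobile_sensors", "purchase_history", "calendar"]
print(completeness_metric(have, goal))  # 0.6, i.e., sixty percent of the goal
```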
[00330] In another example embodiment, the visualization may comprise an avatar that is representative of the user. For example, the avatar may be a person-like animation or image that may be intended to represent the user. The avatar does not necessarily need to resemble the user's physical qualities or personality traits. However, in some example embodiments, the avatar may be intended to include qualities or traits that are similar or the same as qualities or traits of the user. In other words, the avatar may be visually analogous to the user. The visualization module 4230 may determine an avatar trait based, at least in part, on the inferred user characteristic and include the avatar trait when generating the avatar. In some example embodiments, the user characteristic may comprise a physical characteristic of the user, and the avatar trait may comprise a representation of the physical characteristic. For example, the characteristic module 240 may infer various user characteristics such as physical dimensions of the user, demographic information, personality traits, and the like. In this example, the physical dimensions may indicate a person who may be six feet tall, the demographic information may indicate a gender of female and an age of twenty-two, and the personality traits may indicate an outlandish tendency. As such, the avatar in this example may resemble a six-foot-tall woman and may include apparel that is congruent with having an outlandish tendency. Thus, the avatar may visually exemplify various characteristics of the user.

[00331] In various example embodiments, the visualization module 4230 may employ a variety of schemes and techniques to determine an avatar trait based, at least in part, on the inferred user characteristic. In an example embodiment, the analysis module 4210 may identify similar users that are similar to the user based on a variety of factors.
In some example embodiments, the analysis module 4210 may access the attribute data or stored user characteristics corresponding to a plurality of other users. For example, the analysis module 4210 may identify the similar users from among the plurality of other users that are similar to the user based on the inferred user characteristics of the user and respective user characteristics of the plurality of other users. The analysis module 4210 may correlate, match, or otherwise compare the inferred user characteristics with respective user characteristics of the plurality of other users to identify the similar users. In various example embodiments, the analysis module 4210 may identify the similar users based on same or similar demographic data (e.g., same or similar age, gender, location, etc.), same or similar user characteristics (e.g., same or similar brand purchases), same or similar attribute data, and so on. For instance, the analysis module 4210 may correlate the inferred user characteristics with respective user characteristics of other users to identify the similar users.

[00332] Subsequent to the analysis module 4210 identifying the similar users, the visualization module 4230 may extract common characteristics from among the identified similar users. The visualization module 4230 may generate the visualization based on the extracted common characteristics. In the example above, the analysis module 4210 may identify particular similar users that are associated with being outlandish. Continuing with this example, the visualization module 4230 may extract the common characteristics (e.g., a particular style of clothing or brand) from among the identified plurality of users. For instance, the common characteristic may be wearing a particular clothing color, style, brand, and so on.
The visualization module 4230 may generate or render the avatar to include a particular avatar trait that corresponds to the common characteristic (e.g., wearing a particular clothing brand).
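A sketch of the similar-user identification and common-characteristic extraction described above, under the assumption (invented for this example) that similarity means agreement on at least two demographic fields:

```python
from collections import Counter

# Hypothetical sketch: find demographically similar users, then extract
# the trait most common among them. Profile fields are illustrative.

def similar_users(user, others, keys=("gender", "age_band", "location")):
    """Users agreeing with `user` on at least two of the given keys."""
    return [o for o in others
            if sum(o[k] == user[k] for k in keys) >= 2]

def common_trait(users, trait_key="clothing_brand"):
    """Most frequent value of a trait among the given users."""
    counts = Counter(u[trait_key] for u in users)
    return counts.most_common(1)[0][0] if counts else None

me = {"gender": "F", "age_band": "20s", "location": "AU"}
others = [
    {"gender": "F", "age_band": "20s", "location": "US", "clothing_brand": "A"},
    {"gender": "F", "age_band": "20s", "location": "AU", "clothing_brand": "A"},
    {"gender": "M", "age_band": "40s", "location": "AU", "clothing_brand": "B"},
]
print(common_trait(similar_users(me, others)))  # A
```

The extracted trait (here a clothing brand) is what the avatar renderer would then be asked to depict.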
[00333] In further example embodiments, the visualization module 4230 may apply weighting to the inferred user characteristics and the extracted common characteristics in various schemes to generate the visualization based on the inferred user characteristics or the extracted common characteristics. For example, a particular user characteristic that may be inferred from particular attribute data that corresponds to a time further in the past may be weighted less heavily than a particular user characteristic inferred from particular attribute data that is more recent, the reasoning being that more recent data may be more relevant or pertinent to the objective of generating the visualization in a manner that accurately reflects the user or the attribute data. The visualization module 4230 may apply weighting using many other schemes, and the above is merely a non-limiting example.

[00334] In still further example embodiments, the visualization module 4230 may generate the visualization based, at least in part, on the real-time data included in the attribute data. For example, the characteristic module 240 may infer the user characteristics based, at least in part, on the real-time data, and the visualization module 4230 may generate the visualization based on the user characteristics inferred from the real-time data. Thus, the visualization may be reflective of a current status of the user. In a specific example, the characteristic module 240 may infer that the user may currently be jogging vigorously at a park. The visualization module 4230 may, for example, generate the visualization, such as the avatar, to include a feature of perspiration indicative of the user currently performing a vigorous physical activity. Thus, the visualization may be representative of a real-time status of the user.
In a further example, the characteristic module 240 may infer an outfit the user is currently wearing (e.g., inferred from attribute data that may include detections from smart tags embedded into the user's clothing) and the visualization module 4230 may generate the avatar to include a representation of the inferred outfit.
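The recency weighting described above might be modeled as exponential decay; the 30-day half-life is an assumed parameter, not taken from the disclosure.

```python
# Hypothetical sketch of recency weighting: evidence inferred from older
# attribute data contributes less than recent evidence.

def recency_weight(age_days, half_life_days=30.0):
    """Exponential decay: the weight halves every `half_life_days`."""
    return 0.5 ** (age_days / half_life_days)

def weighted_score(observations):
    """Combine (value, age_days) observations into one weighted average."""
    total_w = sum(recency_weight(age) for _, age in observations)
    return sum(v * recency_weight(age) for v, age in observations) / total_w

# A characteristic observed today dominates the opposite observation a
# year old, so the result sits very close to 1.0.
print(weighted_score([(1.0, 0), (0.0, 365)]))
```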
[00335] At operation 4340, the presentation module 210 may cause presentation of the visualization to the user. For example, the visualization may comprise an avatar and the presentation may be displaying the avatar on a screen. The presentation module 210 causing presentation of the visualization may include communicating the visualization, or instructions to present the visualization, to a device of the user operable to present the visualization to the user. In further example embodiments, the presentation module 210 may cause the presentation of the visualization to other users. For instance, the user may be associated with a profile, and viewers of the user's profile may also view the visualization. In other example embodiments, the user may be associated with contact users that have a connection with the user such as a social media-based relationship. In this example embodiment, the visualization may be presented to the contact users.

[00336] FIG. 44 is a flow diagram illustrating further example operations 4400 of the example method 4300 of FIG. 43, according to some example embodiments. Subsequent to the operation 4340, at operation 4410, the presentation module 210 may receive user input indicating a change to the visualization. For example, the user input may indicate that the visualization is based on user characteristics or attribute data that is not reflective of the user. In this example, the visualization may be an avatar, and the physical characteristics of the avatar may not be reflective of the user (e.g., the avatar is too short as compared to the user).

[00337] At the operation 4420, the visualization module 4230 may update the visualization according to the change indicated by the user input.
In the example above, if the user input indicated that the avatar was too short, the visualization module 4230 may generate or render the avatar with a taller height.

[00338] In further example embodiments, the attribute module 230 may update or modify the attribute data according to the user input. For instance, if the user input indicates demographic data (e.g., age) other than that currently associated with the user, the attribute module 230 may update the demographic information according to the user input.
[00339] In still further example embodiments, the characteristic module 240 may infer the user characteristics based on an analysis of the attribute data and the user input. For instance, if the user input indicates a particular clothing style, color, brand, and so forth, the characteristic module 240 may use that user input as a basis for inferring the user characteristics in conjunction with the attribute data.

[00340] FIG. 45 is a flow diagram illustrating an example method 4500 for determining satisfaction of a reward criteria and providing a reward to the user, according to some example embodiments. The operations of the method 4500 may be performed by components of the data mesh system 150 and the visualization system 290. At operation 4510, the analysis module 4210 may determine satisfaction of the reward criteria associated with the attribute data.
The reward criteria may include a variety of criteria.
[00341] In an example embodiment, the reward criteria may include a criterion based on the completeness metric. In an example embodiment, the analysis module 4210 may determine the completeness metric based on an analysis of the attribute data. The completeness metric may indicate a quantity of attribute data available to the data mesh system 150. In some example embodiments, the completeness metric may indicate the quantity of attribute data in comparison to a goal quantity of attribute data or attainable quantity of attribute data (e.g., the completeness metric may indicate a quantity of the attribute data associated with the user is sixty percent of the way to a goal quantity of attribute data). For instance, the user may have provided the attribute data, permission to access portions of the attribute data, or consent to access portions of the attribute data via the management module 250 (e.g., the user may have provided permission to the management module 250 to allow the attribute module 230 to access mobile sensor data but not social networking data). In this instance, the completeness metric may indicate that portions of the attribute data may not be available to the attribute module 230. The analysis module 4210 may determine satisfaction of the criterion based on the completeness metric if the completeness metric exceeds a threshold. The threshold may be predefined or dynamically determined by the analysis module 4210 based on various statistical analyses.
[00342] In a further example embodiment, the completeness metric may be associated with a specified type of attribute data. In this further example embodiment, the analysis module 4210 may determine that the criterion based on the completeness metric is satisfied if the user provides the specified type of attribute data or permission to access the specified type of attribute data.
[00343] In another example embodiment, the reward criteria may include a criterion based on a quality metric. In this example embodiment, the analysis module 4210 may determine the quality metric based on an analysis of the attribute data. The quality metric may indicate a relevance or pertinence of the attribute data. For instance, older attribute data may be less relevant than newer attribute data. In an example embodiment, the quality metric may be higher for new attribute data and lower for older attribute data. Thus, a particular user associated with attribute data that is constantly updated may be associated with a higher quality metric. The analysis module 4210 may determine satisfaction of the reward criteria that includes the criterion based on the quality metric exceeding a threshold. That is to say, the analysis module 4210 may determine satisfaction of the reward criteria that includes a particular criterion based on the quality metric by providing recent data, for example. The threshold may be predefined or dynamically determined by the analysis module 4210 based on various statistical analyses.
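One way to read the quality-metric criterion: freshness decays with record age, and the criterion is met when the average freshness exceeds a threshold. The decay curve and the threshold value are assumptions for this sketch.

```python
# Hypothetical sketch of a quality-metric criterion: newer attribute data
# scores higher, and the criterion is satisfied above a threshold.

def quality_metric(record_ages_days):
    """Average freshness score: 1.0 for brand-new data, toward 0 as it ages."""
    return sum(1.0 / (1.0 + age / 30.0) for age in record_ages_days) / len(record_ages_days)

def satisfies_quality_criterion(record_ages_days, threshold=0.5):
    """True when the aggregate freshness of the data exceeds the threshold."""
    return quality_metric(record_ages_days) >= threshold

print(satisfies_quality_criterion([0, 15, 30]))  # True  (fresh data)
print(satisfies_quality_criterion([300, 600]))   # False (stale data)
```

A user whose attribute data is constantly updated keeps the record ages small and thus the metric high, matching the "constantly updated" case in the text.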
[00344] In yet another example embodiment, the reward criteria may include a criterion associated with completing a task. For instance, the task may include the user recommending or communicating (e.g., via email or text message) to other users regarding a product or application. The presentation module 210 may facilitate the user in performing the task (e.g., automatically determining available contacts that may be contacted and providing a predetermined message that may be sent via a user interface provided by the presentation module 210). In other instances, the task may include a specified goal. In this instance, the goal may, for example, be a fitness goal such as a number of steps taken in a day (e.g., as determined by a pedometer app executing on a mobile device of the user). Continuing with this instance, the analysis module 4210 may determine satisfaction of the reward criteria including a criterion based on a number of steps taken if the user exceeds a threshold number of steps.
[00345] In further example embodiments, the analysis module 4210 may determine satisfaction of the reward criteria including various criteria (e.g., a criterion based on the completeness metric or the quality metric) by comparing metrics associated with the user to those of other users. As described above, the analysis module 4210 may identify the similar users that are similar to the user based on a variety of factors. The analysis module 4210 may determine satisfaction of the reward criteria by comparing the various metrics associated with the user to the various metrics associated with the similar users. In a specific example, the similar users may include users that may have the same or similar demographic data (e.g., age, gender, location). Among these similar users, the analysis module 4210 may determine an average completeness metric or another statistic-based value. The analysis module 4210 may compare the completeness metric of the user to the average completeness metric or another statistic-based value to determine satisfaction of a particular criterion associated with the completeness metric of the user (e.g., the user may satisfy the reward criteria by being above average as compared to similar users). Similarly, the analysis module 4210 may compare a fitness goal of a user to the similar users that may be of a similar fitness level to the user. The analysis module 4210 may employ many other comparisons of the similar users or other users to determine satisfaction of the reward criteria. Thus, in some example embodiments, the analysis module 4210 may determine satisfaction of the reward criteria based on the attribute data associated with the identified similar users.
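The similar-user comparison can be sketched as a check against the group average. The function name and the strict above-average rule are illustrative assumptions; the description allows any statistic-based value, not only the mean.

```python
from statistics import mean

# Sketch of determining criterion satisfaction by comparing the user's
# completeness metric against the average among identified similar users.
def above_similar_users(user_metric, similar_user_metrics):
    if not similar_user_metrics:
        return False  # no comparison group available
    return user_metric > mean(similar_user_metrics)
```

The same comparison shape applies to other metrics, such as comparing a step count against similar users of a similar fitness level.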
[00346] At the operation 4520, the analysis module 4210 may provide a reward to the user based on the determined satisfaction of the reward criteria.
The reward may include additional visualization features or functions. For instance, the reward may include providing the user the ability to further customize the visualization (e.g., modify apparel of the avatar). In another instance, the reward may provide the user with additional features such as the ability to share the visualization with other users. The reward may include many other features and functions related to the visualization.
[00347] In further example embodiments, the reward may include a coupon, deal, or other incentive. The reward may incentivize the user to provide consent, permission, or access to additional attribute data, provide higher quality, more relevant attribute data, complete various marketing tasks, complete various goals (e.g., a fitness goal), and so forth.
[00348] FIG. 46 is a flow diagram illustrating additional operations of the method 4300 according to some example embodiments. Subsequent to the operation 4320, at the operation 4330 the visualization module 4230 may generate the visualization based at least in part on the user characteristics. In addition, at operation 4610, the commerce module 4220 may identify an item listing based on the user characteristics. For instance, the user characteristics may indicate the user's preferences for clothing, electronics, and so on. Further, the attribute data may include purchase history data that the commerce module 4220 may use to determine products already owned by the user. By analyzing this information, the commerce module 4220 may identify an item listing of interest to the user (e.g., an item listing on an e-commerce website). The commerce module 4220 may employ a variety of schemes and techniques using the user characteristics and the attribute data to identify the item listing.
[00349] At the operation 4620, the visualization module 4230 may generate the visualization including the identified item associated with the identified item listing. For example, the visualization generated by the visualization module 4230 may include the avatar that may be representative of the user. In this example, the visualization module 4230 may generate the avatar to include the avatar wearing or using a suitable item associated with the identified item listing. The commerce module 4220 may access item data associated with the identified item associated with the identified item listing. For example, the commerce module 4220 may access the item data that may include images of the item, physical dimensions of the item (e.g., garment sizes), and so forth. Based on the item data, the visualization module 4230 may generate the visualization to include a representation of the identified item.
The representation may be similar to the item in that it may include similar characteristics to the identified item. For example, the identified item may be a particular piece of apparel. In this example, the visualization module 4230 may render a representation of the piece of apparel that is of the same or similar dimensions, color, pattern, and so forth.
[00350] In further example embodiments, the item associated with the identified item listing may be highlighted or otherwise emphasized in the rendering of the avatar. In some example embodiments, the user may interact with the generated item rendering associated with the identified item listing included in the avatar (e.g., interacting with the item may cause presentation of the item listing offering the item for sale).
[00351] FIG. 47 is a flow diagram illustrating an example method 4700 for generating the visualization based on the attribute data according to some example embodiments. The operations of the method 4700 may be performed by components of the data mesh system 150 and the visualization system 290.
In an example embodiment, at operation 4710, attribute source 4702 may communicate the attribute data to the attribute source 4704. At the operation 4715, the attribute source 4704 may receive the attribute data from the attribute source 4702. At operation 4720, the attribute source 4704 may communicate the attribute data to the data mesh system 150. As discussed above in connection with FIG. 43, at the operation 4310, the data mesh system 150 may receive the attribute data from the attribute source 4704. In this example embodiment, the attribute data may be exchanged between the attribute source 4702 and the attribute source 4704. In this way, the data mesh system 150 may access various attribute data corresponding to a particular attribute source without directly communicating with the particular attribute source.
[00352] As discussed above in connection with FIG. 43, at the operation 4320, the characteristic module 240 may infer a user characteristic. At the operation 4330, the visualization module 4230 may generate the visualization based, at least in part, on the user characteristic. At the operation 4340, the presentation module 210 may cause presentation of the visualization to the user. The presentation module 210 may cause presentation of the visualization by communicating the visualization to the user device 4706. At operation 4725, the user device 4706 may present the visualization to the user. For example, the user device 4706 may be a mobile device of the user, and the presentation may be displaying the visualization on a screen of the mobile device. Subsequent to presenting the visualization to the user, at operation 4730, the user device 4706 may receive user input from the user. In some example embodiments, the user input may result from an interaction with the presented visualization.
At the operation 4735, the user device 4706 may communicate the user input to the data mesh system 150. For example, the user input may be received by the presentation module 210 of the data mesh system 150.
[00353] As discussed above in connection with FIG. 44, at the operation 4410, the presentation module 210 may receive the user input indicating a change to the visualization. At the operation 4420, the visualization module 4230 may update the visualization according to the change indicated by the user input. Thus, FIG. 47 has shown various communications or interactions between devices according to some example embodiments.
[00354] FIGS. 48, 49, 50A, and 50B depict example user interfaces for presenting the visualization to the user. Although FIGS. 48, 49, 50A, and 50B depict specific example visualization and user interface elements, these are merely non-limiting examples and many other alternate visualization and user interface elements may be generated by the presentation module 210 and presented to the user. It will be noted that alternate presentations of the displays of FIGS. 48, 49, 50A, and 50B may include additional information, graphics, options, and so forth; other presentations may include less information, or may provide abridged information for easy use by the user.
[00355] FIG. 48 depicts an example device 4800 displaying an example user interface 4810 to present the visualization to the user. In an example embodiment, the visualization may be an avatar 4820 that is based on the inferred user characteristics. In a specific example, the approximate physical size of the user may be derived from purchase history data such as clothing sizes, user input (e.g., user input to a fitness app that requests the user's size to make various calculations), and so on. The user characteristics may include style characteristics extracted, derived, or inferred from the attribute data (e.g., type of clothing purchased, types of activities the user engages in, and so on). In other example embodiments, the avatar 4820 may be used as a virtual fitment gauge to determine how particular apparel may appear on the person. Although the visualization of FIG. 48 depicts the avatar 4820, many other varieties of visualization may be rendered by the visualization module 4230 and presented to the user by the presentation module 210.
[00356] In some example embodiments, the user may have provided interests and other information to the data mesh system 150, as depicted by user interface element 4830. In some example embodiments, the user may modify access permission to the user information, for example, by activating user interface element 4840. The user may also edit or modify the attribute data, for example, by activating user interface element 4850. In further example embodiments, the user may be provided with recommendations that are based on an analysis of the attribute data or the user characteristics. For example, activating user interface element 4860 may display various personalized recommendations.
[00357] FIG. 49 depicts an example device 4900 displaying an example user interface 4910 that may present the visualization to the user. The example user interface 4910 may include recommended items or allow the user to provide the user input to alter the visualization. For example, user interface element 4930 may include a plurality of recommended items such as user interface element 4940. The user may activate (e.g., by dragging a user interface element onto the area occupied by the avatar 4920) a particular recommended item to indicate an interest in the particular recommended item. The visualization may be updated or otherwise modified responsive to the user activating the particular recommended item. For example, the recommended item may be visually incorporated into the visualization, such as the avatar 4920 being shown wearing the recommended item when suitable. In further example embodiments, the user may provide indications of interests and other information. For instance, user interface element 4950 may include a plurality of user interests such as interest 4960. In an example embodiment, the user may select an interest from a plurality of interests. Based on the selected interest, the visualization module 4230 may modify the visualization. In still further example embodiments, the characteristic module 240 may incorporate the selected interest into the analysis to determine the user characteristics.
[00358] FIG. 50A depicts an example device displaying an example user interface 5010 to present the visualization to the user. Similarly, FIG. 50B depicts an example device displaying an example user interface 5050 to present the visualization to the user. In an example embodiment, as more attribute data is provided to the data mesh system 150, the visualization module 4230 may render a more detailed and accurate visualization representative of the attribute data. For example, the user may be a college-aged male with an athletic build. In this example, avatar 5020 of FIG. 50A may be a less detailed and less accurate representation of the user than avatar 5060 of FIG. 50B. The avatar 5020 may be an initial representation of the attribute data, and the avatar 5060 may be a subsequent representation of the attribute data after the data mesh system 150 receives more attribute data from the user, allowing the visualization system 290 to more accurately represent the user.
[00359] FIGS. 51A and 51B depict example configurations for communicatively coupling attribute sources, according to some example embodiments. The example embodiments described herein can access a vast and rich "Internet of Things" (IoT) dataset that is predominantly provided via connected, interconnected, or otherwise communicatively coupled machines and devices that can include a multitude of sensors. In example embodiments, devices and machines that provide the attribute data, such as the attribute sources, can be communicatively coupled in many different configurations. For instance, each attribute source is communicatively coupled to the networked system 102 independently to provide the networked system 102 access to the attribute data corresponding to each of the communicatively coupled attribute sources. FIGS. 51A and 51B depict alternative example attribute source configurations. It will be appreciated that FIGS. 51A and 51B are merely non-limiting examples of attribute source configurations and many other configurations or suitable combinations of configurations can be employed.
[00360] FIG. 51A depicts an example embodiment that includes attribute sources 5110 communicatively coupled in a decentralized device-to-device mesh. In this example embodiment, the attribute data corresponding to a particular device in the mesh may be received from any one or more of the devices in the mesh. For instance, the networked system 102 may access the attribute data corresponding to attribute source E via attribute source H or a combination of attribute sources H and I in FIG. 51A. In an example embodiment, the attribute source H or I may aggregate and store the attribute data corresponding to attribute sources A-F in FIG. 51A.
In some example embodiments, the networked system 102 may access the attribute data associated with attribute source E by communicating with attribute source H or I in FIG. 51A.
[00361] FIG. 51B depicts another example embodiment that may include attribute sources 5120 communicatively coupled to a central attribute source (e.g., attribute source H in FIG. 51B). The networked system 102 may access the attribute data associated with attribute sources A-G via the central attribute source in FIG. 51B. In some embodiments, the central attribute source may aggregate and store the attribute data received or accessed from the attribute sources A-G and provide a centralized access point for the attribute data associated with all, or some, of the communicatively coupled attribute sources A-G in FIG. 51B.
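The two coupling configurations of FIGS. 51A and 51B can be sketched with a minimal aggregator model. The class and function names below are illustrative assumptions, not elements of the figures: the point is only that the networked system can read one source's attribute data from another source that has aggregated it, without contacting the originating source directly.

```python
# Minimal sketch: an attribute source that aggregates and stores attribute
# data relayed from peer sources. In the mesh of FIG. 51A, any node may relay
# another node's data; in FIG. 51B, a single central source aggregates
# everything and serves as the access point.
class AttributeSource:
    def __init__(self, name):
        self.name = name
        self.store = {}  # attribute data keyed by originating source

    def receive(self, origin, data):
        self.store[origin] = data

def access_attribute_data(aggregator, origin):
    """Networked-system-side read through an aggregating source."""
    return aggregator.store.get(origin)

# Central configuration: H aggregates data originating from source E.
h = AttributeSource("H")
h.receive("E", {"ambient_temperature": 21.5})
```

Here the networked system reads source E's data from aggregator H; source E itself is never contacted.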
[00362] FIG. 52 depicts example sources 5200 including attribute sources 5210, according to some example embodiments. In various example embodiments, the attribute data may include data received, retrieved, or accessed from the attribute sources 5210. For example, the attribute sources 5210 may provide data including everything from a moisture level of a houseplant to a dribbling rhythm of a basketball. In some embodiments, the attribute data corresponding to the attribute sources 5210 may be received or accessed in real time or near real time. For instance, the attribute sources 5210 may communicate or otherwise provide access to the attribute data as it becomes available. In example embodiments, the attribute sources 5210 may include user device sources 5220, user data sources 5230, transportation sources 5240, materials sources 5250, third party sources 5260, home sources 5270, and a variety of other sources. As will be discussed in connection with FIG. 53, the attribute sources 5210 may be associated with a wide variety of sensors, gauges, measurement components, and other components.
[00363] In an example embodiment, the attribute data may include data corresponding to the user device sources 5220. The user device sources 5220 may include such non-limiting examples as a personal computer (PC), a tablet computer, a laptop computer, a netbook, a set-top box (STB), a personal digital assistant (PDA), an entertainment media system, a cellular telephone, a smart phone, a mobile device, a wearable device (e.g., a smart watch), a smart home device (e.g., a smart appliance), and other smart devices. As will be discussed further in connection with FIG. 53, the attribute data corresponding to the user device sources 5220 may include data associated with sensors, gauges, or other measurement components such as environmental sensor data (e.g., ambient temperature data associated with an environment of the user), biometric sensor data (e.g., heart rate data of the user), detection data (e.g., detection of a Near Field Communication (NFC) beacon), motion data (e.g., acceleration data), position data (e.g., location as determined by a GPS of a mobile device), and so forth.
[00364] In further example embodiments, the attribute data corresponding to the user device sources 5220 includes data such as device type, device model, device name, a unique device identifier, and other device parameters. In some example embodiments, the device type data provides a basis for an inference associated with the attribute data. For instance, if the device type data indicates that the device is a mobile device of the user, location data corresponding to the mobile device may indicate the location of the user. Similarly, if the device type is a media entertainment system, the attribute data corresponding to the media entertainment system may be associated with a home of the user.
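The device-type inference described here can be sketched as a simple dispatch on the device type data. The dictionary keys and type strings are assumptions for illustration; only the two inferences come from the paragraph above.

```python
# Sketch: device type data provides the basis for the inference. Location
# data from the user's mobile device is treated as the user's location, while
# data from a media entertainment system is associated with the user's home.
def infer_from_device(device):
    if device.get("device_type") == "mobile":
        return {"user_location": device.get("location")}
    if device.get("device_type") == "media_entertainment_system":
        return {"associated_with": "home"}
    return {}
```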
[00365] The user data sources 5230 include, for example, calendars (e.g., user calendar events such as birthdays, trips, exams), user profiles (e.g., demographic information such as age, gender, income level), purchase histories, browse histories (e.g., search terms), social media content (e.g., check-ins, posts, connections), or other user data (e.g., bookmarked websites, preferences or settings for various applications, application usage data such as time spent using a particular application). The attribute data corresponding to the user data sources 5230 is stored, for example, by the user device sources 5220 (e.g., a mobile device that includes a mobile browser with browse history of the user), application server(s) 140 (e.g., payment history of the user stored in payment system(s) 144, user profiles stored by an e-commerce website), or the third party server(s) 130 (e.g., social media data stored in a social networking service). For instance, the attribute data corresponding to the user device sources 5220 includes device resource data. In some implementations, the device resource data includes files stored on the devices (e.g., digital media or apps) or metadata associated with the files (e.g., the number of times a particular song has been played or usage time corresponding to a particular app).
[00366] As cars and other forms of transportation become increasingly equipped with sensors and the ability to communicate, a vast amount of data may be provided by the transportation sources 5240. For example, the attribute data corresponding to the transportation sources 5240 may include acceleration data, velocity data, and other sensor data (e.g., brake pad wear data, gear shifting data, miles driven). In this example, the attribute data corresponding to the transportation sources 5240 may provide indications of a user's driving patterns and styles (e.g., comes to a complete stop at a stop sign, speeds, or finicky use of the brakes).
[00367] The materials sources 5250, such as clothing and structures, are also increasingly gaining the ability to capture data. In various example embodiments, the attribute data may include data corresponding to the materials sources 5250. For example, clothing may be embedded with sensors to detect motion. Data from these sensors may provide indications of whether the user is active or inactive. In another example, clothing may be embedded with biometric sensors that may provide a continuous feed of biometric data corresponding to the user. The biometric data may provide indications of the user's health, athletic ability, and many other characteristics corresponding to the user. Similarly, structures may be equipped with sensors to passively or actively monitor the surrounding environment (e.g., street cameras, traffic cameras, and other sensors).
[00368] In example embodiments, the attribute data may include data associated with the third party sources 5260. The third party sources 5260 may also provide an abundance of data associated with the user. For instance, the attribute data may include data accessed from government websites or other public records that may provide criminal histories, civil citation histories, credit histories, or other publicly available information.
[00369] In various embodiments, a smart home is a house, office, or other environment of the user with one or more smart devices integrated throughout. Nearly every facet of the smart home may provide data associated with the user (e.g., via the smart devices acting as sensors to provide various data). In some implementations, the attribute data includes data corresponding to the home sources 5270. For instance, the home sources 5270 may include smart appliances, consumables, utilities, and many other smart devices.
In a few specific instances, the attribute data may include consumable inventories and consumption rates of various consumable goods (e.g., perishables such as milk or bread) tracked, monitored, or otherwise observed by smart refrigerators. In another instance, the attribute data may include utility usage data (e.g., electricity, water). Analysis of the utility usage data may indicate patterns or a status of the user, such as the user being on vacation, the user being ill (e.g., increasing house thermostat set temperature to cope with a cold), the user being an energy conscious consumer, and so on.
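The vacation inference from utility usage data might look like the following sketch. The trailing-baseline approach, the window of three days, and the forty-percent threshold are all assumptions for illustration; the description only says that usage patterns may indicate a status such as the user being on vacation.

```python
from statistics import mean

# Sketch: a sustained drop in daily electricity usage relative to the
# trailing baseline may indicate a status such as the user being on vacation.
def likely_on_vacation(daily_kwh, recent_days=3, drop_ratio=0.4):
    if len(daily_kwh) <= recent_days:
        return False  # not enough history to form a baseline
    baseline = mean(daily_kwh[:-recent_days])
    recent = mean(daily_kwh[-recent_days:])
    return recent < drop_ratio * baseline
```

Analogous rules could flag other statuses, such as a raised thermostat set point suggesting illness.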
[00370] Referring now to FIG. 53, example diagram 5300 depicts non-limiting example I/O components 5310 that may provide attribute data, according to some example embodiments. In example embodiments, the I/O components 5310 include input components 5320, output components 5330, environmental components 5340, motion components 5350, position components 5360, biometric components 5370, communication components 5380, detection components 5390, and a wide gamut of other sensors, gauges, and measurement components not shown in FIG. 53. The I/O components 5310 or a suitable combination of the I/O components 5310 may be included in any suitable device or machine such as those included in the attribute sources 5210 depicted in FIG. 52 to facilitate the functionality described herein.
[00371] The I/O components 5310 may receive, detect, measure, capture, or otherwise obtain sensor data associated with physical properties, attributes, or characteristics. The I/O components 5310 may provide, produce, transmit, or otherwise communicate the sensor data or other indications associated with the physical properties, attributes, or characteristics (e.g., a sensor included in a device operable to communicate the sensor data to the networked system 102). In some implementations, a combination of devices may be employed to provide the sensor data (e.g., a first device that includes a sensor and is communicatively coupled to a second device that communicates sensor data received from the first device to the networked system 102). As a result, the sensor data provided by the I/O components 5310 may be accessible to all, or some, of the modules described above on a real-time or near real-time basis. The I/O components 5310 are grouped according to functionality merely for simplifying the following discussion and the grouping is in no way limiting.
[00372] The input components 5320 include alphanumeric input components (e.g., a keyboard, a touch screen configured to receive alphanumeric input, a photo-optical keyboard, or other alphanumeric input components), point-based input components (e.g., a mouse, a touchpad, a trackball, a joystick, a motion sensor, or other pointing instrument), tactile input components (e.g., a physical button, a touch screen that provides location and force of touches or touch gestures, or other tactile input components), audio input components (e.g., a microphone), and the like. In some implementations, the input components 5320 receive input from the user to facilitate the functionalities described herein. For instance, the user may interact with a user interface using the input components 5320.
[00373] The output components 5330 include visual components (e.g., a display such as a plasma display panel (PDP), a light emitting diode (LED) display, a liquid crystal display (LCD), a projector, or a cathode ray tube (CRT)), acoustic components (e.g., speakers), haptic components (e.g., a vibratory motor), other signal generators, and so forth. The output components 5330 may present information to the user. For example, the output components 5330 may present a user interface to the user or present media files to the user.
[00374] The environmental components 5340 include illumination sensors (e.g., photometer), temperature sensors (e.g., one or more thermometers that detect ambient temperature), humidity sensors, pressure sensors (e.g., barometer), acoustic sensors (e.g., one or more microphones that detect background noise), proximity sensors (e.g., an infrared sensor that detects nearby objects), gas sensors (e.g., machine olfaction detection sensors, gas detection sensors to detect concentrations of hazardous gases for safety or to measure pollutants in the atmosphere), and so on. The environmental components 5340 may measure various physical parameters to provide an indication or signal corresponding to the physical environment surrounding the environmental components 5340.
[00375] The motion components 5350 include acceleration sensors (e.g., accelerometer), gravitation sensors, rotation sensors (e.g., gyroscope), and so forth. The motion components 5350 may provide motion data such as velocity, acceleration, or other force measurements along x, y, and z axes. In some implementations, the motion data is provided at a regular update rate or sampling rate (e.g., 10 updates per second) that may be configurable.
[00376] The position components 5360 include location sensors (e.g., a 15 Global Positioning System (GPS) receiver component), altitude sensors (e.g., altimeters or barometers that detect air pressure from which altitude may be derived), orientation sensors (e.g., magnetometers that provide magnetic field strength along the x, y, and z axes), and the like. In an example embodiment, the position components 5360 may provide position data such as latitude, longitude, 20 altitude, and a time stamp. Similar to the motion components 5350, the position components 5360 may provide the motion data at a regular update rate that may be configurable.
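A configurable update rate for motion or position readings can be sketched as a generator that pairs each reading with its nominal timestamp. The reading format and the no-sleep timestamping are illustrative simplifications, not details from the figure.

```python
# Sketch of a configurable-rate sensor feed: readings are paired with nominal
# timestamps at the configured update rate (10 updates per second by default).
def sample_readings(read_sensor, rate_hz=10, count=3):
    period = 1.0 / rate_hz
    for i in range(count):
        yield (i * period, read_sensor())
```

A caller could pass an accelerometer or GPS read function and adjust `rate_hz` to reconfigure the sampling rate described above.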
[00377] The biometric components 5370 include components to detect expressions, measure biosignals, or identify people, among other functions. For example, the biometric components 5370 include expression components to detect expressions (also referred to as "kinesics") such as hand gestures (e.g., an optical component to detect a hand gesture or a Doppler component to detect hand motions), vocal expressions (e.g., a microphone to detect changes in voice pitch that may indicate tension), facial expressions (e.g., a camera to detect expressions or micro-expressions of a person such as a smile), body gestures, and eye tracking (e.g., detecting the focal point of a person's eyes or patterns in eye movement). The biometric components 5370 may also include, for example, biosignal components to measure biosignals such as blood pressure, heart rate, body temperature, perspiration, and brain waves (e.g., as determined by an electroencephalogram). In further examples, the biometric components 5370 include identification components to identify people such as retinal scanners (e.g., a camera component), vocal detectors (e.g., a microphone to receive audio data for voice identification), facial detectors, fingerprint detectors, and electroencephalogram sensors (e.g., to identify a person via unique brainwave patterns).
[00378] Communication may be implemented using a wide variety of technologies. The I/O components 5310 may include communication components 5380 operable to communicatively couple machines or devices. For example, the communication components 5380 may include a network interface component or other suitable device to interface with a network (e.g., the network 104). In further examples, the communication components 5380 may include wired communication components, wireless communication components, cellular communication components, Near Field Communication (NFC) components, BLUETOOTH® components (e.g., BLUETOOTH® Low Energy), WI-FI® components, and other communication components to provide communication via other modalities. In addition, a variety of information may be derived using the communication components 5380, such as location via Internet Protocol (IP) geo-location, location via WI-FI® signal triangulation, location via detecting an NFC beacon signal that may indicate a particular location, and so forth.
[00379] The detection components 5390 provide functionality to detect a variety of identifiers. For example, the detection components 5390 include Radio Frequency Identification (RFID) tag reader components, Near Field Communication (NFC) smart tag detection components, optical reader components (e.g., an optical sensor to detect one-dimensional bar codes such as Universal Product Code (UPC) bar codes, multi-dimensional bar codes such as a Quick Response (QR) code, Aztec code, Data Matrix, Dataglyph, MaxiCode, PDF417, Ultra Code, Uniform Commercial Code Reduced Space Symbology (UCC RSS)-2D bar codes, and other optical codes), or acoustic detection components (e.g., microphones to identify tagged audio signals). In addition, a variety of information may be derived via various communication components, such as location via Internet Protocol (IP) geo-location, location via WI-FI® signal triangulation, location via detecting a NFC beacon signal that may indicate a particular location, and so forth.
[00380] FIG. 54 is a block diagram 5400 of an example data structure for the attribute data associated with a particular user according to example embodiments. In embodiments, the attribute data is associated with a plurality of users, such as users 5402, 5404, 5406, 5408, 5410, 5412, and 5414. In an embodiment, the attribute data is accessed for a particular user via a lookup using a user identifier. The attribute data includes, for example, profile data 5420, device data 5422, calendar data 5424, list data 5426, list type data 5428, interest data 5430, fitment data 5432, garment type data 5434, preference data 5436, measured dimension data 5438, fitness goal data 5440, reward data 5442, location data 5444, and other data not shown in FIG. 54. In some embodiments, the attribute data may be structured such that various portions of the attribute data are associated with other portions of the attribute data via relationships. For instance, the calendar data 5424 may include a calendar event associated with an event name, an event date, and an event location for the calendar event.
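As a non-authoritative sketch of this organization, the per-user record and its lookup by user identifier might be modeled as follows (the class names, field types, and sample values are illustrative assumptions, not part of the disclosure):

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class CalendarEvent:
    # Relationship described in the text: a calendar event is associated
    # with an event name, an event date, and an event location.
    event_name: str
    event_date: str
    event_location: str


@dataclass
class AttributeData:
    # A per-user record aggregating a few of the categories shown in
    # FIG. 54 (profile data, device data, calendar data).
    profile_data: Dict[str, str] = field(default_factory=dict)
    device_data: Dict[str, str] = field(default_factory=dict)
    calendar_data: List[CalendarEvent] = field(default_factory=list)


# Attribute data keyed by user identifier, supporting the lookup of the
# attribute data for a particular user.
attribute_store: Dict[str, AttributeData] = {}


def access_attribute_data(user_id: str) -> AttributeData:
    return attribute_store.setdefault(user_id, AttributeData())


# Example: associate a calendar event with a hypothetical user record.
record = access_attribute_data("user-5402")
record.calendar_data.append(
    CalendarEvent("team meeting", "2016-10-14", "Sydney"))
```

A later lookup with the same user identifier returns the same record, so portions of the attribute data added by one source remain visible to other consumers of the data mesh.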
[00381] FIG. 55 is a block diagram 5500 of an example data structure for data associated with a device according to some example embodiments. In an example embodiment, the device data 5422 of FIG. 54 may include a device identifier, a device name, device resources data (e.g., files stored on the device such as browser cookies and media files), I/O component data, and so forth. In example embodiments, the device identifier comprises, for example, an Internet Protocol (IP) address, a Media Access Control (MAC) address, other unique identifiers, an International Mobile Station Equipment Identity (IMEI), or a Mobile Equipment Identifier (MEID). In an embodiment, the I/O component data includes standard device parameters 5502, position data 5504, location data 5506, motion data 5508, environmental data 5510, and biometric data 5512, among other data. FIG. 55 merely depicts example attribute data that may correspond to a particular device, and a variety of other data not shown in FIG. 55 may be included in the device data. In various embodiments, the standard device parameters 5502 include parameters that are standard across multiple devices included in the IoT. In some embodiments, standardized parameters and protocols facilitate access and utilization of the attribute data corresponding to such devices. For example, the attribute data available on an unknown device may be accessed and utilized without the need to discover or otherwise determine which parameters are available and which units of measure are associated with the parameters. Many other schemes may be employed to discover or otherwise determine available parameters accessible on a particular device.

[00382] Certain embodiments are described herein as including logic or a number of components, modules, or mechanisms.
Modules can constitute either software modules (e.g., code embodied on a machine-readable medium or in a transmission signal) or hardware modules. A “hardware module” is a tangible unit capable of performing certain operations and can be configured or arranged in a certain physical manner. In various example embodiments, one or more computer systems (e.g., a standalone computer system, a client computer system, or a server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) is configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
[00383] In some embodiments, a hardware module is implemented mechanically, electronically, or any suitable combination thereof. For example, a hardware module can include dedicated circuitry or logic that is permanently configured to perform certain operations. For example, a hardware module can be a special-purpose processor, such as a Field-Programmable Gate Array (FPGA) or an Application Specific Integrated Circuit (ASIC). A hardware module may also include programmable logic or circuitry that is temporarily configured by software to perform certain operations. For example, a hardware module can include software encompassed within a general-purpose processor or other programmable processor. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) can be driven by cost and time considerations.
[00384] Accordingly, the phrase “hardware module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. As used herein, “hardware-implemented module” refers to a hardware module. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where a hardware module comprises a general-purpose processor configured by software to become a special-purpose processor, the general-purpose processor may be configured as respectively different special-purpose processors (e.g., comprising different hardware modules) at different times. Software can accordingly configure a particular processor or processors, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.

[00385] Hardware modules can provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules can be regarded as being communicatively coupled. Where multiple hardware modules exist contemporaneously, communications can be achieved through signal transmission (e.g., over appropriate circuits and buses) between or among two or more of the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access.
For example, one hardware module performs an operation and stores the output of that operation in a memory device to which it is communicatively coupled. A further hardware module can then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules can also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).
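This store-then-retrieve pattern can be illustrated with a minimal sketch (the modules here are ordinary functions sharing a dictionary; in the embodiments described above they would be circuits or software-configured processors, and all names are hypothetical):

```python
# Two modules configured at different times communicate through a
# memory structure to which both have access: the first stores the
# output of its operation, and the second later retrieves and
# processes the stored output.

shared_memory: dict = {}


def producer_module() -> None:
    # Perform an operation and store its output in the shared
    # memory structure.
    shared_memory["measurement"] = 21.5


def consumer_module() -> float:
    # At a later time, retrieve the stored output and process it.
    value = shared_memory["measurement"]
    return value * 2


producer_module()
result = consumer_module()  # 43.0
```

The two functions never call each other directly; the shared memory structure is the only coupling between them, mirroring the memory-mediated communication described above.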
[00386] The various operations of example methods described herein can be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors constitute processor-implemented modules that operate to perform one or more operations or functions described herein. As used herein, “processor-implemented module” refers to a hardware module implemented using one or more processors.
[00387] Similarly, the methods described herein can be at least partially processor-implemented, with a particular processor or processors being an example of hardware. For example, at least some of the operations of a method can be performed by one or more processors or processor-implemented modules. Moreover, the one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors), with these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., an Application Program Interface (API)).
[00388] The performance of certain of the operations may be distributed among the processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processors or processor-implemented modules are located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the processors or processor-implemented modules are distributed across a number of geographic locations.
[00389] FIG. 56 is a block diagram 5600 illustrating an architecture of software 5602, which can be installed on any one or more of the devices described above. FIG. 56 is merely a non-limiting example of a software architecture, and it will be appreciated that many other architectures can be implemented to facilitate the functionality described herein. In various embodiments, the software 5602 is implemented by hardware such as a machine 5700 of FIG. 57 that includes processors 5710, memory 5730, and I/O components 5750. In this example architecture, the software 5602 can be conceptualized as a stack of layers where each layer may provide a particular functionality. For example, the software 5602 includes layers such as an operating system 5604, libraries 5606, frameworks 5608, and applications 5610. Operationally, the applications 5610 invoke application programming interface (API) calls 5612 through the software stack and receive messages 5614 in response to the API calls 5612, consistent with some embodiments.
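The layered flow described above — an application issuing an API call down the stack and receiving a message in response — can be sketched as follows (the layer classes, method names, and return values are illustrative assumptions, not the disclosed architecture of FIG. 56):

```python
# Each layer provides functionality to the layer above it; an API call
# travels down the stack and a message comes back up in response.

class OperatingSystem:
    def service(self, request: str) -> str:
        # The kernel/services layer ultimately handles the request.
        return f"kernel handled '{request}'"


class Libraries:
    def __init__(self, operating_system: OperatingSystem):
        self.operating_system = operating_system

    def call(self, request: str) -> str:
        return self.operating_system.service(request)


class Frameworks:
    def __init__(self, libraries: Libraries):
        self.libraries = libraries

    def api_call(self, request: str) -> dict:
        # The message returned to the application in response.
        return {"message": self.libraries.call(request)}


stack = Frameworks(Libraries(OperatingSystem()))
response = stack.api_call("render")
# response["message"] -> "kernel handled 'render'"
```

The application never touches the kernel directly; each layer delegates downward, which is the point of conceptualizing the software 5602 as a stack.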
[00390] In various implementations, the operating system 5604 manages hardware resources and provides common services. The operating system 5604 includes, for example, a kernel 5620, services 5622, and drivers 5624. The kernel 5620 acts as an abstraction layer between the hardware and the other software layers, consistent with some embodiments. For example, the kernel 5620 provides memory management, processor management (e.g., scheduling), component management, networking, and security settings, among other functionality. The services 5622 can provide other common services for the other software layers. The drivers 5624 are responsible for controlling or interfacing with the underlying hardware, according to some embodiments. For instance, the drivers 5624 can include display drivers, camera drivers, BLUETOOTH® drivers, flash memory drivers, serial communication drivers (e.g., Universal Serial Bus (USB) drivers), WI-FI® drivers, audio drivers, power management drivers, and so forth.
[00391] In some embodiments, the libraries 5606 provide a low-level common infrastructure utilized by the applications 5610. The libraries 5606 can include system libraries 5630 (e.g., C standard library) that can provide functions such as memory allocation functions, string manipulation functions, mathematic functions, and the like. In addition, the libraries 5606 can include API libraries 5632 such as media libraries (e.g., libraries to support presentation and manipulation of various media formats such as Moving Picture Experts Group-4 (MPEG4), Advanced Video Coding (H.264 or AVC), Moving Picture Experts Group Layer-3 (MP3), Advanced Audio Coding (AAC), Adaptive Multi-Rate (AMR) audio codec, Joint Photographic Experts Group (JPEG or JPG), or Portable Network Graphics (PNG)), graphics libraries (e.g., an OpenGL framework used to render in two dimensions (2D) and three dimensions (3D) in a graphic context on a display), database libraries (e.g., SQLite to provide various relational database functions), web libraries (e.g., WebKit to provide web browsing functionality), and the like. The libraries 5606 can also include a wide variety of other libraries 5634 to provide many other APIs to the applications 5610.
[00392] The frameworks 5608 provide a high-level common infrastructure that can be utilized by the applications 5610, according to some embodiments.
For example, the frameworks 5608 provide various graphic user interface (GUI) functions, high-level resource management, high-level location services, and so forth. The frameworks 5608 can provide a broad spectrum of other APIs that can be utilized by the applications 5610, some of which may be specific to a particular operating system or platform.
[00393] In an example embodiment, the applications 5610 include a home application 5650, a contacts application 5652, a browser application 5654, a book reader application 5656, a location application 5658, a media application 5660, a messaging application 5662, a game application 5664, and a broad assortment of other applications such as a third party application 5666.
According to some embodiments, the applications 5610 are programs that execute functions defined in the programs. Various programming languages can be employed to create one or more of the applications 5610, structured in a variety of manners, such as object-oriented programming languages (e.g.,
Objective-C, Java, or C++) or procedural programming languages (e.g., C or assembly language). In a specific example, the third party application 5666 (e.g., an application developed using the ANDROID™ or IOS™ software development kit (SDK) by an entity other than the vendor of the particular platform) may be mobile software running on a mobile operating system such as IOS™, ANDROID™, WINDOWS® PHONE, or another mobile operating system. In this example, the third party application 5666 can invoke the API calls 5612 provided by the operating system 5604 to facilitate functionality described herein.

[00394] FIG. 57 is a block diagram illustrating components of a machine 5700, according to some embodiments, able to read instructions from a machine-readable medium (e.g., a machine-readable storage medium) and perform any one or more of the methodologies discussed herein. Specifically, FIG. 57 shows a diagrammatic representation of the machine 5700 in the example form of a computer system, within which instructions 5716 (e.g., software, a program, an application, an applet, an app, or other executable code) for causing the machine 5700 to perform any one or more of the methodologies discussed herein can be executed. In alternative embodiments, the machine 5700 operates as a standalone device or can be coupled (e.g., networked) to other machines. In a networked deployment, the machine 5700 may operate in the capacity of a server machine or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
The machine 5700 can comprise, but not be limited to, a server computer, a client computer, a personal computer (PC), a tablet computer, a laptop computer, a netbook, a set-top box (STB), a personal digital assistant (PDA), an entertainment media system, a cellular telephone, a smart phone, a mobile device, a wearable device (e.g., a smart watch), a smart home device (e.g., a smart appliance), other smart devices, a web appliance, a network router, a network switch, a network bridge, or any machine capable of executing the instructions 5716, sequentially or otherwise, that specify actions to be taken by the machine 5700. Further, while only a single machine 5700 is illustrated, the term “machine” shall also be taken to include a collection of machines 5700 that individually or jointly execute the instructions 5716 to perform any one or more of the methodologies discussed herein.
[00395] In various embodiments, the machine 5700 comprises processors 5710, memory 5730, and I/O components 5750, which can be configured to communicate with each other via a bus 5702. In an example embodiment, the processors 5710 (e.g., a Central Processing Unit (CPU), a Reduced Instruction Set Computing (RISC) processor, a Complex Instruction Set Computing (CISC) processor, a Graphics Processing Unit (GPU), a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Radio-Frequency Integrated Circuit (RFIC), another processor, or any suitable combination thereof) include, for example, a processor 5712 and a processor 5714 that may execute the instructions 5716. The term “processor” is intended to include multi-core processors that may comprise two or more independent processors (also referred to as “cores”) that can execute instructions contemporaneously. Although FIG. 57 shows multiple processors, the machine 5700 may include a single processor with a single core, a single processor with multiple cores (e.g., a multi-core processor), multiple processors with a single core, multiple processors with multiple cores, or any combination thereof.
[00396] The memory 5730 comprises a main memory 5732, a static memory 5734, and a storage unit 5736 accessible to the processors 5710 via the bus 5702, according to some embodiments. The storage unit 5736 can include a machine-readable medium 5738 on which are stored the instructions 5716 embodying any one or more of the methodologies or functions described herein. The instructions 5716 can also reside, completely or at least partially, within the main memory 5732, within the static memory 5734, within at least one of the processors 5710 (e.g., within the processor’s cache memory), or any suitable combination thereof, during execution thereof by the machine 5700. Accordingly, in various embodiments, the main memory 5732, the static memory 5734, and the processors 5710 are considered machine-readable media 5738.

[00397] As used herein, the term “memory” refers to a machine-readable medium 5738 able to store data temporarily or permanently and may be taken to include, but not be limited to, random-access memory (RAM), read-only memory (ROM), buffer memory, flash memory, and cache memory. While the machine-readable medium 5738 is shown in an example embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) able to store the instructions 5716. The term “machine-readable medium” shall also be taken to include any medium, or combination of multiple media, that is capable of storing instructions (e.g., instructions 5716) for execution by a machine (e.g., machine 5700), such that the instructions, when executed by one or more processors of the machine 5700 (e.g., processors 5710), cause the machine 5700 to perform any one or more of the methodologies described herein.
Accordingly, a “machine-readable medium” refers to a single storage apparatus or device, as well as “cloud-based” storage systems or storage networks that include multiple storage apparatus or devices. The term “machine-readable medium” shall accordingly be taken to include, but not be limited to, one or more data repositories in the form of a solid-state memory (e.g., flash memory), an optical medium, a magnetic medium, other non-volatile memory (e.g., Erasable Programmable Read-Only Memory (EPROM)), or any suitable combination thereof. The term “machine-readable medium” specifically excludes non-statutory signals per se.
[00398] The I/O components 5750 include a wide variety of components to receive input, provide output, produce output, transmit information, exchange information, capture measurements, and so on. In general, it will be appreciated that the I/O components 5750 can include many other components that are not shown in FIG. 57. The I/O components 5750 are grouped according to functionality merely for simplifying the following discussion, and the grouping is in no way limiting. In various example embodiments, the I/O components 5750 include output components 5752 and input components 5754. The output components 5752 include visual components (e.g., a display such as a plasma display panel (PDP), a light emitting diode (LED) display, a liquid crystal display (LCD), a projector, or a cathode ray tube (CRT)), acoustic components (e.g., speakers), haptic components (e.g., a vibratory motor), other signal generators, and so forth. The input components 5754 include alphanumeric input components (e.g., a keyboard, a touch screen configured to receive alphanumeric input, a photo-optical keyboard, or other alphanumeric input components), point based input components (e.g., a mouse, a touchpad, a trackball, a joystick, a motion sensor, or other pointing instruments), tactile input components (e.g., a physical button, a touch screen that provides location and force of touches or touch gestures, or other tactile input components), audio input components (e.g., a microphone), and the like.
[00399] In some further example embodiments, the I/O components 5750 include biometric components 5756, motion components 5758, environmental components 5760, or position components 5762, among a wide array of other components. For example, the biometric components 5756 include components to detect expressions (e.g., hand expressions, facial expressions, vocal expressions, body gestures, or eye tracking), measure biosignals (e.g., blood pressure, heart rate, body temperature, perspiration, or brain waves), identify a person (e.g., voice identification, retinal identification, facial identification, fingerprint identification, or electroencephalogram based identification), and the like. The motion components 5758 include acceleration sensor components (e.g., an accelerometer), gravitation sensor components, rotation sensor components (e.g., a gyroscope), and so forth. The environmental components 5760 include, for example, illumination sensor components (e.g., a photometer), temperature sensor components (e.g., one or more thermometers that detect ambient temperature), humidity sensor components, pressure sensor components (e.g., a barometer), acoustic sensor components (e.g., one or more microphones that detect background noise), proximity sensor components (e.g., infrared sensors that detect nearby objects), gas sensor components (e.g., machine olfaction detection sensors, gas detection sensors to detect concentrations of hazardous gases for safety or to measure pollutants in the atmosphere), or other components that may provide indications, measurements, or signals corresponding to a surrounding physical environment.
The position components 5762 include location sensor components (e.g., a Global Positioning System (GPS) receiver component), altitude sensor components (e.g., altimeters or barometers that detect air pressure from which altitude may be derived), orientation sensor components (e.g., magnetometers), and the like.

[00400] Communication can be implemented using a wide variety of technologies. The I/O components 5750 may include communication components 5764 operable to couple the machine 5700 to a network 5780 or devices 5770 via a coupling 5782 and a coupling 5772, respectively. For example, the communication components 5764 include a network interface component or another suitable device to interface with the network 5780. In further examples, communication components 5764 include wired communication components, wireless communication components, cellular communication components, Near Field Communication (NFC) components, BLUETOOTH® components (e.g., BLUETOOTH® Low Energy), WI-FI® components, and other communication components to provide communication via other modalities. The devices 5770 may be another machine or any of a wide variety of peripheral devices (e.g., a peripheral device coupled via a Universal Serial Bus (USB)).
[00401] Moreover, in some embodiments, the communication components 5764 detect identifiers or include components operable to detect identifiers. For example, the communication components 5764 include Radio Frequency Identification (RFID) tag reader components, NFC smart tag detection components, optical reader components (e.g., an optical sensor to detect one-dimensional bar codes such as a Universal Product Code (UPC) bar code, multi-dimensional bar codes such as a Quick Response (QR) code, Aztec Code, Data
Matrix, Dataglyph, MaxiCode, PDF417, Ultra Code, Uniform Commercial Code Reduced Space Symbology (UCC RSS)-2D bar codes, and other optical codes), acoustic detection components (e.g., microphones to identify tagged audio signals), or any suitable combination thereof. In addition, a variety of information can be derived via the communication components 5764, such as location via Internet Protocol (IP) geo-location, location via WI-FI® signal triangulation, location via detecting a BLUETOOTH® or NFC beacon signal that may indicate a particular location, and so forth.
[00402] In various example embodiments, one or more portions of the network 5780 can be an ad hoc network, an intranet, an extranet, a virtual private network (VPN), a local area network (LAN), a wireless LAN (WLAN), a wide area network (WAN), a wireless WAN (WWAN), a metropolitan area network (MAN), the Internet, a portion of the Internet, a portion of the Public Switched Telephone Network (PSTN), a plain old telephone service (POTS) network, a cellular telephone network, a wireless network, a WI-FI® network, another type of network, or a combination of two or more such networks. For example, the network 5780 or a portion of the network 5780 may include a wireless or cellular network, and the coupling 5782 may be a Code Division Multiple Access (CDMA) connection, a Global System for Mobile communications (GSM) connection, or another type of cellular or wireless coupling. In this example, the coupling 5782 can implement any of a variety of types of data transfer technology, such as Single Carrier Radio Transmission Technology (1xRTT), Evolution-Data Optimized (EVDO) technology, General Packet Radio Service (GPRS) technology, Enhanced Data rates for GSM Evolution (EDGE) technology, third Generation Partnership Project (3GPP) including 3G, fourth generation wireless (4G) networks, Universal Mobile Telecommunications System (UMTS), High Speed Packet Access (HSPA), Worldwide Interoperability for Microwave Access (WiMAX), Long Term Evolution (LTE) standard, others defined by various standard-setting organizations, other long range protocols, or other data transfer technology.
[00403] In example embodiments, the instructions 5716 are transmitted or received over the network 5780 using a transmission medium via a network interface device (e.g., a network interface component included in the communication components 5764) and utilizing any one of a number of well-known transfer protocols (e.g., Hypertext Transfer Protocol (HTTP)). Similarly, in other example embodiments, the instructions 5716 are transmitted or received using a transmission medium via the coupling 5772 (e.g., a peer-to-peer coupling) to the devices 5770. The term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding, or carrying the instructions 5716 for execution by the machine 5700, and includes digital or analog communications signals or other intangible media to facilitate communication of such software.
[00404] Furthermore, the machine-readable medium 5738 is non-transitory (in other words, not having any transitory signals) in that it does not embody a propagating signal. However, labeling the machine-readable medium 5738 “non-transitory” should not be construed to mean that the medium is incapable of movement; the medium should be considered as being transportable from one physical location to another. Additionally, since the machine-readable medium 5738 is tangible, the medium may be considered to be a machine-readable device. A carrier medium comprises a tangible machine-readable medium storing machine-readable instructions and a transient medium, such as a signal, carrying machine-readable instructions.

[00405] Throughout this specification, plural instances may implement components, operations, or structures described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in example configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter herein.
[00406] Although an overview of the inventive subject matter has been described with reference to specific example embodiments, various modifications and changes may be made to these embodiments without departing from the broader scope of embodiments of the present disclosure. Such embodiments of the inventive subject matter may be referred to herein, individually or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single disclosure or inventive concept if more than one is, in fact, disclosed.

[00407] The embodiments illustrated herein are described in sufficient detail to enable those skilled in the art to practice the teachings disclosed. Other embodiments may be used and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. The Detailed Description, therefore, is not to be taken in a limiting sense, and the scope of various embodiments is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled.
[00408] As used herein, the term “or” may be construed in either an inclusive or exclusive sense. Moreover, plural instances may be provided for resources, operations, or structures described herein as a single instance.
Additionally, boundaries between various resources, operations, modules, engines, and data stores are somewhat arbitrary, and particular operations are illustrated in a context of specific illustrative configurations. Other allocations of functionality are envisioned and may fall within a scope of various 10 embodiments of the present disclosure. In general, structures and functionality presented as separate resources in the example configurations may be implemented as a combined structure or resource. Similarly, structures and functionality presented as a single resource may be implemented as separate resources. These and other variations, modifications, additions, and 15 impro vements fall within a scope of embodiments of the present disclosure as represented by the appended claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense. 114

Claims (20)

1. A system comprising:
an activity module to detect a device activity being performed in real time by a user device of a user;
an attribute module to access attribute data associated with the user from a plurality of attribute sources;
a preference module to infer a user preference, from the attribute data, indicating a preference of the user for performing on a secondary user device an ancillary activity corresponding to the device activity;
based on the inferred user preference, a device module to identify the secondary user device according to a device status of the secondary user device that indicates a device capability to perform the ancillary activity in real time; and
the activity module to generate the ancillary activity, to be performed in real time on the secondary user device, by analysis of at least one of the device activity, a device functionality of the secondary user device, and the user preference, and further to cause performance of the ancillary activity on the secondary user device in real time.
2. The system of claim 1, wherein: the device module is adapted to receive, from the secondary user device, sensor data that represents a real-time physical environment of the secondary user device and further to determine the device status of the secondary user device by analysis of the sensor data.
3. A method comprising:
detecting a device activity being performed in real time by a user device of a user;
accessing attribute data associated with the user from a plurality of attribute sources;
inferring a user preference, from the attribute data, indicating a preference of the user for performing on a secondary user device a complementary activity corresponding to the device activity;
based on the inferred user preference, identifying, using a hardware processor of a machine, the secondary user device according to a device status of the secondary user device indicating a device capability to perform the complementary activity in real time;
generating the complementary activity, to be performed in real time on the secondary user device, by analyzing at least one of the device activity, a device functionality of the secondary user device, and the user preference; and
transmitting, to the secondary user device, instructions to perform in real time the complementary activity.
4. The method of claim 3, further comprising: receiving, from the secondary user device, sensor data representing a real-time physical environment of the secondary user device; and determining the device status of the secondary user device by analyzing the sensor data.
5. The method of claim 4, further comprising: inferring a current user location based on location data received from the user device; extracting a current device location from the sensor data received from the secondary user device; comparing the current user location and the current device location to determine that the secondary user device is within a distance of the current user location; and based on the secondary user device being within the distance of the current user location, identifying the secondary user device as being capable of performing the complementary activity in real time.
6. The method of claim 4, further comprising: calculating an active metric from the sensor data, the active metric being indicative of the secondary user device being in use by the user; determining that the secondary user device is active based on the active metric exceeding a threshold value; and based on the secondary user device being active, identifying the secondary user device as being capable of performing the complementary activity in real time.
7. The method of claim 3, further comprising: augmenting the device activity being performed in real time by the user device based on the generated complementary activity.
8. The method of claim 3, wherein the secondary device comprises a wearable computing device.
9. The method of claim 3, wherein the complementary activity comprises a portion of the device activity.
10. The method of claim 3, wherein the complementary activity comprises a notification including notification content that corresponds to the device activity.
11. The method of claim 3, further comprising: wherein the device functionality includes a display size corresponding to the secondary user device; determining the display size corresponding to the secondary user device is below a threshold size; and generating the complementary activity to include abridged activity content according to the display size corresponding to the secondary user device.
12. The method of claim 3, further comprising: comparing the device functionality of the secondary user device with a device functionality of the user device to identify non-mutual functionality of the secondary user device that is not available on the user device; and generating the complementary activity to include an activity component that utilizes the non-mutual functionality of the secondary user device.
13. The method of claim 12, wherein the non-mutual functionality of the secondary user device includes functionality to capture data from a sensor not available on the user device.
14. The method of claim 3, further comprising: identifying a plurality of user devices based on respective device statuses of the plurality of user devices, each device status of the device statuses indicating capability to perform a particular complementary activity in real time; and determining a suitable user device from among the plurality of user devices based on the user preference, the suitable user device being capable of performing the complementary activity in real time.
15. The method of claim 3, further comprising: inferring user characteristics pertaining to the user from the attribute data; identifying similar users, from among a plurality of other users, that are similar to the user based on the inferred user characteristics and respective user characteristics of the plurality of other users; and inferring the user preference based on the user characteristics of the identified similar users and the inferred user characteristics.
16. A machine-readable medium storing instructions that, when executed by at least one processor of a machine, cause the machine to perform operations comprising:
detecting a device activity being performed in real time by a user device of a user;
accessing attribute data associated with the user from a plurality of attribute sources;
inferring a user setting, from the attribute data, indicating a preference of the user for performing on a secondary user device a companion activity corresponding to the device activity;
based on the inferred user setting, identifying, using a hardware processor of a machine, the secondary user device according to a device status of the secondary user device indicating a device capability to perform the companion activity in real time;
generating the companion activity, to be performed in real time on the secondary user device, by analyzing at least one of the device activity, a device functionality of the secondary user device, and the user setting; and
causing performance of the companion activity on the secondary user device in real time.
17. The machine-readable medium of claim 16, further comprising: receiving, from the secondary user device, sensor data representing a real-time physical environment of the secondary user device; and determining the device status of the secondary user device by analyzing the sensor data.
18. The machine-readable medium of claim 17, further comprising: inferring a current user location based on location data received from the user device; extracting a current device location from the sensor data received from the secondary user device; comparing the current user location and the current device location to determine that the secondary user device is within a distance of the current user location; and based on the secondary user device being within the distance of the current user location, identifying the secondary user device as being capable of performing the companion activity in real time.
19. The machine-readable medium of claim 17, further comprising: calculating an active metric from the sensor data, the active metric being indicative of the secondary user device being in use by the user; determining that the secondary user device is active based on the active metric exceeding a threshold value; and based on the secondary user device being active, identifying the secondary user device as being capable of performing the companion activity in real time.
20. A carrier medium carrying machine readable instructions that, when executed by at least one processor of a machine, cause the machine to carry out the method of any one of claims 3 to 15.
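The method of claim 3 can be illustrated as a short sketch. All names, data shapes, and heuristics below are hypothetical illustrations, not an implementation prescribed by the specification: the claims leave open how the preference is inferred, how device status is represented, and what form the complementary activity takes.

```python
# Hypothetical sketch of claim 3: detect activity on one device, infer a
# handoff preference from attribute data, pick a capable secondary device,
# and generate a reduced "complementary" activity for it.

def infer_preference(attribute_data):
    """Infer whether the user prefers a companion activity on a second device.

    Illustrative heuristic only: treat each attribute source as a vote.
    """
    votes = [d.get("prefers_handoff", False) for d in attribute_data]
    return sum(votes) > len(votes) / 2

def identify_secondary_device(devices, required_capability):
    """Pick a device whose status indicates it can perform the activity now."""
    for dev in devices:
        if dev["status"] == "active" and required_capability in dev["capabilities"]:
            return dev
    return None

def generate_complementary_activity(device_activity, secondary_device):
    """Derive a companion activity (here, a notification per claim 10)."""
    return {
        "type": "notification",
        "content": f"Continue '{device_activity}' here?",
        "target": secondary_device["id"],
    }

def handoff(device_activity, attribute_data, devices):
    """End-to-end flow of claim 3; returns None when no handoff applies."""
    if not infer_preference(attribute_data):
        return None
    dev = identify_secondary_device(devices, "notification")
    if dev is None:
        return None
    return generate_complementary_activity(device_activity, dev)

# Example: two of three attribute sources indicate a handoff preference,
# and a wearable is the only active device able to show a notification.
attrs = [{"prefers_handoff": True}, {"prefers_handoff": True}, {}]
devices = [
    {"id": "watch-1", "status": "active", "capabilities": ["notification"]},
    {"id": "tv-1", "status": "idle", "capabilities": ["video"]},
]
result = handoff("browse_listing", attrs, devices)
# result targets "watch-1" with a notification-type companion activity
```

In this sketch the "device module" and "preference module" of claim 1 collapse into plain functions; a real system would presumably keep them as separate components communicating over a network.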
AU2016101802A 2014-03-25 2016-10-14 Data mesh platform Ceased AU2016101802A4 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2016101802A AU2016101802A4 (en) 2014-03-25 2016-10-14 Data mesh platform

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US61/970,263 2014-03-25
US14/449,126 2014-07-31
US14/449,113 2014-07-31
US14/498,326 2014-09-26
PCT/US2015/022318 WO2015148559A1 (en) 2014-03-25 2015-03-24 Data mesh platform
AU2016101802A AU2016101802A4 (en) 2014-03-25 2016-10-14 Data mesh platform

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2015/022318 Division WO2015148559A1 (en) 2014-03-25 2015-03-24 Data mesh platform

Publications (1)

Publication Number Publication Date
AU2016101802A4 true AU2016101802A4 (en) 2016-11-17

Family

ID=57351197

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2016101802A Ceased AU2016101802A4 (en) 2014-03-25 2016-10-14 Data mesh platform

Country Status (1)

Country Link
AU (1) AU2016101802A4 (en)

Legal Events

Date Code Title Description
FGI Letters patent sealed or granted (innovation patent)
MK22 Patent ceased section 143a(d), or expired - non payment of renewal fee or expiry