CN112055865A - Systems, devices, and/or processes for behavioral and/or biological state processing - Google Patents



Publication number
CN112055865A
Authority
CN
China
Prior art keywords
content
behavioral
parameters
particular user
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201980029470.1A
Other languages
Chinese (zh)
Inventor
蕾妮·玛丽·圣阿曼特
加里·戴尔·卡彭特
克里斯托弗·丹尼尔·埃蒙斯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ARM Ltd
Original Assignee
ARM Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US15/922,687 external-priority patent/US11259729B2/en
Priority claimed from US15/922,671 external-priority patent/US20190287028A1/en
Priority claimed from US15/922,644 external-priority patent/US10373466B1/en
Application filed by ARM Ltd filed Critical ARM Ltd
Publication of CN112055865A publication Critical patent/CN112055865A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00 - ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/60 - ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to nutrition control, e.g. diets
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 - Commerce
    • G06Q30/02 - Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241 - Advertisements
    • G06Q30/0251 - Targeted advertisements
    • G06Q30/0269 - Targeted advertisements based on user profile or attribute
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00 - ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/10 - ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to drugs or medications, e.g. for ensuring correct administration to patients

Abstract

An apparatus, comprising: at least one processor for obtaining signals and/or states representing behavioral summary content of a particular user, the behavioral summary content comprising a plurality of parameters representing a current behavioral state or a biological state, or a combination thereof, of the particular user; and at least one memory for storing the signals and/or states representing the behavioral summary content; wherein the at least one processor is configured to generate one or more recommendations for the particular user based at least in part on the behavioral summary content, or based at least in part on one or more parameters representing external factors, or a combination thereof, the one or more recommendations being directed to an improvement in a future state of the particular user.

Description

Systems, devices, and/or processes for behavioral and/or biological state processing
Technical Field
The subject matter disclosed herein may relate to systems, devices, and/or processes for processing signals and/or states representing behaviors and/or biological states.
Background
Integrated circuit devices, such as processors, may exist in a wide variety of electronic device types. For example, the one or more processors may be used in mobile devices (e.g., cellular telephones), as well as in computers, digital cameras, tablet devices, personal digital assistants, wearable devices, and so forth. The mobile device and/or other computing device may include, for example, an integrated circuit device, such as a processor, to process signals and/or states representing a wide variety of content types for a variety of purposes. Signal and/or state processing techniques continue to evolve as a vast variety of content is accessible. However, sometimes processing signals and/or states representing diverse content may prove to be relatively resource demanding, which may present several challenges, including, for example, increased processing time, memory requirements, complexity, cost, and so forth.
Drawings
The claimed subject matter is particularly pointed out and distinctly claimed in the concluding portion of the specification. However, both as to organization and/or method of operation, together with objects, features, and/or advantages thereof, it may best be understood by reference to the following detailed description when read with the accompanying drawings in which:
FIG. 1 is an illustration of an example mobile device according to an embodiment.
FIG. 2 is an illustration of an example processor for processing signals and/or states representing behavior content in a computing device according to an embodiment.
FIG. 3 is an illustration of an example apparatus, system, and/or process for processing signals and/or states representing behavioral summary content, according to an embodiment.
FIG. 4 is an illustration of an example apparatus, system, and/or process for processing signals and/or states representing behavioral summary content, according to an embodiment.
FIG. 5 is an illustration of an example apparatus, system, and/or process for processing signals and/or states representing behavior summary content, according to an embodiment.
FIG. 6 is an illustration of an example apparatus, system, and/or process for processing signals and/or states representing behavior summary content, according to an embodiment.
FIG. 7 is a schematic block diagram depicting an example processor for processing signals and/or states representing behavioral summary content in a computing device, according to an embodiment.
FIG. 8 is an illustration of an example process for processing signals and/or states representing behavior summary content in accordance with an embodiment.
FIG. 9 is an illustration of an example process for processing signals and/or states representing behavior summary content in accordance with an embodiment.
FIG. 10 is an illustration of an example apparatus, system, and/or process for processing signals and/or states representing behavior content, according to an embodiment.
FIG. 11 is an illustration of an example process for processing signals and/or states representing behavioral summary content, according to an embodiment.
FIG. 12 is an illustration of an example process for processing signals and/or states representing behavior summary content, according to an embodiment.
FIG. 13 is an illustration of an example process for processing signals and/or states representing behavioral summary content, according to an embodiment.
FIG. 14 is an illustration of an example process for tracking signals and/or states representing behavioral summary content, according to an embodiment.
FIG. 15 is a schematic block diagram of an example computing device, according to an embodiment.
Detailed Description
In the following detailed description, reference is made to the accompanying drawings, which form a part hereof, and in which like numerals may designate corresponding and/or similar parts throughout. It will be appreciated that, for simplicity and/or clarity of illustration, for example, the figures have not necessarily been drawn to scale. For example, the dimensions of some aspects may be exaggerated relative to others. Additionally, it is to be understood that other embodiments may be utilized. In addition, structural and/or other changes may be made without departing from claimed subject matter. Reference throughout this specification to "claimed subject matter" means subject matter that is intended to be covered by one or more claims, or any portion thereof, and is not necessarily intended to refer to a complete claim set, a particular combination of claim sets (e.g., a method claim, an apparatus claim, etc.), or a particular claim. It should also be noted that directions and/or references, such as upper, lower, top, bottom, and the like, may be used to facilitate discussion of the figures and are not intended to limit application of the claimed subject matter. The following detailed description is, therefore, not to be taken to limit claimed subject matter and/or equivalents.
Reference throughout this specification to one implementation, an implementation, one embodiment, an embodiment, etc., means that a particular feature, structure, characteristic, etc., described in connection with the particular implementation and/or embodiment is included in at least one implementation and/or embodiment of claimed subject matter. Thus, appearances of such phrases, for example, in various places throughout this specification are not necessarily intended to refer to the same implementation and/or embodiment or to any one particular implementation and/or embodiment. Furthermore, it is to be understood that the particular features, structures, characteristics, etc., described are capable of being combined in various ways in one or more implementations and/or embodiments and are therefore within the intended scope of the claims. In general, of course, as has always been the case for the specification of a patent application, these and other issues may vary in a particular context of use. In other words, throughout the patent application, the particular context of description and/or use provides helpful guidance regarding reasonable inferences to be drawn; likewise, however, the term "in this context" in general, without further qualification, refers to the context of the present patent application.
As previously mentioned, integrated circuit devices, such as processors, may be present in a wide variety of electronic device types. For example, the one or more processors may be used in mobile devices (e.g., cellular telephones), as well as in computers, digital cameras, tablet devices, personal digital assistants, wearable devices, and so forth. The mobile device and/or other computing device may include, for example, an integrated circuit device, such as a processor, to process signals and/or states representing a wide variety of content types for a variety of purposes. Signal and/or state processing techniques continue to evolve as a vast variety of content is accessible. However, sometimes processing signals and/or states representing diverse content may prove to be relatively resource demanding, which may present several challenges, including, for example, increased processing time, memory requirements, complexity, cost, and so forth.
In one embodiment, content, such as behavioral summary content for a particular user, may be processed to generate recommendations, e.g., recommendations for a particular user. For example, content obtained at least in part via one or more sensors may be processed to generate behavioral summary content for a particular user, and/or the behavioral summary content may be utilized, at least in part, to generate recommendations for a particular user for a current and/or future behavior and/or biological state of the particular user. Additionally, in an embodiment, the behavioral summary content may be processed to detect "silent favorites" (silent like). "silent favorites" and the like refer to at least partially non-explicit indications of approval, appreciation, and the like, of content consumed by a particular user. For example, one or more sensors may detect one or more behavioral and/or biological aspects of a particular user (e.g., nodding of a head, dilation of a pupil indicating dopamine release, etc.), which may be understood to indicate approval, appreciation, etc., of content consumed by that particular user.
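As a non-limiting illustration of the "silent favorites" detection described above, several non-explicit cues might be combined into a single approval score. The cue names, weights, and threshold below are illustrative assumptions, not taken from the specification:

```python
def detect_silent_like(nod_rate, pupil_dilation_change, smile_score,
                       threshold=0.5):
    """Combine non-explicit behavioral cues into an approval score.

    nod_rate: head nods, normalized to [0, 1]
    pupil_dilation_change: relative pupil dilation vs. baseline, [0, 1]
    smile_score: facial-expression classifier output, [0, 1]
    Returns (silent_like_detected, score).
    """
    # Weighted combination of cues; the weights are illustrative only.
    score = 0.4 * nod_rate + 0.35 * pupil_dilation_change + 0.25 * smile_score
    return score >= threshold, score

liked, score = detect_silent_like(0.8, 0.7, 0.6)
```

In practice, such a score would more likely be produced by a learned model over many sensor parameters; a fixed weighted sum is used here only to keep the sketch self-contained.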
In another embodiment, behavioral summary content for a particular user may be processed to generate and/or select customized content for consumption by the particular user. In other embodiments, the behavioral summary content may be processed, at least in part, to track performance changes with respect to a particular user, to improve player health, and/or to provide collaborative mental health management, as discussed more fully below. Additional embodiments may include a division of responsibility in a technical assistance task between an operator and a computing device based at least in part on behavioral summary content of the operator. Of course, these are merely examples of how behavioral summary content may be processed and/or otherwise utilized and the scope of the subject matter is not limited in these respects.
In one embodiment, content, such as behavioral summary content of a particular user, may be tracked, where the behavioral summary content may include a plurality of parameters representing a current behavioral state or a biological state of the particular user, or a combination thereof. The tracked signals and/or states representing behavioral summary content may be stored in at least one memory. Additionally, an embodiment may include determining, at least in part via at least one processor executing one or more machine learning operations, one or more relationships between the tracked behavioral summary content and a bioavailability or balance of one or more particular substances within a body of a particular user, or a combination thereof. An embodiment may also include generating, at least in part via the at least one processor, one or more recommendations for supplements related to a particular one or more substances for a particular user, wherein the one or more recommendations may be directed to an improvement in a subsequent state of the particular user. Of course, these are merely examples of how behavioral summary content may be processed and/or otherwise utilized and the scope of the subject matter is not limited in these respects.
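One non-limiting way to sketch the relationship determination above is to correlate a tracked behavioral parameter with measured levels of a substance, and recommend supplementation when the relationship is strong and the current level is low. The parameter names, thresholds, and the use of a simple Pearson correlation are all illustrative assumptions:

```python
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    if sx == 0.0 or sy == 0.0:
        return 0.0  # a constant series carries no correlation information
    return cov / (sx * sy)

def recommend_supplement(alertness_history, substance_history,
                         current_level, low_level=0.4, min_corr=0.6):
    """Recommend a supplement when the tracked behavioral parameter
    correlates with the substance level and that level is currently low."""
    r = pearson(alertness_history, substance_history)
    if r >= min_corr and current_level < low_level:
        return "recommend supplement"
    return "no recommendation"
```

A machine learning operation as described in the text could learn far richer relationships; the correlation stands in here only as the simplest instance of such a relationship.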
In one embodiment, content obtained from one or more sensors may be processed by specific hardware circuitry to generate behavioral summary content representing the physical, psychological and/or emotional state of a particular operator. For example, a processor, such as a behavioral processing unit, may be dedicated, at least in part, to processing sensor content to generate behavioral summary content representing physical, psychological, and/or emotional states of a particular operator. The processor, such as a behavior processing unit, may in one embodiment include specific circuitry that relatively more efficiently processes sensor content for performing specific operations to generate behavior summary content for a specific operator. For example, in an embodiment, a processor, such as a behavior processing unit, may include machine learning acceleration circuitry that is directed to performing certain operations that may act relatively more efficiently on a set of parameters, such as a multidimensional set of parameters, that may be utilized in various machine learning techniques (e.g., neural networks), as discussed more fully below. In an embodiment, a processor, such as a behavioral processing unit, may, for example, include a coprocessor that may operate in cooperation with a general-purpose processor, although claimed subject matter is not limited in this respect.
The terms "operator" and/or "user" refer to a human individual, and/or may be utilized interchangeably herein. In one embodiment, an operator and/or user may operate the machine, although the scope of the subject matter is not limited in these respects. Additionally, as utilized herein, a "machine" refers to an article of manufacture, such as a mechanically, electrically, and/or electronically operated device for performing a task. In some embodiments, the operations of the machine may be performed by a combination of operators and/or computing devices, and/or the operations of the machine may be based at least in part on a behavioral profile of at least one particular operator, as explained more fully herein.
As utilized herein, "behavioral summary" and the like refer to one or more parameters representing the current behavioral state or biological state, or a combination thereof, of at least one particular operator. Thus, for example, a "behavioral summary" is not limited to only the current behavioral state of a particular operator, but may also include parameters representing one or more biological aspects of a particular operator, as explained more fully herein. In addition, although some embodiments herein may be described in connection with "an" operator and/or "a" particular operator, the subject matter is not limited to a single operator. For example, at least some embodiments may include behavioral summary content for one or more operators, although, again, the scope of the claimed subject matter is not limited in these respects.
Additionally, as utilized herein, the terms "current" and the like refer to being substantially and/or approximately current at a point in time. For example, a "current" behavior and/or biological state of a particular operator refers to the behavior and/or biological state of that particular operator that is derived, at least in part, from relatively recent sensor content. For example, in an embodiment, the operator-specific behavioral summary content may represent the operator-specific behavioral and/or biological state derived at least in part from sensor content obtained from one or more sensors within a fraction of a second of generation.
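The notion of "current" above might be sketched as a recency filter over timestamped sensor samples; the specific window size is an illustrative assumption standing in for "a fraction of a second":

```python
import time

CURRENT_WINDOW_S = 0.25  # illustrative stand-in for "a fraction of a second"

def current_samples(samples, now=None):
    """Keep only (timestamp, value) samples recent enough to count as
    'current' sensor content for deriving a behavioral/biological state."""
    now = time.monotonic() if now is None else now
    return [(t, v) for t, v in samples if now - t <= CURRENT_WINDOW_S]
```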
Fig. 1 is an illustration of an embodiment 100 of an example mobile device. In an embodiment, a mobile device, such as 100, may include one or more processors, such as processor 110 and/or a Behavioral Processing Unit (BPU) 200, and/or may include one or more communication interfaces, such as communication interface 120. In an embodiment, one or more communication interfaces, such as communication interface 120, may enable wireless communication between a mobile device, such as mobile device 100, and one or more other computing devices. In an embodiment, wireless communication may occur substantially in accordance with any of a wide variety of communication protocols, such as those mentioned herein.
In one embodiment, a mobile device, such as mobile device 100, may include memory, such as memory 130. In one embodiment, memory 130 may comprise, for example, non-volatile memory. Additionally, in one embodiment, a memory, such as memory 130, may have executable instructions stored therein, such as with respect to one or more operating systems, communication protocols, and/or applications. The memory, such as 130, may also store specific instructions, such as BPU code 132, that may be executed by the behavior processing unit (e.g., 200) to generate, at least in part, behavior summary content. Additionally, in an embodiment, a mobile device, such as mobile device 100, for example, may include a display, such as display 140, one or more sensors, such as one or more sensors 150, one or more cameras, such as one or more cameras 160, and/or one or more microphones, such as microphone 170.
Although the BPU 200 is described as executing instructions, such as BPU code 132, other embodiments of a behavioral processing unit may not fetch and execute code. In an embodiment, the behavior processing unit may include dedicated and/or specialized circuitry for processing sensor content and/or for generating behavior summary content, as described more fully below.
As utilized herein, "sensor" and the like refer to such devices and/or components: the device and/or component may be responsive to a physical stimulus, such as heat, light, acoustic pressure, magnetism, a particular motion, etc., and/or may generate one or more signals and/or states in response to a physical stimulus. Thus, while the camera 160 and/or microphone 170 are depicted in FIG. 1 as being separate from the sensor 150, in an embodiment the term "sensor" or the like may include a microphone and/or a camera. Example sensors may include, but are not limited to, one or more accelerometers, gyroscopes, thermometers, magnetometers, barometers, light sensors, proximity sensors, heart rate monitors, sweat sensors, hydration sensors, respiration sensors, and the like, and/or any combination of these. In an embodiment, one or more sensors may monitor one or more aspects of a biological and/or behavioral state of a particular operator.
In an embodiment, to generate behavioral summary content for a particular operator, a computing device, e.g., mobile device 100, may obtain signals and/or states representing the content from one or more sensors (such as one or more of sensor 150, camera 160, and/or microphone 170, or any combination of these). Additionally, in an embodiment, a processor, such as the behavior processing unit 200, may process sensor content, such as content from one or more of the sensors 150, the camera 160, and/or the microphone 170, or any combination of these, to generate behavior summary content for a particular operator. In one embodiment, a processor, such as behavior processing unit 200, may include behavior content processing circuitry. For example, a processor, such as behavior processing unit 200, may include machine learning acceleration circuitry in one embodiment.
For example, a processor, such as behavior processing unit 200, may include one or more arithmetic units directed to operations involving a relatively large set of parameters, such as a set of parameters that may be used in machine learning (such as neural networks). In an embodiment, machine learning acceleration circuitry, such as an arithmetic unit for operations involving a neural network parameter set and/or other relatively large parameter sets, may be utilized to process sensor content to generate behavioral summary content for a particular operator. In an embodiment, the behavior profile content may be utilized to affect the operation of a particular machine, generate recommendations for the behavior and/or biological state of a particular operator for a particular operator, generate customized content for consumption by a particular operator, and so on, to name a few non-limiting examples.
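As a non-limiting illustration of the above, a conditioned feature vector derived from sensor content might be mapped to probabilities over behavioral states by a small fixed-weight network of the kind machine learning acceleration circuitry could evaluate. The state labels, feature choices, and weights below are illustrative assumptions:

```python
import math

STATES = ("alert", "fatigued", "stressed")

# One weight row per state; one column per conditioned feature
# (heart_rate_norm, blink_rate_norm, motion_norm). Values are illustrative.
WEIGHTS = [
    [ 1.0, -1.0,  0.5],   # alert
    [-1.0,  1.5, -0.5],   # fatigued
    [ 1.5,  0.5,  1.0],   # stressed
]

def softmax(zs):
    """Convert raw scores into a probability distribution."""
    m = max(zs)
    exps = [math.exp(z - m) for z in zs]
    s = sum(exps)
    return [e / s for e in exps]

def behavioral_summary(features):
    """Map a conditioned feature vector to state probabilities."""
    logits = [sum(w * f for w, f in zip(row, features)) for row in WEIGHTS]
    return dict(zip(STATES, softmax(logits)))

summary = behavioral_summary([0.9, 0.2, 0.3])
```

A behavior processing unit as described would evaluate far larger parameter sets than this single layer; the sketch only shows the shape of the computation (parameters in, state probabilities out).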
In one embodiment, a general purpose processor, such as processor 110, and a behavioral processing unit, such as 200, may comprise separate integrated circuit devices. In other embodiments, a general purpose processor, such as processor 110, and a behavioral processing unit, such as 200, may be formed on the same integrated circuit die and/or integrated circuit package. Additionally, in one embodiment, a processor, such as behavior processing unit 200, may include a coprocessor that may operate in cooperation with a general-purpose processor, such as 110. For example, a processor, such as 110, may execute code that includes an operating system, applications, and the like. Additionally, in an embodiment, a behavior processing unit, such as 200, may perform operations directed to generating behavior summary content for one or more operators. For example, a behavior processing unit, such as 200, may include circuitry for executing a particular instruction and/or set of instructions, such as code 132, relatively more efficiently for acting on a relatively larger set of parameters, such as may be utilized in connection with a particular machine learning technique (e.g., including a neural network).
In an embodiment, behavior summary content, such as may be generated by a behavior processing unit (e.g., 200), may be communicated between the behavior processing unit (e.g., 200) and any of a variety of devices, systems, and/or processes. For example, behavior summary content generated by behavior processing unit 200 may be stored in a memory (e.g., 130) and/or may be pushed and/or otherwise provided to processor 110 and/or other devices and/or systems. In an embodiment, the behavioral summary content may be communicated between a computing device (such as mobile device 100) and one or more other network devices (such as one or more other computing devices) via one or more wired and/or wireless communication networks. Of course, the scope of the subject matter is not limited in these respects.
In an embodiment, the behavioral summary content may include a particular specified set of parameters representing a behavior and/or biological state of a particular operator, which may be utilized, at least in part, by any of a wide variety of devices, systems, and/or processes for any of a wide variety of applications and/or purposes. In an embodiment, by generating a specified set of parameters that includes behavioral summary content, other devices, systems, applications, and/or processes, for example, may be relieved of responsibility for generating behavioral summary content and may instead focus on their specific areas of specialty and/or expertise. For example, an application developer may design an application to utilize one or more parameters of behavior summary content for one or more particular operators without incurring the cost (time, money, resources, etc.) of developing circuitry, code, etc. for collecting and/or processing sensor content and/or for generating behavior summary content.
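The "specified set of parameters" idea above might be sketched as a fixed record type that applications consume without ever touching raw sensor content. The field names and ranges are illustrative assumptions, not the specification's format:

```python
from dataclasses import dataclass, asdict

@dataclass(frozen=True)
class BehavioralSummary:
    """A specified parameter set an application can rely on."""
    alertness: float   # 0.0-1.0
    stress: float      # 0.0-1.0
    fatigue: float     # 0.0-1.0
    approval: float    # "silent like" strength, 0.0-1.0

# A behavior processing unit would populate this; the application
# only reads the parameters and never handles sensor content.
summary = BehavioralSummary(alertness=0.8, stress=0.2, fatigue=0.1,
                            approval=0.6)
params = asdict(summary)
```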
Although fig. 1 depicts an embodiment of a mobile device, such as mobile device 100, other embodiments may include other types of computing devices. Example types of computing devices may include, for example, any of a wide variety of digital electronic devices, including, but not limited to, desktop and/or notebook computers, high definition televisions, digital video players and/or recorders, gaming consoles, satellite television receivers, cellular telephones, tablet devices, wearable devices, personal digital assistants, mobile audio and/or video playback and/or recording devices, or any combination of the foregoing.
Fig. 2 is an illustration of an embodiment 200 of a processor (such as a behavior processing unit) processing signals and/or states representing behavior content in a computing device. In one embodiment, to generate behavior profile content, such as behavior profile content 240, for a particular user, a processor, such as behavior processing unit 200, may obtain signals and/or states representing the content from one or more sensors, such as one or more of sensors 230. Additionally, in an embodiment, a processor, such as behavior processing unit 200, may process sensor content, such as content from one or more of sensors 230, to generate behavior summary content, such as behavior summary content 240, for a particular user. In one embodiment, a processor, such as behavior processing unit 200, may include behavior content processing circuitry. For example, a processor, such as behavior processing unit 200, may include, in an embodiment, sensor parameter processing circuitry, such as circuitry 210, and/or may include machine learning acceleration circuitry, such as circuitry 220.
In an embodiment, a processor, such as behavior processing unit 200, may provide circuitry to generate behavior summary content, such as behavior summary content 240, at least in part for a particular user to be utilized with any of a wide variety of possible applications, such as the example applications described herein. For example, in an embodiment, the behavior summary content may be provided to a decision-making device, process, and/or system, such as decision-making device, system, and/or process 250. In an embodiment, a processor, such as behavior processing unit 200, for example, may process signals and/or states associated with a particular user and/or a particular environment obtained from one or more behavior, biological, and/or environmental sensors, or a combination thereof, relatively more efficiently. In an embodiment, a processor, such as behavior processing unit 200, may compute probabilities for "hidden" (e.g., emotional) states of a particular operator, at least in part, by generating behavior summary content (such as behavior summary content 240) that may be utilized by one or more devices, systems, and/or processes (such as decision-making device, system, and/or process 250) for any of a variety of possible purposes. For example, as described above, behavioral summary content may be utilized to generate recommendations for a particular user's behavioral and/or biological state, detect silent favorites, generate customized content for consumption by a particular user, track performance changes with respect to a particular user, improve player health, and/or provide collaborative mental health management, to name a few non-limiting examples. Other examples may include determining a division of responsibility between an operator and a machine in the case of shared responsibility (e.g., technology-assisted driving, flight, drone operation, etc.). For example, if the operator is determined to be angry, the weapon may not be allowed to fire. 
In addition, for example, if the driver is determined to be angry, the automobile may not be allowed to accelerate quickly, and/or if the driver is determined to be tired, the automobile may increase its degree of control over handling. Of course, the scope of the subject matter is not limited in these respects.
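The gating of machine actions on operator state described in these examples might be sketched as follows; the state labels and thresholds are illustrative assumptions:

```python
def allowed_actions(summary, anger_limit=0.7, fatigue_limit=0.6):
    """Decide a division of responsibility from behavioral summary content.

    summary: dict mapping state name -> probability, as a behavior
    processing unit might provide to a decision-making process.
    """
    actions = {"rapid_acceleration": True, "handling_assist": False}
    if summary.get("angry", 0.0) >= anger_limit:
        # An angry driver may not accelerate quickly.
        actions["rapid_acceleration"] = False
    if summary.get("fatigued", 0.0) >= fatigue_limit:
        # A tired driver gets more machine control over handling.
        actions["handling_assist"] = True
    return actions
```

A real decision-making system would weigh many more factors (environment, task, shared-responsibility policy); the sketch shows only the basic pattern of state probabilities gating machine behavior.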
As described above, in an embodiment, behavioral summary content of a particular user may be tracked, wherein the behavioral summary content may include a plurality of parameters representing a current behavioral state or biological state, or a combination thereof, of the particular user. Additionally, in an embodiment, one or more relationships between the tracked behavioral summary content and the bioavailability or balance of one or more particular substances within the body of a particular user, or a combination thereof, may be determined, for example. An embodiment may also include generating one or more recommendations for supplements related to a particular one or more substances for a particular user, where the one or more recommendations may be directed to an improvement in a subsequent state of the particular user. Of course, the scope of the claimed subject matter is not limited in these respects.
In an embodiment, machine-based decision making based at least in part on behavioral summary content, such as decision making apparatus, system, and/or process 250, may include, for example, supplemental dynamic recommendations for a particular user related to a particular one or more substances targeted to improve the current and/or subsequent state of the particular user. In an embodiment, the sensor content may include content from any of a wide variety of possible sources and/or may be variable content. In an embodiment, a processor, such as a behavior processing unit, may include machine learning (e.g., neural networks, etc.), at least in part, to adapt to the presence and/or absence of one or more particular sensors while providing, for example, probabilities represented, at least in part, by behavior summary content.
In an embodiment, machine-based decision making based at least in part on behavioral summary content, such as by decision-making devices, systems, and/or processes 250, may include dynamic content creation and/or may include physical control of devices that may affect the safety of a particular operator and/or user and/or individual.
In an embodiment, machine-based decision making, such as may be performed by the decision making device, system, and/or process 250, may depend, at least in part, on the operator's current status and/or the operator's ability to respond relatively quickly to changes in the operator's status, for example. A wide variety of possible sensor types may provide content indicative of various aspects of a particular operator's biological and/or behavioral state and/or indicative of one or more environmental factors and/or other external factors. In an embodiment, a processor, such as behavior processing unit 200, may include a sensor parameter processing unit, such as sensor parameter processing unit 210. In an embodiment, a sensor parameter processing unit, such as sensor parameter processing unit 210, may obtain signals and/or conditions from one or more sensors (such as sensor 230), and/or may process signals and/or conditions from one or more sensors to combine, coordinate, normalize, and/or otherwise condition signals and/or conditions from one or more sensors.
For example, a sensor parameter processing unit, such as sensor parameter processing unit 210, may prepare the sensor content for further processing (e.g., via a machine learning operation). In an embodiment, a machine learning acceleration circuit, such as machine learning acceleration circuit 220, may process, at least in part, sensor content to infer a substantially current biological and/or behavioral state of a particular operator. For example, a camera sensor or the like may provide one or more signals and/or states to a sensor parameter processing unit, such as sensor parameter processing unit 210. The sensor parameter processing unit 210 may, for example, generate one or more parameters representing pupil dilation, focus, blink duration, and/or blink rate, or any combination of these.
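As an illustrative sketch only (the specification does not define an implementation), the conditioning step described above, turning raw camera-derived eye measurements into normalized parameters such as pupil dilation, blink duration, and blink rate, might look like the following. All function names, units, and physiological ranges here are assumptions:

```python
# Hypothetical sketch of a sensor parameter processing step: raw eye-tracking
# readings are normalized into a common [0, 1] range before being handed to a
# machine learning stage. Ranges are assumed, not taken from the specification.

def condition_eye_metrics(pupil_mm, blink_duration_ms, blinks_per_min):
    """Normalize raw eye-tracking readings into unitless parameters."""
    def clamp01(x):
        return max(0.0, min(1.0, x))

    return {
        "pupil_dilation": clamp01((pupil_mm - 2.0) / (8.0 - 2.0)),  # assume ~2-8 mm
        "blink_duration": clamp01(blink_duration_ms / 400.0),       # assume ~0-400 ms
        "blink_rate": clamp01(blinks_per_min / 30.0),               # assume ~0-30/min
    }

params = condition_eye_metrics(pupil_mm=5.0, blink_duration_ms=150.0, blinks_per_min=12.0)
print(params["pupil_dilation"])  # 0.5
```

Normalizing each reading into a common range is one way a unit such as 210 could "combine, coordinate, normalize, and/or otherwise condition" heterogeneous sensor signals before machine learning.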
In an embodiment, a machine learning acceleration circuit, such as machine learning acceleration circuit 220, may generate, at least in part, a representation of a particular operator's biological and/or behavioral state, such as behavioral summary content 240. In an embodiment, behavior summary content, such as behavior summary content 240, may include a specified set of parameters that may be utilized by any of a variety of machine-based (e.g., computing device-based) decision-making systems, devices, and/or processes, such as decision-making device, system, and/or process 250. In an embodiment, the behavioral summary content may include a plurality of parameters representing focus, excitement, anger, fear, fatigue, dehydration, or concentration/distraction, or any combination of these, related to a particular user. As additional non-limiting examples, the behavioral summary content may also include parameters representing anticipation, quiet affection, remorse/acknowledgment of mistakes, hunger, haste/accuracy, empathy, confusion, and/or social engagement levels, and/or any combination of these.
In an embodiment, the behavioral summary content may include a specified set of parameters, such as at least a subset of those described above. In an embodiment, a processor, such as behavior processing unit 200, may generate a set of parameters representing behavior summary content specified for a manner of providing content regarding a particular user's behavior and/or biological state to any of a wide variety of devices, systems, and/or processes for any of a wide variety of purposes and/or applications. In addition, such a set of specified behavioral summary content parameters may be utilized simultaneously by any number of devices, systems, and/or processes.
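Purely as an illustration of what a "specified set" of parameters could look like, the following sketch lays the state scores named above into a fixed-order record that multiple consuming systems could read simultaneously. The field names, score range, and ordering are all assumptions, not defined by the specification:

```python
# Illustrative layout of a specified behavioral summary parameter set.
# Fields echo states listed in the text; ordering and [0, 1] scoring are assumed.

from dataclasses import dataclass, astuple

@dataclass(frozen=True)
class BehaviorSummary:
    # Each field is a score in [0, 1] for the corresponding user state.
    focus: float = 0.0
    excitement: float = 0.0
    anger: float = 0.0
    fear: float = 0.0
    fatigue: float = 0.0
    dehydration: float = 0.0
    distraction: float = 0.0

    def as_vector(self):
        """Fixed-order tuple, so any consumer can index the same parameter."""
        return astuple(self)

summary = BehaviorSummary(focus=0.8, fatigue=0.3)
print(summary.as_vector())  # (0.8, 0.0, 0.0, 0.0, 0.3, 0.0, 0.0)
```

Because the set is specified once, any number of downstream decision-making systems can consume the same vector without each needing to generate it, consistent with the shared-use point above.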
In one embodiment, a processor, such as behavior processing unit 200, may repeatedly obtain sensor content and/or may repeatedly generate behavior summary content for a particular user. For example, sensor content may be collected and/or otherwise obtained at regular and/or specified intervals, and/or behavior summary content may be generated at regular and/or specified intervals. In an embodiment, one or more devices, systems, and/or processes may track behavioral summary content, e.g., over a period of time, to, e.g., detect changes in the behavioral summary content.
In an embodiment, a processor, such as behavior processing unit 200, may advantageously be utilized, at least in part, by dedicated computing resources, for example, to process sensor content, and/or to generate behavior summary content, such as 240, for a particular user. Additionally, by generating a specified set of parameters that includes behavioral summary content (such as 240), the system, apparatus, and/or process can be relieved of responsibility for generating the behavioral summary content and can, for example, focus on specific areas of expertise and/or profession. Additionally, development costs may be reduced for systems, devices, and/or processes due, at least in part, to the availability of a specified set of behavior summary content parameters from a processor (such as behavior processing unit 200).
In an embodiment, a processor, such as behavior processing unit 200, may incorporate sensor content (e.g., behavior and/or biosensor content, or a combination thereof) with representations of previous relationships (e.g., known and/or determined associations between measurable and/or human states) in substantially real-time. Additionally, in an embodiment, a processor, such as behavior processing unit 200, may utilize machine learning techniques (e.g., neural networks, etc.) to map incoming sensor content that represents one or more aspects of the operator's biological and/or behavioral state. In an embodiment, a processor, such as behavior processing unit 200, may include support for relatively more efficient coordination and/or processing of content obtained from a variety of possible sources (e.g., a combination of content from biometric and/or behavior sensors and/or content representing other factors) to generate a specified set of parameters, such as behavior summary content 240. Additionally, in an embodiment, one or more memory devices may be provided to store operator-related content and/or operator-unrelated content to enable relatively more rapid identification of state changes in a particular user.
In an embodiment, machine learning operations, such as may be performed by a processor (such as behavior processing unit 200), may store user-specific content in one or more memory devices, for example, and/or may also store user-generic content (e.g., determined and/or substantially known relationships between sensor content and/or user states). In an embodiment, user-specific content and/or user-generic content may be processed, for example via machine learning operations, to generate one or more output state vectors, such as behavior summary content 240.
In an embodiment, a processor, such as behavior processing unit 200, may generate parameters representing a substantially current behavioral and/or biological state of a particular user through a combination of content from one or more sensors, knowledge of determined and/or substantially known associations and/or relationships, and/or machine learning. In an embodiment, behavioral summary content, such as 240, may include one or more parameters indicating, for example, a score for a user state (such as anger, excitement, fatigue, distraction, and so forth). In addition, utilization of a relatively greater amount of content from the sensors may allow for improved determination of user states and/or may allow for better differentiation between user states. For example, in one embodiment, both fear and excitement may increase heart rate, but fear, unlike excitement, may cause tensing of the user's shoulders. A behavior processing unit, such as 200, may in an embodiment distinguish between fear and excitement of the user based at least in part on content obtained from one or more cameras. Of course, the scope of claimed subject matter is not limited in these respects.
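The fear-versus-excitement example above can be sketched minimally: elevated heart rate alone is ambiguous, but a camera-derived posture cue can separate the two states. The thresholds, feature names, and labels below are assumptions for illustration only:

```python
# Minimal sketch of disambiguating two high-arousal states using an extra
# sensor parameter. Thresholds are invented for illustration.

def classify_arousal(heart_rate_bpm, shoulder_tension):
    """Return a coarse state label from two conditioned sensor parameters.

    shoulder_tension: assumed camera-derived score in [0, 1].
    """
    if heart_rate_bpm < 90:
        return "calm"
    # High arousal: use the posture cue to disambiguate fear from excitement.
    return "fear" if shoulder_tension > 0.6 else "excitement"

print(classify_arousal(110, shoulder_tension=0.8))  # fear
print(classify_arousal(110, shoulder_tension=0.2))  # excitement
```

In practice the specification contemplates machine learning rather than fixed thresholds; the point of the sketch is only that an additional sensor channel makes otherwise-identical states separable.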
In an embodiment, external factors may play a role in generating behavioral summary content and/or decision making. For example, one or more parameters indicative of one or more external factors, such as external factor parameters 260, may be obtained by a behavior summary unit (such as BPU 200) and/or may be obtained by a decision-making device, system, and/or process (such as decision-making system 250). The parameters indicative of the external factors may include, for example, parameters indicative of location, time period, presence and/or identity of the external individual, and/or general mood. Additionally, in one embodiment, parameters, such as parameter 270, may be obtained from a user. For example, a user may provide input parameters representing instances of supplementation and/or consumption of a particular substance (e.g., nutritional supplement, medicine, food, beverage, etc.). For example, input parameters 270 may include parameters representing substance identity, quantity, time and/or date of replenishment, and so forth.
Fig. 3 is an illustration of an embodiment 300 of an example apparatus, system, and/or process for processing behavioral summary content, such as behavioral summary content 240, for any of a wide variety of possible applications and/or purposes. As described above, a processor, such as behavior processing unit 200, may generate behavior summary content, such as behavior summary content 240, based at least in part on sensor content, such as content from one or more of sensors 230. In an embodiment, behavioral summary content, such as behavioral summary content 240, may be utilized, at least in part, to evolve and/or adjust virtual reality and/or games and/or other content that a particular user/player (such as user 340) is consuming and/or is about to consume based, at least in part, on a current biological and/or behavioral state of the user/player (such as user 340). In this manner, for example, content, such as video game content, may be customized for a particular user/player (such as user 340) to induce a desired state in the particular user relatively more efficiently.
Additionally, in an embodiment, behavioral summary content, such as behavioral summary content 310, may be utilized, for example, at least in part, to track one or more relationships between the behavioral summary content and the bioavailability or balance of one or more particular substances within the body of a particular user, or a combination thereof. An embodiment may also include generating one or more recommendations for supplements related to one or more particular substances for a particular user, where the one or more recommendations may be directed to improvements in a current and/or subsequent state of the particular user. For example, in an embodiment, content from one or more cameras directed at the user's eyes, microphones, skin sensors measuring sweat and/or temperature, one or more pressure sensors for the user's fingers, heart rate monitors, hydration sensors, and/or respiration monitors, or any combination of these, may be utilized as part of the supplemental recommendation system.
In an embodiment, content from one or more cameras directed at the user's eyes, microphones, skin sensors measuring sweat and/or temperature, one or more pressure sensors for the user's fingers, heart rate monitors, hydration sensors, and/or respiration monitors, or any combination of these, may be utilized, for example, as part of an immersive gaming system or a supplemental recommendation system. In an embodiment, a processor, such as behavior processing unit 200, may obtain content from one or more sensors (such as the one or more sensors mentioned above) and/or may generate a set of parameters, such as behavior summary content 240, that represent a substantially current biological and/or behavioral state of a particular user/player. Sensor content obtained via the camera, for example, may be processed to generate behavioral summary content, such as behavioral summary content 240, representative of pupil dilation, focus, blink duration, and/or blink rate, to name a few non-limiting examples. Additionally, digital audio content, such as that available via a microphone, for example, may be processed to generate behavioral summary content, such as behavioral summary content 240, representing, for example, volume, tone, and/or emotion.
In an embodiment, determined and/or substantially known relationships, such as relationship parameters 314, may include relationships between behavioral summary content and/or user states and/or may include scientifically determined relationships. In an embodiment, relationships between content that may be gleaned from sensor output and the behavioral and/or biological state of the user may be determined, at least in part, via one or more scientific publications (such as scientific publication parameters 315). For example, pupil dilation may be linked to dopamine release according to one or more scientific publications, such as from the National Institutes of Health (NIH). In addition, for example, dopamine release may be linked to an expected and/or relatively heightened emotional response to stimuli according to one or more scientific studies (e.g., as may be published in the journal Nature). In an embodiment, content representing relationships gleaned from scientific publications may be stored in at least one memory (e.g., memory 130) and/or may be provided to an apparatus, system, and/or process for performing machine learning operations, such as shown at machine learning 320.
In an embodiment, other relationships may be determined offline through content collection and/or machine learning in addition to and/or instead of scientifically determined relationships. For example, a test subject, such as one or more users and/or operators, may provide behavioral and/or biological status content and/or may provide biometric and/or behavioral tagging content such that a relationship between such a biomarker and behavioral and/or biological status may be determined, such as learned relationship parameters 316. In one embodiment, for example, a biomarker may be recorded for a test subject as the test subject is presented with content and/or substances intended to evoke a particular state. Via machine learning, such as machine learning 320, in one embodiment a variety of biometric and/or behavioral states of a user may be learned to be identified by biometric and/or behavioral indicia.
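One illustrative way to hold both publication-derived and offline-learned relationships (such as relationship parameters 314/315/316) is as simple biomarker-to-state mappings that can be chained. The entries below are the examples from the text; the table structure and function are assumptions:

```python
# Sketch of determined vs. learned relationship storage. Entries are the
# illustrative examples from the text, not a real knowledge base.

published_relationships = {
    "pupil_dilation_high": "dopamine_release",
    "dopamine_release": "heightened_emotional_response",
}

learned_relationships = {
    "blink_rate_low": "high_focus",
}

def infer_states(biomarkers):
    """Chain biomarker observations through both relationship tables."""
    states = set()
    frontier = list(biomarkers)
    while frontier:
        marker = frontier.pop()
        for table in (published_relationships, learned_relationships):
            if marker in table:
                inferred = table[marker]
                if inferred not in states:
                    states.add(inferred)
                    frontier.append(inferred)  # follow chained relationships
    return states

print(sorted(infer_states(["pupil_dilation_high"])))
# ['dopamine_release', 'heightened_emotional_response']
```

The chaining mirrors the text's example: pupil dilation links to dopamine release, which in turn links to a heightened emotional response. A machine learning stage such as 320 would replace these hard-coded tables with learned mappings.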
In an embodiment, an apparatus, system, and/or process, such as machine learning 320, may generate recommendations (such as recommendation parameters 330) for a particular user (such as user 340) based at least in part on behavior summary content (such as behavior summary content 240). In this context, "recommendation" refers to one or more indications of suggestions and/or actions that may be taken by one or more individuals, devices, systems, and/or processes. For example, a recommendation, such as recommendation parameters 330, may include suggestions presented to a particular user for improving one or more aspects of a behavior and/or biological state of the particular user. As another example, a recommendation, such as recommendation parameter 330, may be directed to the replenishment of one or more particular substances. Additionally, for another example, a recommendation, such as recommendation parameter 330, may indicate to a device, system, and/or process to alter content, such as video game and/or virtual reality content that a particular user is consuming and/or is about to consume. Additionally, for example, recommendations, such as recommendation parameters 330, may configure and/or reconfigure devices, systems, and/or processes to alter the division of responsibility between an operator and a device, system, and/or process in situations where responsibility is shared (e.g., technology assisted driving, flight, drone operation, etc.). Of course, the scope of the claimed subject matter is not limited in these respects.
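As a hedged sketch of the recommendation kinds just listed (a suggestion to the user, supplementation, content alteration, shifting shared responsibility), behavioral summary scores might be mapped to recommendation records as below. Thresholds, field names, and actions are all invented for illustration:

```python
# Illustrative mapping from behavioral summary scores to recommendation
# parameters (such as 330). Thresholds and wording are assumptions.

def generate_recommendations(summary):
    """summary: dict of state scores in [0, 1]. Returns recommendation records."""
    recs = []
    if summary.get("dehydration", 0.0) > 0.7:
        # Suggestion presented to the user, plus a content-alteration hint.
        recs.append({"target": "user", "action": "suggest_water_break"})
        recs.append({"target": "content_provider", "action": "show_beverage_content"})
    if summary.get("fatigue", 0.0) > 0.8:
        # Shift responsibility toward the system in shared-control settings.
        recs.append({"target": "shared_control", "action": "shift_responsibility_to_system"})
    return recs

recs = generate_recommendations({"dehydration": 0.9, "fatigue": 0.2})
print(len(recs))  # 2
```

Note how a single summary vector can yield recommendations aimed at different targets (the user, a content provider, a shared-control system), matching the variety of recipients described above.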
As described above, in an embodiment, a gaming and/or virtual reality system may utilize behavioral summary content (such as behavioral summary content 240) of a particular user (such as user 340) to evolve the content such that the gaming and/or virtual content may be adjusted to produce a desired effect in the particular user. For example, an apparatus, system, and/or process, such as machine learning 320, may detect a decrease in the excitement level of the user based at least in part on the behavioral summary content (such as behavioral summary content 240). In one embodiment, the gaming system may respond to a decrease in excitement level at least in part by altering the content presented to the operator in an attempt to attract the operator's interest and/or otherwise cause an increase in the level of interest of the operator. Of course, responding to a change in the user's interest level is only one way in which the gaming and/or virtual reality system may advantageously utilize behavioral summary content (such as behavioral summary content 240). Another example may include adjusting streaming media content, such as internet-based audio streaming media, to induce a particular state in a particular user. For example, a streaming radio station (e.g., Pandora, Spotify, Apple Music, etc.) may alter the content of a "relaxing" radio station and/or playlist based at least in part on behavioral summary content to induce a relaxed state in a particular user. This may have particular advantages over generic broadcast-type content streams that are not tailored to a particular individual, over streams that are based on explicit user "favorites," and/or over streams that do not take into account the biological and/or behavioral state of a particular user.
For example, a content providing device, system, and/or process, such as a game and/or virtual reality system, for example, may present content intended to produce a specified state, such as fear, excitement, relaxation, and/or the like, to a particular user and/or a particular group of users. Behavioral summary content, such as behavioral summary content 240, such as may be generated by a behavior processing unit (such as behavior processing unit 200), for example, may be monitored to track various state levels of a user and/or a group of users. In an embodiment, a content providing device, system, and/or process may test various versions of content on a user and/or group of users to obtain, for example, knowledge about a particular user and/or group of users via a behavior processing unit (such as behavior processing unit 200). For example, a content provider may wish to test versions of content for different levels of fear of snakes, zombies, and so forth. Of course, claimed subject matter is not limited with respect to the type of fear and/or other behavioral and/or biological responses that may be tested.
In an embodiment, a content providing device, system, and/or process may record content status scores for individual users and/or groups of users. The content status score may be utilized, for example, by the content provider to select and/or generate user-specific content intended to produce a desired status in a particular user (e.g., replace spiders with snakes if the particular user indicates a greater fear of snakes through the behavioral summary content). In an embodiment, such customization may occur without knowledge about the particular user, and/or the content providing device, system, and/or process may test and/or evolve over time based at least in part on detected changes in the evolving biological and/or behavioral state of the particular user.
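The spider-versus-snake substitution above can be sketched as a per-user content status score lookup. The score values, element names, and function are all illustrative assumptions:

```python
# Sketch of per-user content status scores driving content selection: the
# element with the highest recorded fear score is chosen for that user.
# All data here is invented for illustration.

fear_scores = {"user_1": {"snakes": 0.9, "spiders": 0.4}}

def pick_scary_element(user, candidates, scores):
    """Choose the candidate element with the highest recorded status score."""
    user_scores = scores.get(user, {})
    return max(candidates, key=lambda c: user_scores.get(c, 0.0))

print(pick_scary_element("user_1", ["spiders", "snakes"], fear_scores))  # snakes
```

An unknown user falls back to score 0.0 for every candidate, which matches the text's note that customization can begin without prior knowledge of a particular user and evolve as scores accumulate.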
In an embodiment, customization of content based at least in part on, for example, a particular user's biological and/or behavioral state indicated, for example, at least in part, via behavioral summary content (such as behavioral summary content 240) that may, for example, be generated by a behavioral processing unit (such as behavioral processing unit 200) may, for example, be advantageously used in a targeted advertising system. In an embodiment, advertisements, such as online advertisements, may be generated and/or selected based at least in part on, for example, the biological and/or behavioral state of a particular user indicated at least in part via behavioral summary content (such as behavioral summary content 240). For example, a billboard presented to a particular user from within a video game may depict a soft drink in response, at least in part, to determining that the particular user is likely to be dehydrated based, at least in part, on behavioral summary content of the particular user. Similarly, for example, in response to determining that the user is hungry based at least in part on the behavioral summary content, an advertisement for a quick-service restaurant may be displayed to the user. Of course, these are merely example advertisements and the scope of the subject matter is not limited in these respects.
Customization of advertisements based at least in part on behavioral summary content may also include, for example, determining favorable points in time for displaying a particular advertisement to a particular user. For example, particular advertisements regarding particular brands, such as within video games and/or other digital content, may be timed to coincide with dopamine release and/or moments of quiet affection (as non-limiting examples), such that the particular user may associate his or her feelings of well-being with the advertised brand.
Another embodiment involving customization of content, such as adapting video game content for one or more users based at least in part on behavioral summary content, may include generating and/or selecting content based at least in part on behavioral summary content for a plurality of users. For example, in a video game system, if a particular user "1" is afraid of a particular game element "x" (e.g., as may be determined based at least in part on the behavioral summary content of user 1), if a user "2" is afraid of a particular game element "y" (e.g., as may be determined based at least in part on the behavioral summary content of user 2), and if a particular video game involves, for example, user 1 and user 2 cooperating against a monster, the video game system may generate a monster that is made up, at least in part, of elements x and y. Again, the scope of claimed subject matter is not limited to these particular examples.
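The multi-user example above can be sketched as gathering each player's highest-fear element and composing the shared adversary from the union. The data structures and function name are illustrative assumptions:

```python
# Minimal sketch of the cooperative-game example: take each player's top-fear
# element from their behavioral summary scores and combine them. All data is
# invented for illustration.

def compose_monster(player_fears):
    """player_fears: {player: {element: fear_score}} -> sorted element list."""
    elements = set()
    for scores in player_fears.values():
        if scores:
            elements.add(max(scores, key=scores.get))  # each player's top fear
    return sorted(elements)

monster = compose_monster({
    "user_1": {"x": 0.9, "y": 0.1},
    "user_2": {"x": 0.2, "y": 0.8},
})
print(monster)  # ['x', 'y']
```

Using a set means that two players who fear the same element contribute it only once, so the composed adversary stays compact however many players share a fear.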
In another embodiment, customization of content based at least in part on, for example, a biological and/or behavioral state of a particular user indicated at least in part via behavioral summary content (such as behavioral summary content 240) may be advantageously used, for example, in an educational context. In an embodiment, educational content for a technically-assisted teaching system and/or for a virtual reality-based educational tool may be selected, altered, and/or otherwise adjusted for one or more particular users based at least in part on the biological and/or behavioral states of the one or more particular users. For example, educational content may be selected, altered, and/or otherwise adjusted based at least in part on an indication of anticipation, concentration/distraction, and/or remorse/acknowledgment of mistakes. Similarly, for example, indications of empathy from behavioral summary content may support adjusting a virtual teacher so that students respond well to the virtual teacher.
In other embodiments, recommendations may be provided to a user (such as a player) to improve user health based at least in part on behavior summary content (such as behavior summary content 240). For example, embodiments may include identification of a biological condition, such as via monitoring and/or tracking behavioral summary content, for example, and may also include generating advisory content to encourage the user to take specific actions outside the gaming environment with the goal of improving user/player health. For example, if a system, apparatus, and/or process, such as system, apparatus, and/or process 320, determines from behavior summary content (such as behavior summary content 240) that a user (such as user 340) is likely to be dehydrated, a message may be displayed and/or otherwise communicated to the user within the game and/or virtual reality environment as a reminder to take a break and/or seek nutrition. Similar recommendations may be made with respect to food, sleep, medication, supplements, and the like. Additionally, game play may be altered and/or otherwise adjusted based at least in part on a biological condition indicated at least in part via behavioral summary content (such as behavioral summary content 240). For example, if behavior summary content, such as behavior summary content 240, indicates that the user may be hungry, a content providing system, such as a gaming system, may change the character meeting location from a park to a food facility. Such a change may serve as a reminder that the user/player seeks nutrition.
In an embodiment, characters and/or other games and/or virtual reality elements may be utilized to communicate recommendations to particular players. For example, within a game and/or virtual reality environment, character meetings may be scheduled to occur within a restaurant and/or other food facility to suggest that it may be time to regulate blood glucose. Additionally, in an embodiment, a user, such as a player, may provide instructions, such as via a user interface of the gaming system and/or other computing device, for example, to the game and/or virtual reality system. A content providing system, such as a gaming system, may modify and/or otherwise adjust content based at least in part on user input. For example, the user may provide input regarding the user's favorite treats, and these favorite treats may be incorporated into recommendations generated by the system regarding nutrition.
Additionally, in an embodiment, a device, system, and/or process, such as machine learning 320, may track performance changes for a particular user, such as user 340. For example, in a collaborative task where a human operator and/or a computer system may divide responsibility (e.g., technology-assisted driving or flight), it may be desirable for the computer system to understand the operator's performance changes, which may indicate the operator's ability to continue to be responsible for the particular task. Similarly, in an adaptive gaming experience, the gaming system may advantageously utilize indications that an operator may be performing above and/or below a particular baseline. For example, the gaming system may adjust the game content to account, at least in part, for the operator's performance level.
In one embodiment, a performance test may be performed on a particular user/operator, which may require an action in response to the presented content. For example, performance testing may involve placing objects at various screen locations that may require interaction with relatively higher accuracy and/or introducing a sequence of challenges that may require relatively rapid response. In one embodiment, such testing may be "hidden" in that the player may not be aware that they are being tested, for example.
In one embodiment, an external system, such as a gaming and/or virtual reality system, may uniquely tag and/or otherwise number performance tests. The external system may notify the behavior processing unit (such as BPU 200) of the start of a performance test and/or may also provide a unique tag and/or number associated with a particular performance test. The BPU 200 may, for example, track a user's biological response (e.g., heart rate, sweat, etc.) to the performance test. Additionally, in one embodiment, an external system may indicate completion of the performance test to the BPU 200, and the BPU 200 may in turn provide relevant behavior summary content back to the external system. For example, a vector including the scoring parameters may be provided and/or may be stored at the external system. By storing performance content for individual users/players over time, and possibly including machine learning, the external system can identify how a particular user/player is performing compared to that individual's typical performance.
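The tag-and-report handshake described above might be sketched as follows: the external system announces a uniquely tagged test, biological samples accumulate under that tag, and completion returns a score vector. The class, method names, and summary statistics are assumptions, not the specification's API:

```python
# Sketch of tagged performance-test tracking between an external system and a
# BPU-like component. Names and the mean-based summary vector are illustrative.

class PerformanceTestTracker:
    def __init__(self):
        self._active = {}   # tag -> list of (heart_rate, sweat) samples
        self._results = {}  # tag -> summary vector

    def start_test(self, tag):
        """External system announces a uniquely tagged test."""
        self._active[tag] = []

    def record_sample(self, tag, heart_rate, sweat):
        """Accumulate biological responses under the test's tag."""
        self._active[tag].append((heart_rate, sweat))

    def complete_test(self, tag):
        """External system signals completion; return a summary vector."""
        samples = self._active.pop(tag)
        n = len(samples)
        vector = (sum(s[0] for s in samples) / n,  # mean heart rate
                  sum(s[1] for s in samples) / n)  # mean sweat level
        self._results[tag] = vector
        return vector

tracker = PerformanceTestTracker()
tracker.start_test("test-42")
tracker.record_sample("test-42", heart_rate=80, sweat=0.25)
tracker.record_sample("test-42", heart_rate=100, sweat=0.75)
print(tracker.complete_test("test-42"))  # (90.0, 0.5)
```

Keeping results keyed by tag lets several overlapping tests run without the samples of one contaminating another, which is the point of the unique tagging in the text.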
Embodiments may also be utilized to identify manipulation patterns within content consumed by a user. In 1927, Ivan Pavlov described the human "adaptive response" as an instinctive biological response to sudden movement and novel stimuli. This response may include temporary blocking of alpha waves, increased vasodilation to the brain, decreased heart rate, and so forth. Researchers have subsequently investigated how various aspects of television content, including scene cuts, panning, clipping, and zooming, for example, may activate a biological response. Content creators may utilize such techniques to intentionally capture a consumer's attention through rapid feature changes that may activate the consumer's adaptive response relatively frequently and/or continuously. Such relatively frequent and/or continuous activation of an adaptive response can lead to addiction and/or other negative health consequences. By monitoring a user's biological and/or behavioral indicators, for example via tracking focus shifts and/or heart rate drops, embodiments may identify patterns that may be intended to activate an adaptive response. Embodiments may further inform the user that such manipulation may be occurring.
Additionally, with possible input of additional biometric indicators obtained from the user, more sophisticated techniques and/or patterns may be developed (e.g., by the scientific community and/or the entertainment community) in an attempt to keep the user engaged and possibly addicted. For example, a pattern such as focus changing every 4 seconds for 2 minutes, then dopamine release every 30 seconds for 2 minutes, etc. may be found to significantly increase the likelihood that the user will participate for some particular amount of time. As such modes and/or techniques become more complex, a user may not be able to easily identify when he/she may be manipulated. In an embodiment, a determined and/or known manipulation pattern may be identified, at least in part, via tracking and/or monitoring of the biomarker, such as via generation of behavioral summary content output from the sensor. Embodiments may also notify the user of the existence of such a pattern in real time (e.g., substantially as it occurs).
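A minimal sketch of detecting the kind of pattern described above (e.g., focus shifting every 4 seconds) could count focus shifts in a sliding window and flag a stream when shifts recur at a rapid cadence. The window length, cadence threshold, and function name are assumptions drawn from the example pattern in the text:

```python
# Illustrative detector for rapid, repeated focus shifts of the kind the text
# associates with manipulative content patterns. Thresholds are assumptions.

def detect_rapid_focus_shifts(shift_times, window_s=60.0, max_mean_gap_s=5.0):
    """shift_times: sorted timestamps (seconds) of detected focus shifts.

    Returns True if, within the trailing window, the mean gap between
    consecutive shifts is at or below the cadence threshold.
    """
    if len(shift_times) < 2:
        return False
    recent = [t for t in shift_times if t >= shift_times[-1] - window_s]
    gaps = [b - a for a, b in zip(recent, recent[1:])]
    return sum(gaps) / len(gaps) <= max_mean_gap_s

# Focus shifting every 4 seconds for a minute matches the example pattern...
print(detect_rapid_focus_shifts([float(t) for t in range(0, 64, 4)]))  # True
# ...while occasional shifts do not.
print(detect_rapid_focus_shifts([0.0, 30.0, 55.0]))  # False
```

A real embodiment would track multiple biomarkers (dopamine-release proxies, heart rate drops) against a library of determined patterns; this sketch shows only the single-signal cadence check.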
In an embodiment, if presentation of content (e.g., games, television programs, movies, etc.) occurs within the context of determined and/or known manipulability techniques and/or modes, the user may be alerted, such as via display 140. By having a processor (such as BPU 200) detect such manipulative techniques and/or patterns, for example, at least in part via processing of sensor content and/or generation of behavioral summary content (such as behavioral summary content 240), a user may focus attention on the consumed content without also attempting to detect manipulative techniques and/or patterns. For example, a user/consumer may be fully engaged in content and may request to be notified in response to detection of manipulative techniques and/or patterns. Users, such as players, for example, may generally benefit from such embodiments even if no request is made to be notified of detected manipulative techniques and/or patterns, at least in part because content providers may be encouraged (e.g., via the possibility of exposure) to throttle utilization of such manipulative techniques and/or patterns. For example, just as including a calorie count on a menu may create an incentive for a restaurant to throttle the amount of calories, the identification of manipulative techniques and/or patterns may encourage content developers to throttle the use of manipulative techniques and/or patterns. Additionally, embodiments may include manipulability mode detection as part of a parental control system, for example.
Additional embodiments may include machine learning devices, systems, and/or processes for managing possible addiction and/or negative consequences that may result from a player-identified out-of-range condition. In an embodiment, the player/user may explicitly indicate, such as via input to a computing device (such as mobile device 100), that the player has entered an "out-of-range state." For example, in situations where the user may be playing late instead of sleeping, or where the user may have forgotten an appointment, or where the user may recognize that he/she has gone too long without eating, and so on, the user may explicitly indicate such a situation via interaction with a user interface of a computing device (such as the mobile device 100). Based at least in part on user input regarding the out-of-range state, a processor, such as the BPU 200, may utilize machine learning and/or other analysis techniques to recognize and/or identify behavioral patterns associated with entering the out-of-range state. In an embodiment, a computing device, such as mobile device 100, may notify a user at least partially in response to identification of a pattern that may induce an out-of-range condition, such that the user may take appropriate steps (e.g., stop playing a particular game) to avoid the out-of-range condition and/or unwanted consequences.
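As an illustrative sketch only, a simple frequency-based learner of the kind described above might associate user-labeled out-of-range events with session features. The class name and feature labels are hypothetical, and a real embodiment would use richer machine learning techniques:

```python
from collections import Counter

class OutOfRangePatternLearner:
    """Learns which behavioral features (e.g. 'playing_after_midnight',
    'skipped_meal') most often precede a user-labeled out-of-range
    state, so a device such as mobile device 100 could warn early."""
    def __init__(self):
        self.out_of_range = Counter()  # times feature preceded the state
        self.sessions = Counter()      # times feature was observed at all

    def observe(self, features, entered_out_of_range):
        for f in features:
            self.sessions[f] += 1
            if entered_out_of_range:
                self.out_of_range[f] += 1

    def risk(self, feature):
        seen = self.sessions[feature]
        return self.out_of_range[feature] / seen if seen else 0.0

learner = OutOfRangePatternLearner()
learner.observe({"playing_after_midnight", "skipped_meal"}, True)
learner.observe({"playing_after_midnight"}, True)
learner.observe({"played_one_hour"}, False)
print(learner.risk("playing_after_midnight"))  # 1.0
```

A notification could then be triggered whenever a current session exhibits a feature whose learned risk exceeds some threshold.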
Additionally, embodiments may include notifying the user when recommended health tips are communicated to a content providing system (such as a gaming system). For example, a user may request to be notified in response to a processor (such as BPU 200 and/or processor 110) generating a recommendation (such as recommendation 330) to be communicated to a content provider. For example, in situations where recommendation parameters (such as parameters 330) may be provided to the content provider so that the content provider may embed suggestive content related to, for example, dehydration, a notification may also be provided to the user, such as via display 140, independent of the game and/or content provider. Such communication directed to the user may help address the scenario in which the user might otherwise rely solely on advisory content from a content providing system, such as a gaming system, to convey an action that may improve the user's health. Further, such embodiments may provide the user with some visibility into in-game targeted advertisements that may be affected by the user's behavioral and/or biological state.
Additionally, in one embodiment, performance degradation may be predicted from biological changes. For example, in computer-assisted driving, a human operator may remain in his/her particular lane equally well at 3:00 pm and at 3:00 am, but the operator's stress level may increase significantly by 3:00 am. A behavior processing unit, such as BPU 200, may detect such a biological indicator of stress, at least in part, through content obtained from one or more sensors. In an embodiment, detection of the biological stress indicator and/or performance test content may be utilized in conjunction to predict an impending hazardous condition, make recommendations, and/or take other appropriate actions.
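Purely as an illustrative sketch, fusing a performance metric with a sensor-derived stress indicator, as described above, might look like the following. The metric names, weight, and threshold are assumptions, not values drawn from the specification:

```python
def hazard_score(lane_error_rate, stress_index, baseline_stress):
    """Combine a driving-performance metric with how far the current
    stress indicator exceeds the operator's baseline. The 0.5 weight
    is an arbitrary illustrative choice."""
    stress_delta = max(0.0, stress_index - baseline_stress)
    return lane_error_rate + 0.5 * stress_delta

def should_warn(lane_error_rate, stress_index, baseline_stress, threshold=1.0):
    """Predict an impending hazardous condition when the fused score
    crosses a threshold."""
    return hazard_score(lane_error_rate, stress_index, baseline_stress) >= threshold

# 3:00 pm: nominal stress, no warning; 3:00 am: elevated stress triggers one.
print(should_warn(0.2, 1.0, 1.0))  # False
print(should_warn(0.2, 3.0, 1.0))  # True
```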
In another embodiment, behavioral summary content (such as behavioral summary content 240) provided, for example, by a behavioral processing unit (such as behavioral processing unit 200) may be utilized to help manage the mental health of a user. For example, as described above, in an embodiment, external factors may play a role in generating behavioral summary content and/or in decision making. For example, one or more parameters indicative of one or more external factors, such as external factor parameters 312, may be obtained by a behavior summary unit (such as BPU 200), and/or may be obtained by a decision-making device, system, and/or process (such as machine learning 320). The parameters indicative of external factors may include, for example, parameters indicative of location, time period, presence, identity, and/or status of external individuals, and/or general mood. By using external information, such as the location of the individual, the time period, the presence, identity, and/or status of other individuals, and/or sentiment analysis from always-on ambient microphones and/or other sensors, computer intelligence based on machine learning, such as performed at least in part by the BPU 200 and/or the decision-making device, system, and/or process 250, may recognize patterns and/or recommend actions for a particular individual (e.g., suggest changing locations), recommend actions for another individual (e.g., suggest calling a friend to help take his/her mind off things), and/or discourage actions with potential negative consequences (e.g., suggest reconsidering whether to call a family member at that time). An objective of one or more embodiments may be to recommend actions and/or inactions to improve emotional state (e.g., reduce unwanted states and increase wanted states).
In an embodiment, the presence, identity, and/or status of one or more external individuals may be determined at least in part via obtaining one or more signals and/or states from personal computing devices of the one or more external individuals. In other embodiments, a behavior processing unit, such as the BPU 200, may be utilized to determine the presence, identity, and/or status of one or more external individuals. For example, a behavior processing unit, such as BPU 200, may determine the presence, identity, and/or status of one or more external individuals at least in part via voice monitoring (e.g., analysis of signals and/or states generated by a microphone), via detection of phone calls between the user and one or more particular external individuals, via detection of user interaction with social media content of one or more external individuals, and/or via detection and/or monitoring of the user's discussion of one or more external individuals in phone calls, social media posts, email text, and/or audible conversations related to the one or more external individuals. In an embodiment, the presence, identity, and/or status of an external individual may be determined at least in part via monitoring of an individual user's interaction with the external individual and/or via monitoring of the user's explicit and/or implicit descriptions of the external individual. In an embodiment, monitoring of an individual user's interaction with an external individual and/or monitoring of a user's descriptions of the external individual may be performed, for example, at least in part by a behavior processing unit (such as BPU 200) based at least in part on sensor content (such as content from sensor 230).
Subtle patterns sometimes exist between changes in an individual's psychological/emotional state and external factors such as consumption of certain foods, various social interactions, and/or interactions with various media channels (e.g., television, radio, and/or websites). Undesirable emotional states may have sustained social effects, which may manifest in violence in homes and/or schools, as well as in addiction and/or suicide rates. Embodiments may enable identification of subtle patterns that may lead to undesirable emotional states such as anger, and recommendations may then be made to preemptively prevent an individual from developing an undesirable mental state and/or to help the individual return to a more desirable mental state.
In one example, domestic violence and/or other abuse may be avoided at least in part by identifying a pattern, such as spending over three hours at a particular location (e.g., a bar, indicating alcohol consumption) and then eating at a particular restaurant (e.g., a steak restaurant, indicating red meat consumption), which may be associated with increased anger later in the evening. Embodiments may, for example, recommend that a person leave a particular location and/or suggest that an individual be vigilant for increased irritability.
As described above, the status of an external individual may be provided by a behavior processing unit, such as BPU 200. For example, behavioral summary content generated by the BPU 200 may include one or more parameters representing the emotional, behavioral, and/or biological state of one or more particular external individuals. In some cases, for example, the inclusion of content representing the state of an external individual may significantly improve the utility and/or quality of various embodiments. For example, without the state of the external individual, a machine learning device, system, and/or process, such as machine learning 320, may generate a recommendation, such as recommendation 330, indicating that it is unwise for the user to call the user's sister on Tuesday evenings because calling her during that time negatively impacts the user's mental state. It may be, in this example, that the user's sister typically has an important meeting on Tuesdays, and that she is stressed and angry on Tuesday evenings, resulting in increased anger for the user. However, for this example, when the Tuesday meeting is cancelled, the user's sister may not be stressed and/or irritable. By utilizing, for example, content representing psychological and/or emotional states of external individuals as input, machine learning devices, systems, and/or processes, such as machine learning 320, may generate improved recommendations, such as recommendation 330, regarding whether to reconsider calling the user's sister. For example, because machine learning 320 is provided with input indicating that the user's sister is currently not stressed and/or does not show signs of anger, machine learning may recommend that the user proceed to place the phone call, such as via recommendation parameters 330. Additionally, for example, warnings regarding placing a phone call to the user's sister may be reserved for situations where the sister actually exhibits stress and/or anger.
Of course, this is merely one example of how a state of an external individual may be detected and/or utilized and the scope of claimed subject matter is not limited in these respects.
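The sister-call example above can be sketched, purely for illustration, as a recommendation that consults an external individual's shared state before falling back on a learned historical pattern. The field names, thresholds, and return labels are assumptions:

```python
def call_recommendation(historical_risk, external_state):
    """Recommend whether to place a call. `historical_risk` is a learned
    probability that calling at this time harms the user's mental state;
    `external_state` is the other individual's shared behavioral summary
    (or None if no state is available)."""
    if external_state is None:
        # No shared state: fall back on the learned historical pattern
        # (e.g., "avoid calling on Tuesday evenings").
        return "reconsider" if historical_risk > 0.5 else "proceed"
    stressed = (external_state.get("stress", 0.0) > 0.7
                or external_state.get("anger", 0.0) > 0.7)
    return "reconsider" if stressed else "proceed"

# Tuesday evening, but the sister's meeting was cancelled and she is calm:
# the current shared state overrides the historical warning.
print(call_recommendation(0.9, {"stress": 0.2, "anger": 0.1}))  # proceed
print(call_recommendation(0.9, None))                           # reconsider
```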
Further, embodiments may include sharing behavioral summary content (such as behavioral summary content 240) and/or recommendations (such as recommendations 330) among groups of individuals. For example, individual members of a group may have personal computing devices, such as mobile device 100, which may include sensors, such as sensor 150, camera 160, and/or microphone 170, for example, and which may also include a behavior processing unit, such as BPU 200. In an embodiment, the behavioral summary content, such as behavioral summary content 240, and/or recommendations, such as recommendation 330, of individual members of the group may be shared within the group. The behavioral summary content (such as behavioral summary content 240) and/or recommendations (such as recommendations 330) for a particular individual may be transmitted, for example, to the personal computing devices of one or more other members of the group via signal packets over a cellular network, a wireless local area network, and/or any of a variety of other wired and/or wireless communication techniques. For example, friends and/or family may be linked via their respective personal computing devices (e.g., mobile devices 100), e.g., for purposes of collaborative health management, such that their respective state vectors (e.g., behavioral summary content 240 and/or recommendations 330) may be used as input for utilization by a behavior processing unit (such as BPU 200), and/or by a decision-making device, system, and/or process (such as decision-making system 250), and/or by a machine learning device, system, and/or process (such as machine learning 320).
By sharing behavioral summary content and/or recommendations, at least in part, among individuals of a group, more complex patterns and/or interactions between particular individuals of the group can be monitored and/or tracked, thereby improving the quality of recommendations, such as recommendation 330. For example, if two parents and a child are linked, embodiments may recognize that it is not a pattern of the child's own behavior that results in the child entering an undesirable mental state, but rather a pattern associated with one or both of the parents. Embodiments in which behavioral summary content and/or recommendations may be shared among various groups of individuals, such as within a household, for example, may be utilized to avoid and/or alert to violence and/or abuse, such as domestic violence and/or abuse.
For example, a user and the user's significant other (e.g., spouse, domestic partner, boyfriend, girlfriend, etc.) may each have a respective personal computing device, such as mobile device 100. In an embodiment, the user's personal computing device and the significant other's personal computing device may share behavioral summary content and/or recommendations, at least in part, between the two devices. That is, the two personal computing devices may be "linked". In an embodiment, a user's personal computing device, such as mobile device 100, may recommend a change in location to avoid potential domestic abuse, such as via visual content presented to the user via display 140 and/or via audio output. In an embodiment, such recommendations may be based at least in part on behavioral summary content (such as behavioral summary content 240) and/or recommendations (such as recommendations 330) obtained from the significant other's personal computing device.
In another embodiment, behavioral summary content (such as behavioral summary content 240) provided, for example, by a behavioral processing unit (such as behavioral processing unit 200) may be utilized to help manage the physical and/or mental health of a user. For example, in an embodiment, the relationship between behavioral summary content related to eye movement and the balance of GABA and glutamate neurotransmitters can be tracked, evaluated, and/or learned (e.g., via machine learning). In some cases, the human body may attempt to balance GABA (a calming neurotransmitter) with glutamate (an excitatory neurotransmitter). However, in some cases, the balancing of GABA and glutamate may not occur immediately and/or perfectly, for example. For example, mutations in genes such as GAD1, the availability of cofactors such as vitamin B6, or an increased number of glutamate receptor sites due to excess glutamate in the developing brain can adversely affect GABA/glutamate balance. Glutamate may be involved in learning, language processing, sleep, and/or mood, to name a few.
In some cases, glutamate levels may be manipulated via certain processed foods, for example, and may not be measured and/or labeled. An excess of glutamate can result, for example, which can be particularly harmful to children, whose brains are still developing. To help address this issue, diet may be restricted and general GABA supplementation may be recommended. However, frequent laboratory testing can be costly. Additionally, because the GABA/glutamate balance may change relatively frequently and/or continuously, it may not be possible to perform laboratory tests frequently enough to track such changes. For example, parents of autistic children may resort to behavioral monitoring, in which the parents may be able to recognize more exaggerated behaviors such as shouting, biting, self-injury, and the like. Embodiments that can more easily, efficiently, and/or non-invasively recognize a GABA/glutamate imbalance, for example, can be advantageously utilized to address problems associated with GABA/glutamate imbalance. Additionally, for example, embodiments may support recognition of a GABA/glutamate imbalance before behaviors such as shouting, biting, self-injury, etc., become apparent.
In an embodiment, machine learning, such as machine learning 320, may identify changes in GABA/glutamate balance based at least in part on tracking of behavioral summary content indicative of eye movement, including parameters indicative of blink duration, rapid eye movement (e.g., saccades), blink rate, focusing ability, pupil dilation, and the like. In an embodiment, a camera may track the eye behavior of a particular user approximately continuously. Content related to the need for and/or effect of GABA supplementation can also be tracked. In an embodiment, an apparatus, system, and/or process can learn, such as via machine learning techniques, to identify a change in GABA/glutamate balance based at least in part on eye movement parameters and/or based at least in part on content related to the need for and/or effect of GABA supplementation, such as scientific publication content 315. Additionally, in an embodiment, a recommendation regarding GABA supplementation may be generated for a particular user based at least in part on the identified GABA/glutamate balance change, such as via machine learning 320. Embodiments may, for example, help maintain an improved GABA/glutamate balance, which may be beneficial to the health and/or social competence of, for example, autistic children at key stages of development. In addition, embodiments may be advantageously utilized, for example, with children, students, those with sleep disorders, and/or those with anxiety disorders.
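As an illustrative sketch only, detecting a shift in eye-movement parameters against a particular user's own baseline, as described above, might be expressed as a simple z-score deviation test. The parameter names and threshold are assumptions, and a real embodiment would learn the mapping to GABA/glutamate balance rather than hard-code it:

```python
from statistics import mean, pstdev

def balance_shift_detected(baseline, current, z_threshold=2.0):
    """Flag a possible GABA/glutamate balance change when any current
    eye parameter (blink rate, pupil dilation, ...) deviates markedly
    from the user's own baseline history for that parameter."""
    for name, history in baseline.items():
        mu, sigma = mean(history), pstdev(history)
        if sigma and abs(current[name] - mu) / sigma > z_threshold:
            return True
    return False

baseline = {
    "blink_rate": [16, 17, 15, 16, 16],          # blinks per minute
    "pupil_dilation": [3.0, 3.1, 2.9, 3.0, 3.0],  # mm
}
print(balance_shift_detected(baseline, {"blink_rate": 16, "pupil_dilation": 3.05}))  # False
print(balance_shift_detected(baseline, {"blink_rate": 26, "pupil_dilation": 3.0}))   # True
```

A flagged shift could then feed the supplementation recommendation step, e.g. via machine learning 320.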
In another embodiment, the relationship between behavioral summary content (such as behavioral summary content associated with eye movement) and bioavailability levels of 5-methyltetrahydrofolate (5-MTHF) can be tracked, evaluated, and/or learned, such as via machine learning 320. In some cases, the human body can convert dietary and/or supplemental folate into 5-MTHF for use in critical biological processes (e.g., detoxification, epigenetic function, and/or neurotransmitter balance, etc.). About 9-11% of the general population, and about 98% of the autism population, may have gene mutations that affect the conversion of one form of folate to another. Due at least in part to the relatively higher consequences of sub-optimal levels of 5-MTHF, it may be beneficial to readily and/or efficiently identify changes in bioavailable 5-MTHF.
In an embodiment, machine learning, such as machine learning 320, may identify changes in bioavailable 5-MTHF in substantially real time (i.e., almost as they occur) by monitoring behavioral indicators in eye movement. For example, behavioral summary parameters of a particular user, e.g., related to rapid eye movement, blink rate, focusing ability, and/or pupil dilation, may be tracked. Content related to the need for 5-MTHF supplementation and/or the effect of 5-MTHF supplementation may also be tracked. In an embodiment, a device, system, and/or process may learn, such as via machine learning techniques, to identify changes in 5-MTHF bioavailability based at least in part on eye movement parameters and/or based at least in part on content related to the need for and/or effect of 5-MTHF supplementation, such as scientific publication content 315.
Additionally, in an embodiment, a recommendation regarding 5-MTHF supplementation may be generated for a particular user based at least in part on the identified 5-MTHF bioavailability change, such as via machine learning 320. In embodiments, changes in the bioavailability of 5-MTHF may be identified before an individual perceives that a change has occurred, thereby helping to reduce negative consequences. In an embodiment, the identified change in bioavailability of 5-MTHF and/or the recommendation for 5-MTHF supplementation may be communicated to a caregiver of the particular user. This may be advantageous, for example, in situations where the affected individual may not be able to communicate and/or understand the change, as may be the case with autism. Additionally, embodiments may help address, at least in part, the infeasibility of continuous laboratory testing, and/or may help parents of autistic children, for example, preemptively avoid relatively higher-consequence behaviors (e.g., shouting, biting, self-injury, etc.). Parents of autistic children, for example, may perform behavioral monitoring, but may tend to notice only relatively more extreme behaviors. Embodiments may allow for the association of more subtle behaviors, such as eye movements, voice characteristics, etc., that may occur before more consequential behaviors emerge. In addition, embodiments may provide for early identification of the need for supplementation.
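Purely as a sketch of the caregiver-notification flow described above, the following compares a recent window of a hypothetical eye-movement-derived "focus score" (an assumed proxy, not a measure named in the specification) against an earlier window and notifies a caregiver when it drops sharply:

```python
def monitor_5mthf(readings, notify_caregiver, drop_fraction=0.25):
    """Compare the recent half of `readings` (a hypothetical focus-score
    proxy derived from eye movement) against the earlier half, and call
    `notify_caregiver` with a message if the score falls by more than
    `drop_fraction`, before behavior changes become apparent."""
    half = len(readings) // 2
    earlier = sum(readings[:half]) / half
    recent = sum(readings[half:]) / (len(readings) - half)
    if recent < earlier * (1 - drop_fraction):
        notify_caregiver(
            f"Possible 5-MTHF availability change: focus score fell "
            f"from {earlier:.2f} to {recent:.2f}")
        return True
    return False

alerts = []
monitor_5mthf([0.9, 0.88, 0.91, 0.5, 0.48, 0.52], alerts.append)
print(alerts[0])
```

In a real embodiment, `notify_caregiver` would deliver the message to the caregiver's device, e.g., via display 140 and/or a network message.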
In additional embodiments, machine learning, such as machine learning 320, may identify changes in bioavailable 5-MTHF in substantially real time by monitoring behavioral summary parameters related to speech and/or voice. For example, behavioral summary parameters of a particular user, e.g., related to voice pitch, sentiment, volume, frequency, timbre, and so forth, may be tracked. In one embodiment, a microphone may be utilized, for example, to track behavioral summary parameters related to speech and/or voice on a near-continuous basis. In an embodiment, the user may be prompted to read a given passage aloud and/or to answer subjective questions, such as "How are you today?" Speech content collected from users reading and/or answering in this manner can be utilized, at least in part, to track speech and/or voice parameters and/or to identify changes in 5-MTHF bioavailability. In another embodiment, voice characteristics may be monitored via a microphone that is substantially always on. Such an alternative may be of particular value to individuals who are unwilling and/or unable to verbally interact in an explicit manner, such as may be the case with autism.
As described above, content related to the need for 5-MTHF replenishment and/or the effect of 5-MTHF replenishment may also be tracked. In an embodiment, a device, system, and/or process may learn, such as via machine learning techniques, to identify changes in 5-MTHF bioavailability based at least in part on speech and/or voice related parameters and/or based at least in part on content related to a need for 5-MTHF supplementation and/or an effect of 5-MTHF supplementation, such as scientific publication content 315. Additionally, as described above, recommendations for 5-MTHF supplementation may be generated for a particular user based at least in part on the identified 5-MTHF bioavailability change, such as via machine learning 320.
In another embodiment, machine learning, such as machine learning 320, may be used, at least in part, to identify glutamate/GABA manipulation. For example, glutamate/GABA manipulations may be identified based at least in part on behavioral summary content of a particular user and/or based at least in part on location content, such as may be provided, for example, via one or more satellite positioning systems (such as GPS). As mentioned above, glutamate is an excitatory neurotransmitter. For this reason, in some cases, food science may focus on increasing excitement by increasing glutamate (e.g., monosodium glutamate). Information about the level of glutamate in a food product can be difficult to obtain. A food label may, for example, list "flavors," "spices," and/or "natural flavors" without an actual measurement of glutamate. In addition, as noted above, possible adverse consequences associated at least in part with excessive glutamate levels may include, for example, sleep disorders, anxiety, and/or inattention, and may have relatively higher adverse consequences in the developing brain (e.g., an increase in the number of glutamate receptor sites).
In an embodiment, a behavior processing unit, such as BPU 200, and/or a machine learning device, system, and/or process, such as machine learning 320, can identify locations where glutamate/GABA balance may be manipulated. Content indicating locations where the glutamate/GABA balance is likely to be manipulated can be communicated to an individual, allowing the individual and/or a caregiver to avoid such locations. Avoiding hyperexcitability in this way may be helpful to those with anxiety. Maintaining glutamate/GABA balance may also be generally beneficial to those attempting to learn new material (e.g., students) and/or those who may be sensitive (e.g., those with difficulty sleeping, autistic children, etc.). For example, a behavior processing unit, such as BPU 200, and/or a machine learning device, system, and/or process, such as machine learning 320, may identify a change in glutamate/GABA balance for one or more individuals, for example, over a period of time, and may track a parameter indicating where the change in balance occurred. In an embodiment, at least in part in response to identifying a change in the glutamate/GABA balance for a particular individual, the location of the particular individual may be determined and a parameter indicative of the location may be stored in a database. Over time, a database of glutamate/GABA balance content and associated location content may be developed. Content from the glutamate/GABA and/or location database can be utilized, at least in part, to identify particular locations where glutamate/GABA imbalance is most likely to occur. For example, in one embodiment, a particular restaurant may be identified, allowing the user and/or caregiver to avoid such a location. Embodiments may provide a relatively efficient and/or non-invasive technique for identifying glutamate manipulation.
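The location database described above can be sketched, for illustration only, as a counter keyed by location; the class and method names are hypothetical:

```python
from collections import Counter

class ImbalanceLocationDB:
    """Record the location at which each identified glutamate/GABA
    imbalance occurred, then surface the locations most frequently
    associated with imbalances (e.g., a particular restaurant)."""
    def __init__(self):
        self.events = Counter()

    def record_imbalance(self, location):
        self.events[location] += 1

    def likely_sources(self, min_events=2):
        # Locations with repeated imbalance events, most frequent first.
        return [loc for loc, n in self.events.most_common() if n >= min_events]

db = ImbalanceLocationDB()
for loc in ["restaurant_a", "home", "restaurant_a", "restaurant_a", "school"]:
    db.record_imbalance(loc)
print(db.likely_sources())  # ['restaurant_a']
```

Locations returned by `likely_sources` could then be communicated to the individual and/or caregiver to avoid.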
To collect location content associated with glutamate/GABA balance changes, an individual may wear a bracelet and/or other wearable device that can track location and/or the time spent at a location. In an embodiment, the wearable device may include, for example, a GPS receiver and/or a terrestrial wireless network transmitter. Additionally, in an embodiment, a machine learning apparatus, system, and/or process, such as machine learning 320, may be trained at least in part via a user, such as a parent and/or a caregiver, for example, providing input, such as user input 311, indicating evidence of a particular user's hyperactivity and/or inattention. In an embodiment, patterns associated with relatively higher glutamate states and/or their possible sources (e.g., locations) may be learned at least in part via machine learning devices, systems, and/or processes, such as machine learning 320. For example, in a case where a football coach provides a sports drink relatively high in glutamate at football practice, a source of glutamate that would otherwise be difficult to identify can be identified to the parents. In another embodiment, eye behavior may be tracked to identify, at least in part, relatively higher glutamate states instead of and/or in addition to user-provided content identifying particular behavioral states.
Additionally, in an embodiment, content obtained from wearable sensors, sensors from mobile devices, and/or other sensors may be collected and/or stored for one or more users over a period of time to at least partially identify behavioral patterns that may be associated with depressive symptoms. In an embodiment, a wearable device, phone, and/or other device may include a behavior processing unit, such as BPU 200, and/or may include one or more devices, systems, and/or processes for machine learning, such as 320, to generate and/or analyze behavior summary content and/or generate location content.
Embodiments may also include a device, system, and/or process for machine learning, such as machine learning 320, to identify, at least in part, changes in hormonal balance (e.g., testosterone, estrogen, progestin, etc.) in substantially real time based, at least in part, on monitored and/or tracked environmental sounds. Scientific publications have identified, for example, associations between hormone levels and brain function, as well as associations between hormone levels and the ability to perform tasks involving fine motor skills. In an embodiment, patterns of mild impairment indicative of hormonal imbalance may be identified through ambient sound monitoring, such as via a microphone that may be substantially always on. In an embodiment, the microphone may be provided, for example, by a mobile device. Individual impairments indicative of hormonal imbalance may each be of relatively low consequence, and it may therefore be difficult for an individual to draw inferences and/or conclusions from them in isolation. However, such impairments, such as may be identified at least in part via microphone sensor content, may be identified as a group as being due, for example, to hormonal imbalance. For example, the sounds associated with dinner each night may vary with hormonal balance. Sounds associated with hormonal imbalance may include, for example, indications of clumsiness, awkward movements, dropping things, minor burns while cooking, and the like.
In an embodiment, a microphone may be utilized, at least in part, to substantially continuously track ambient sounds associated with a particular user. Content related to the need for and/or effect of supplementation, such as parameters 315 obtained from scientific journals, may also be tracked. In an embodiment, an apparatus, system, and/or process may learn, such as via machine learning techniques, to identify a need for supplementation based at least in part on monitored environmental sounds and/or based at least in part on content obtained from a scientific journal and/or other scientific sources. Embodiments may also recommend supplementation of one or more specific substances to the user based at least in part on the identified change in hormonal balance.
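The "identified as a group" idea above can be illustrated with a minimal sketch: individually minor sound-derived impairment events (dropped objects, awkward movements) are only flagged when their recent daily rate rises well above the user's baseline. The window size and factor are arbitrary illustrative assumptions:

```python
def hormonal_imbalance_flag(daily_event_counts, window=3, factor=2.0):
    """Flag a possible hormonal balance change when the average count of
    minor sound-derived impairment events over the last `window` days
    exceeds `factor` times the user's earlier baseline rate."""
    baseline = sum(daily_event_counts[:-window]) / (len(daily_event_counts) - window)
    recent = sum(daily_event_counts[-window:]) / window
    return recent > factor * max(baseline, 1e-9)

# A typical week followed by three days of clustered minor impairments.
print(hormonal_imbalance_flag([1, 0, 2, 1, 1, 4, 5, 6]))  # True
print(hormonal_imbalance_flag([1, 0, 2, 1, 1, 1, 2, 1]))  # False
```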
FIG. 4 is an illustration of an embodiment 400 of an example apparatus, system, and/or process for processing signals and/or states representing behavioral summary content. In an embodiment, machine learning and/or artificial intelligence and/or other techniques, such as machine learning 440, may detect "silent likes" and/or may generate one or more parameters indicating silent likes, such as silent like parameter 450. As described above, a silent like refers to an at least partially non-explicit indication of approval, appreciation, etc., of content consumed by a particular user, such as user 410. For example, one or more sensors may detect one or more behavioral and/or biological aspects of a particular user (e.g., nodding of the head, dilation of a pupil indicating dopamine release, etc.), which may be understood to indicate approval, appreciation, etc., of content consumed by that particular user. Other biological and/or behavioral aspects that may indicate that a particular user is enjoying currently consumed digital content may include, by way of non-limiting example, head movements, voice volume, heart rate, respiration, sweat, blood pressure, eye movements, blink rate, pupil dilation, and so forth. In an embodiment, well-understood relationships, such as that between dopamine release and pupil dilation, and/or relatively recently and/or substantially currently determined and/or learned relationships, may be utilized, at least in part, to determine a silent like.
In an embodiment, machine learning and/or artificial intelligence and/or other techniques, such as machine learning 440, may obtain behavioral summary content, such as behavioral summary content 420, which may include one or more parameters indicative of a substantially current behavioral and/or biological state of a particular user. Additionally, machine learning and/or artificial intelligence and/or other techniques, such as machine learning 440, may obtain parameters that represent content substantially currently consumed by a particular user, such as user 410. For example, a user, such as user 410, may view a movie available from a streaming media service. Content identifying the movie, such as content consumption parameters 415, for example, may be provided to machine learning and/or artificial intelligence and/or other techniques, such as machine learning 440. In an embodiment, machine learning and/or artificial intelligence and/or other techniques, such as machine learning 440, may process behavioral summary content, such as behavioral summary content 420, and/or content consumption parameters, such as content consumption parameters 415, to determine whether a user, such as user 410, is indicating via some behavioral and/or biological aspect that the particular user approves of and/or enjoys particular content (e.g., movies, music, video games, etc.) being consumed by the particular user. In an embodiment, a signal and/or state indicating a silent like of a particular user may be communicated to a content provider, for example. In this manner, the level of user approval and/or appreciation of particular content may be taken into account when recommending future content for the particular user. Example benefits of detecting "silent likes" may include obtaining "likes" from a user without requiring explicit user input that may interrupt an immersive experience.
Without the ability to detect silent likes, either explicit "likes" would need to be obtained from the user, or no such indication would be obtained from the user at all. Of course, the scope of the subject matter is not limited to these particular examples.
Additionally, in an embodiment, parameters indicative of determined and/or substantially known relationships, such as detected relationships parameter 430, may include parameters indicative of relationships between behavioral summary content and/or user states and/or may include parameters indicative of scientifically determined relationships. In an embodiment, a relationship between content that may be gathered from sensor output and a behavior and/or biological state of a user may be determined based at least in part on parameters representing one or more scientific publications, such as scientific publication parameters 432. In an embodiment, parameters representing other relationships, such as parameter 434, may be determined across multiple users.
To train a machine learning device, system, and/or process, such as device, system, and/or process 440, parameters indicating explicit "likes" and/or "dislikes," such as parameters 412 and/or 414, and/or content consumption parameters, such as consumption parameter 415, may be obtained from one or more users, such as particular user 410. A machine learning device, system, and/or process, such as machine learning device, system, and/or process 440, may correlate the obtained parameters indicating an explicit "like" and/or "dislike" with changes in behavioral summary content (such as behavioral summary content 420) to train one or more machine learning parameters. In an embodiment, one or more users may provide explicit likes and/or dislikes, for example, by appropriate selection of particular elements of a web page, such as via a click.
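The training described above, correlating explicit "like"/"dislike" inputs with behavioral summary content, might be sketched as a simple logistic regression. The toy features, data, and hyperparameters here are invented for illustration and are not taken from the disclosure.

```python
# Illustrative training loop: fit explicit "like" (1) / "dislike" (0)
# labels against behavioral summary features via logistic regression.
import math

def train(samples, labels, lr=0.5, epochs=200):
    """samples: list of feature vectors; labels: 1 = like, 0 = dislike."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))      # sigmoid
            err = p - y
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Predict a silent like (1) or not (0) from behavioral features."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if z > 0 else 0

# Toy data: [pupil_dilation, blink_rate]; likes coincide with dilation.
X = [[0.9, 0.1], [0.8, 0.2], [0.1, 0.7], [0.2, 0.9]]
y = [1, 1, 0, 0]
w, b = train(X, y)
print(predict(w, b, [0.85, 0.15]))  # 1 (predicted silent like)
```

Once trained this way on explicit inputs, the same model can score sessions in which the user gave no explicit input at all, which is the "silent" part of the idea.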
In an example embodiment, rather than basing such a determination on and/or in addition to explicit and/or limited "like" or "dislike" inputs obtained from a user, a digital content provider, such as a digital audio content provider Pandora and/or Spotify, for example, may select and/or evolve content for presentation to a particular user, such as user 410, based at least in part on behavioral summary content, such as behavioral summary content 420, obtained from a processor, such as behavior processing unit 200. For example, in an embodiment, a digital audio content provider can modify a playlist and/or streaming audio content for a particular user based at least in part on a parameter (such as parameter 450) indicating a silent like.
Although some of the embodiments described above discuss silent likes, for example, in connection with consumption of content by a particular user, the scope of the subject matter is not limited in this respect. For example, physiological arousal may play a role in occupational therapy for children with autism and/or Attention Deficit Hyperactivity Disorder (ADHD). Embodiments of devices, systems, and/or processes, such as those described herein, for example, may detect a change in a child's electrodermal activity, for example, at least in part by obtaining a signal and/or status from a wearable sensor that may measure motion and/or one or more biological aspects.
Fig. 5 is an illustration of an embodiment 500 of a system comprising a processor, such as behavior processing unit 520, processing signals and/or states representing behavior content in a computing device. In one embodiment, to generate behavioral summary content, such as behavioral summary content 521, for a particular user, such as user 510, a processor, such as behavioral processing unit 520, may obtain signals and/or states representing the content from one or more sensors, such as one or more of sensors 540. Additionally, in an embodiment, a processor, such as behavior processing unit 520, may process sensor content, such as content from one or more of sensors 540, to generate behavior summary content, such as behavior summary content 521, for a particular user. In one embodiment, a processor, such as behavior processing unit 520, may include behavior content processing circuitry. For example, a processor, such as behavior processing unit 520, may include, in an embodiment, sensor content processing circuitry, such as circuitry 522, and/or may include machine learning circuitry, such as circuitry 524 and/or 526.
In an embodiment, a processor, such as behavior processing unit 520, may provide circuitry to generate behavioral summary content, such as behavioral summary content 521, at least in part for a particular user, such as user 510, to be utilized for any of a wide variety of possible applications and/or purposes. For example, a processor, such as behavior processing unit 520, can generate behavioral summary content, such as behavioral summary content 521, to determine, at least in part, "silent likes," such as can be related to consumption of digital media by a particular user. In an embodiment, the behavioral summary content, such as behavioral summary content 521, can include one or more parameters indicating a silent like, such as one or more silent like parameters 530. Of course, this is merely one example of how behavioral summary content (e.g., behavioral summary content 521) generated by a processor (e.g., behavior processing unit 520) may be utilized and the scope of the subject matter is not limited in these respects.
In an embodiment, one or more sensors, such as sensor 540, may provide content indicative of various aspects of a biological and/or behavioral state of a particular user, and/or indicative of one or more environmental factors and/or other external factors. In an embodiment, as previously described, the sensor 540 may include one or more sensors of one or more sensor types. Additionally, in an embodiment, a processor, such as behavior processing unit 520, may include circuitry, such as circuitry 522, to process content obtained from one or more sensors, such as sensor 540. In an embodiment, the content obtained from a sensor (such as sensor 540) may include digital signals and/or states, analog signals and/or states, or any combination of these. For example, the circuit 522 may include digital circuitry, analog circuitry, or a combination thereof. In one embodiment, sensor content processing circuitry, such as circuitry 522, may convert one or more analog signals to digital signals, although the scope of the subject matter is not limited in this respect. In an embodiment, a circuit, such as circuit 522, may process signals and/or states from one or more sensors (such as sensor 540) to combine, coordinate, normalize, amplify, filter, and/or otherwise condition signals and/or states from one or more sensors (such as sensor 540), although the scope of the subject matter is not limited in these respects.
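The kind of conditioning attributed to circuitry 522 can be illustrated in software terms. This is a rough sketch under assumptions: the disclosure describes circuitry rather than code, and the window size and sample values below are invented.

```python
# Sketch of sensor-content conditioning: smooth a raw sensor stream
# (noise reduction) and normalize it to a common range so streams from
# different sensor types can later be combined on equal footing.

def moving_average(samples, window=3):
    """Simple noise-reduction filter over a raw sample stream."""
    out = []
    for i in range(len(samples)):
        lo = max(0, i - window + 1)
        out.append(sum(samples[lo:i + 1]) / (i - lo + 1))
    return out

def normalize(samples):
    """Scale samples into [0, 1]."""
    lo, hi = min(samples), max(samples)
    if hi == lo:
        return [0.0] * len(samples)
    return [(s - lo) / (hi - lo) for s in samples]

raw = [60, 62, 100, 61, 63, 64]          # heart-rate stream with a spike
smoothed = moving_average(raw)           # the spike is damped
print(normalize(smoothed))               # values now lie in [0, 1]
```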
Additionally, in an embodiment, a processor, such as behavior processing unit 520, may include circuitry for determining and/or selecting weighting parameters and/or for determining and/or selecting particular machine learning devices, systems, and/or processes. For example, the circuitry 524 may determine and/or select one or more particular machine learning techniques, such as one or more particular neural networks, and/or one or more weighting parameters, for use in machine learning operations, for example. In an embodiment, the determination and/or selection of weighting parameters and/or machine learning operations (including one or more neural networks) may be based, for example, at least in part on content, such as parameters 515, identifying one or more aspects of the content (e.g., title, genre, content type, such as music, interactive games, video, etc.) consumed by a particular user, such as user 510. In an embodiment, parameters (such as parameters 515) indicative of one or more aspects of content consumed by one or more particular users (such as user 510) may be provided by one or more content providers (such as content provider 505). In an embodiment, a content provider, such as content provider 505, may comprise, for example, a digital video content provider, a digital audio content provider, a video game content provider, a virtual reality content provider, and so forth, although the scope of the subject matter is not limited in these respects.
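The selection performed by circuitry 524 can be thought of as a dispatch on content type. The table entries and names below are illustrative assumptions, not configurations from the disclosure.

```python
# Hypothetical dispatch: choose a machine learning configuration
# (technique plus weighting-parameter set) from content-identifying
# parameters such as parameters 515. Entries are made-up examples.

ML_CONFIGS = {
    "music": {"model": "rnn_audio", "weights": "audio_v1"},
    "video": {"model": "cnn_av", "weights": "av_v2"},
    "game":  {"model": "rnn_interactive", "weights": "game_v1"},
}
DEFAULT = {"model": "generic_mlp", "weights": "default"}

def select_ml_config(content_params):
    """Pick a machine learning technique and weighting parameters based
    on identifying content (e.g., the content type being consumed)."""
    return ML_CONFIGS.get(content_params.get("content_type"), DEFAULT)

print(select_ml_config({"title": "Some Film", "content_type": "video"}))
# → {'model': 'cnn_av', 'weights': 'av_v2'}
```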
In an embodiment, machine learning circuitry, such as machine learning circuitry 526, may process content, such as may be obtained from circuitry 522 and/or 524, at least in part, to determine, estimate and/or infer one or more parameters that represent, for example, a substantially current biological and/or behavioral state of a particular user. In an embodiment, a machine learning circuit, such as machine learning circuit 526, may generate a representation of a particular user's biological and/or behavioral state, such as behavioral summary content 521, at least in part and/or with a contribution from output generation circuit 528. In an embodiment, behavioral summary content, such as 521, for example, may include parameters representing focus, excitement, anger, fear, fatigue, dehydration, concentration/distraction, boredom, silent like, remorse/recognition of error, hunger, carelessness/precision, comfort, or level of social engagement, or any combination of these. In an embodiment, a processor, such as behavior processing unit 520, may repeatedly and/or substantially periodically obtain sensor content and/or repeatedly and/or substantially periodically generate behavioral summary content, such as behavioral summary content 521, for a particular user, such as user 510.
In an embodiment, a processor, such as behavior processing unit 520, may determine appropriate weights for various sensor combinations and/or for particular parameters (such as parameter 515) provided by one or more content providers (such as content provider 505), for example, during an offline training operation. In another embodiment, during online operation, for example, a set of inputs may be recorded and/or later used as training parameters. For example, a user, such as user 510, may provide explicit likes and/or dislikes, such as may be represented as parameter 513, for particular content provided and/or suggested by one or more content providers, such as content provider 505. Additionally, in an embodiment, the determined and/or substantially known relationships (such as represented by parameters 550) may include relationships between behavioral summary content and/or user states and/or may include scientifically determined relationships. For example, parameters (such as parameter 552) indicative of relationships between content that may be gleaned from sensor output and behavior and/or biological states of a user may be determined at least in part via one or more scientific publications. In an embodiment, parameters representing other relationships, such as parameter 554, may be determined, for example, across multiple users and/or across groups.
Although embodiment 500 is described as detecting a silent like, for example, the scope of the claimed subject matter is not limited in this respect. For example, the BPU 520 may be utilized to generate behavioral summary content and/or execute decision making and/or make recommendations for any of a wide variety of applications, such as the example applications described herein.
Fig. 6 is an illustration of an embodiment 600 of a system comprising a processor, such as behavior processing unit 620, processing signals and/or states representing behavior content in a computing device. In one embodiment, to generate behavioral summary content, such as behavioral summary content 621, for a particular user, such as user 610, a processor, such as behavioral processing unit 620, may obtain signals and/or states representing the content from one or more sensors, such as one or more of sensors 640. Additionally, in an embodiment, a processor, such as behavior processing unit 620, may process sensor content, such as content from one or more of sensors 640, to generate behavior summary content, such as behavior summary content 621, for a particular user. In one embodiment, a processor, such as behavior processing unit 620, may include behavior content processing circuitry. For example, a processor, such as behavior processing unit 620, in one embodiment may include sensor content processing circuitry, such as circuitry 622, and/or may include machine learning circuitry, such as circuitry 624 and/or 626. In an embodiment, a processor, such as BPU 620, may also obtain content from sensors, such as sensor 640, to track one or more environmental aspects (e.g., ambient sound, temperature, barometric pressure, altitude, location, etc.).
In an embodiment, a processor, such as behavior processing unit 620, may provide circuitry to generate behavioral summary content, such as behavioral summary content 621, at least in part for a particular user, such as user 610, to be utilized for any of a wide variety of possible applications and/or purposes. For example, a processor, such as behavior processing unit 620, may generate behavioral summary content, such as behavioral summary content 621, to, for example, at least partially identify an imbalance of a particular substance within a human body. In an embodiment, behavioral summary content, such as behavioral summary content 621, may include one or more parameters indicative of eye movement, speech and/or voice aspects, ambient sounds, and so forth. Of course, the scope of the subject matter is not limited in these respects.
In an embodiment, a processor, such as behavior processing unit 620, may include circuitry for determining and/or selecting weighting parameters and/or for determining and/or selecting a particular machine learning device, system, and/or process. For example, circuitry 624 may determine and/or select one or more particular machine learning techniques, such as one or more particular neural networks, and/or one or more weighting parameters, for use in machine learning operations, for example. In an embodiment, the determination and/or selection of weighting parameters and/or machine learning operations (including one or more neural networks) may be based, for example, at least in part on content, such as parameters 643, identifying one or more aspects of a substance consumed by a particular user, such as user 610.
In an embodiment, machine learning circuitry, such as machine learning circuitry 626, may process content, such as may be obtained from circuitry 622 and/or 624, at least in part, to determine, estimate, and/or infer one or more parameters that represent, for example, a substantially current biological and/or behavioral state of a particular user. In an embodiment, a machine learning circuit, such as machine learning circuit 626, may generate a representation of a particular user's biological and/or behavioral state, such as behavioral summary content 621, at least in part and/or with a contribution from output generation circuit 628. In an embodiment, behavioral summary content, such as 621, may include, for example, parameters representing focus, excitement, anger, fear, fatigue, dehydration, concentration/distraction, boredom, silent like, remorse/recognition of error, hunger, carelessness/precision, comfort, or level of social engagement, or any combination of these. In an embodiment, a processor, such as behavior processing unit 620, may repeatedly and/or substantially periodically obtain sensor content and/or repeatedly and/or substantially periodically generate behavioral summary content, such as behavioral summary content 621, for a particular user, such as user 610. Additionally, as described above, behavioral summary content, such as behavioral summary content 621, may include one or more parameters indicating voice intonation, voice mood, volume, frequency, pitch, timbre, and so forth. Additionally, as also mentioned above, behavioral summary content, such as behavioral summary content 621, may include one or more parameters indicative of quickness of eye movement, blink rate, focusing ability, and/or pupil dilation, to name a few additional non-limiting examples.
In an embodiment, a processor, such as behavior processing unit 620, may determine appropriate weights for various sensor combinations and/or for particular parameters provided by one or more content providers, for example, during an offline training operation. In another embodiment, during online operation, for example, a set of inputs may be recorded and/or later used as training parameters. For example, a user, such as user 610, for example, may explicitly provide input related to replenishment and/or consumption of a particular substance and/or may provide input related to behavior indicative of hyperexcitability of a particular individual and/or indicative of other observed behavior. Additionally, in an embodiment, the determined and/or substantially known relationships (such as represented by parameters 650) may include relationships between behavioral summary content and/or user states and/or may include scientifically determined relationships. For example, parameters indicative of relationships between content that may be gleaned from sensor output and behavior and/or biological states of a user (such as parameters 652) may be determined at least in part via one or more scientific publications. In an embodiment, parameters representing other relationships, such as parameter 654, may be determined, for example, across multiple users and/or across groups.
Fig. 7 is a schematic block diagram depicting an embodiment 700 of an example apparatus (such as a behavior processing unit) for processing content (such as content obtained from sensors 730) in a computing device to generate signals and/or states representing behavior content. In an embodiment, a processor, such as behavior processing unit 700, may process digital signals and/or states or analog signals and/or states, or a combination thereof (e.g., a mixed signal). Any of a wide variety of digital and/or analog circuit types may be utilized to process digital, analog, and/or mixed-signal signals and/or states, as explained more fully below. In an embodiment, one or more aspects of a processor (such as behavior processing unit 700) may be implemented to operate in the analog domain, while one or more other aspects may be implemented to operate in the digital domain. In other embodiments, a processor, such as behavior processing unit 700, may be implemented to operate substantially entirely in the digital domain and/or the analog domain.
In an embodiment, a processor, such as behavior processing unit 700, may generally obtain content from sensors (such as one or more sensors 730) substantially continuously and/or may generate output signals and/or states, such as behavior summary content 725, substantially continuously. The generated output signals and/or states, such as behavior summary content 725, may be provided to one or more decision-making systems, such as decision-making system 740, for example.
In an embodiment, a sensor parameter processing stage, such as sensor parameter processing stage 701, may obtain signals and/or states (e.g., digital, analog, and/or mixed signals) from one or more sensors, such as sensor 730. In an embodiment, a sensor parameter processing stage, such as sensor parameter processing stage 701, may process signals and/or states from one or more sensors, for example, at least in part by combining content, adjusting timing, performing noise reduction and/or other signal reduction operations, normalizing content, or any combination of these. However, the scope of the claimed subject matter is not limited in this respect.
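Of the operations listed for stage 701, the "combining content" and "adjusting timing" steps can be made concrete with a small sketch. The streams, timestamps, and zero-order-hold policy below are assumptions for illustration; the disclosure does not prescribe a particular alignment scheme.

```python
# Sketch of stage-701-style timing adjustment and combining: align two
# sensor streams that report at different times into one record per
# timestamp of the faster stream.

def align_and_combine(stream_a, stream_b):
    """Each stream is a list of (timestamp, value) pairs, sorted by
    time. For every timestamp in stream_a, pair it with the most recent
    stream_b value (a simple zero-order hold)."""
    combined = []
    j = -1
    for t, a in stream_a:
        while j + 1 < len(stream_b) and stream_b[j + 1][0] <= t:
            j += 1
        b = stream_b[j][1] if j >= 0 else None
        combined.append((t, a, b))
    return combined

heart = [(0, 62), (10, 64), (20, 70)]          # heart-rate samples
pupil = [(5, 0.3), (18, 0.8)]                  # pupil-dilation samples
print(align_and_combine(heart, pupil))
# → [(0, 62, None), (10, 64, 0.3), (20, 70, 0.8)]
```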
Additionally, in an embodiment, sensor content manipulation circuitry, such as sensor content manipulation circuitry 717, may direct signals and/or states obtained from a sensor (such as from sensor 730) to one or more sensor processing units, such as one or more of Sensor Processing Units (SPUs) 705. In an embodiment, a sensor processing unit, such as SPU 705, may be configured via one or more control signals, such as control signals communicated between a control unit, such as control unit 703, and the sensor processing unit, such as SPU 705. A sensor processing unit, such as SPU 705, may prepare sensor content, such as signals and/or states obtained from one or more sensors (such as sensor 730), for further processing, for example, by a machine learning processing stage (such as machine learning processing stage 702). In an embodiment, sensor content manipulation circuitry, such as sensor content manipulation circuitry 717, may direct content, for example, based at least in part on one or more control signals obtained from a control unit, such as control unit 703, and/or from memory, such as memory 712.
In an embodiment, a sensor processing stage, such as sensor processing stage 701, may include one or more sensor processing units, such as sensor processing unit 705, which may be configured to operate alone or in one or more various combinations. A sensor processing unit, such as SPU 705, may perform any of a variety of operations individually and/or in concert that may be specified and/or implemented. Such operations may include, for example, combining signals and/or states, adjusting the timing of signals and/or states, performing noise reduction and/or other signal reduction operations, and/or normalizing content, to name a few examples.
One or more sensor processing units, such as SPU 705, may be implemented to operate in the analog domain and/or one or more units may be implemented to operate in the digital domain. In an embodiment, a sensor, such as sensor 730, may provide a signal and/or state that includes an analog signal and/or includes digital content (e.g., a signal and/or state). Additionally, in an embodiment, one or more analog signals obtained by one or more sensors (e.g., 730) may be converted to digital content using analog-to-digital conversion circuitry. In other embodiments, analog signals obtained from sensors (such as sensor 730) may be maintained as analog signals for processing by one or more sensor processing units (such as SPU 705), for example. Additionally, in an embodiment, an individual sensor processing unit, such as SPU 705, may be implemented analog and/or digitally based at least in part on the particular task to be performed by a particular SPU and/or based at least in part on the particular signal type to be obtained from a sensor, such as sensor 730. In one embodiment, one or more of the various filters, signal amplifiers and/or signal attenuation circuits, which may vary from relatively simpler to relatively more complex, may be implemented by one or more particular sensor processing units (such as SPU 705), for example. Sensor processing unit operations, such as the example operations mentioned herein, are of particular relevance in a larger context of behavioral summary content generation in connection with one or more machine learning units. For example, sensor processing unit operations may be performed in view of the ultimate goal of behavioral summary content generation.
In one embodiment, a particular sensor processing unit, such as SPU 705, may include, for example, noise reduction, filtering, attenuation, combining, amplification circuitry, and so forth, implemented to operate in the analog domain. The analog circuitry may, for example, include one or more operational amplifiers, transistors, capacitors, resistors, and/or the like, although the scope of the claimed subject matter is not limited in this respect. Circuits such as noise reduction, filtering, attenuation, combining, amplification circuits, etc. may also be implemented, for example, in the digital domain, or in a combination of analog and/or digital. As another example, a particular sensor processing unit, such as a particular SPU 705, may be implemented in analog and/or digital to combine signals and/or states. In an embodiment, the units that combine signals and/or states may be implemented in the analog domain or the digital domain, or a combination thereof. In an embodiment, an analog hysteresis "winner-takes-all" circuit, for example, may be implemented at least in part to improve noise robustness and/or at least partially mitigate timing differences between sensor input streams. Of course, the scope of the subject matter is not limited in these respects. Further, noise reduction, filtering, attenuation, combining, and/or amplification are merely example tasks that may be performed by one or more sensor processing units (such as SPU 705), and as such, the scope of the claimed subject matter is not limited in these respects.
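The behavior of a hysteresis "winner-takes-all" circuit can be expressed as a small software analogue. This is a sketch under assumptions: the disclosure describes an analog circuit, and the hysteresis margin below is an invented value.

```python
# Software analogue of a hysteresis "winner-takes-all" stage: the
# strongest input channel wins, but the incumbent winner keeps its
# position unless a challenger beats it by a margin, which improves
# robustness to noise and small timing differences between streams.

def winner_takes_all(inputs, current_winner=None, hysteresis=0.1):
    """inputs: dict mapping channel name -> signal level.
    Returns the winning channel name."""
    best = max(inputs, key=inputs.get)
    if current_winner is not None and current_winner in inputs:
        if inputs[best] - inputs[current_winner] < hysteresis:
            return current_winner          # challenger not clearly ahead
    return best

# Channel "a" keeps winning despite a slightly higher, noisy "b"...
print(winner_takes_all({"a": 0.50, "b": 0.55}, current_winner="a"))  # a
# ...but a clear margin flips the winner.
print(winner_takes_all({"a": 0.50, "b": 0.75}, current_winner="a"))  # b
```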
Additionally, in an embodiment, a sensor processing unit, such as SPU 705, may be implemented to generate an output that may exhibit a range of approximations, inaccuracies, and/or irreproducibility. In an embodiment, a machine learning unit, such as ML 706, may help mitigate consequences that may otherwise occur due to approximations, inaccuracies, and/or irreproducibility that a sensor processing unit, such as SPU 705, may exhibit. As utilized herein, "reproducible" in the context of a sensor processing unit, such as SPU 705, refers to the ability to generate the same output for a given repeating set of inputs. "Non-reproducibility" in this context means that one or more sensor processing units (such as SPU 705) do not necessarily generate the same output for a given repeating set of inputs. That is, in one embodiment, one or more sensor processing units, such as SPU 705, may be implemented in a manner that does not guarantee similar outputs for similar sets of inputs.
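One simple way a downstream stage might tolerate non-reproducible SPU outputs is to smooth repeated readings so that run-to-run variation has limited effect. This sketch is an assumption-laden illustration (the smoothing factor and data are invented); the disclosure attributes the mitigation to machine learning generally, not to this specific filter.

```python
# Exponential moving average over possibly-irreproducible outputs:
# one-off deviations are damped, so the estimate tracks the underlying
# level rather than any single noisy reading.

def smooth(readings, alpha=0.2):
    """Exponentially smooth a sequence of readings; lower alpha gives
    more damping of run-to-run variation."""
    est = readings[0]
    out = [est]
    for r in readings[1:]:
        est = alpha * r + (1 - alpha) * est
        out.append(est)
    return out

# The same stimulus yields slightly different raw outputs each run,
# but the smoothed estimate stays near the underlying level (~1.0).
noisy = [1.0, 1.3, 0.8, 1.1, 0.9]
print([round(v, 3) for v in smooth(noisy)])
```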
In one embodiment, content manipulation circuitry, such as content manipulation circuitry 718, may direct content (such as signals and/or states) generated by one or more sensor processing units, such as SPU 705, to a machine learning phase, such as machine learning phase 702. Content (such as signals and/or states 721) generated by one or more sensor processing units (such as SPU 705), for example, may also be stored at least temporarily in a memory (such as memory 716). In an embodiment, memory 716 may comprise, for example, a buffer, such as a first-in-first-out buffer, although the scope of the claimed subject matter is not limited in this respect. In an embodiment, content manipulation circuitry, such as content manipulation circuitry 718, may direct content, for example, based at least in part on one or more control signals obtained from a control unit, such as control unit 703 and/or from memory, such as memory 712.
A machine learning stage, such as machine learning stage 702, can include content manipulation circuitry, such as content manipulation circuitry 708, which can direct content (such as signals and/or states 721), obtained from a sensor processing stage, such as sensor processing stage 701, to one or more machine learning units (ML), such as machine learning unit 706, for example. In an embodiment, content manipulation circuitry, such as content manipulation circuitry 708, may direct content, such as signals and/or states 721, based at least in part on one or more control signals obtained from a control unit, such as control unit 703, and/or from memory, such as memory 713.
In an embodiment, a machine learning unit, such as machine learning unit 706, may be configured via one or more control signals, such as control signals communicated between a control unit, such as control unit 703, and a machine learning unit, such as machine learning unit 706. In an embodiment, one or more machine learning units, such as machine learning unit 706, may be configured to operate alone or in a combination with one or more other machine learning units. In an embodiment, individual machine learning units, such as machine learning unit 706, may implement particular machine learning techniques. Additionally, one or more machine learning units, such as machine learning unit 706, may be implemented to operate in the analog domain or the digital domain, or a combination thereof. For example, a machine learning unit operating in the analog domain may include voltage and/or current summing circuits to sum several signals and/or states, and/or may include devices that may apply weighting factors to individual signals and/or states, such as variable impedance devices. Of course, the scope of the claimed subject matter is not limited in these respects.
Content manipulation/selection circuitry, such as content manipulation/selection circuitry 707, may, in one embodiment, select and/or combine content generated by one or more machine learning units, such as machine learning unit 706. Additionally, content manipulation/selection circuitry, such as content manipulation/selection circuitry 707, may direct outputs, such as signals and/or states representing behavior summary content 725 to a decision-making system, such as decision-making system 740. In an embodiment, a control unit, such as control unit 703, may obtain at least a portion of the output generated by a machine learning unit, such as machine learning unit 706.
In an embodiment, a control unit, such as control unit 703, may configure and/or control one or more aspects of behavior processing unit 700. In an embodiment, a control unit, such as control unit 703, may obtain inputs from a variety of sources and/or may control various aspects of behavior processing unit 700 based at least in part on the obtained inputs. In an embodiment, the control unit inputs may be obtained from units within the behavior processing unit 700 itself and/or from one or more other sources. For example, the control unit 703 may obtain user parameters 715 (e.g., a user ID or other parameters describing a particular user). In an embodiment, user parameters, such as parameter 715, may be obtained from one or more external sources and/or may be obtained from one or more memories within behavior processing unit 700. For example, user parameters for one or more particular users may be stored in a memory (such as memory 704). Various aspects of the behavior processing unit 700 may be configured and/or reconfigured based at least in part on parameters that may be stored in memory (such as memory 704) on an individual user basis. For example, a control unit, such as control unit 703, may be in communication with a memory, such as memory 704, to obtain configuration content for a particular user from memory 704, and/or may configure behavior processing unit 700 based at least in part on the obtained configuration content. Additionally, in an embodiment, a control unit, such as control unit 703, may obtain content from a decision-making system, such as decision-making system 740, or from one or more external sources, such as external system 750.
Although the example behavior processing unit 700 is depicted as having particular memory devices, such as memories 704, 712, 713, and/or 716, other embodiments may include memory elements distributed in various regions of the processing unit. For example, memory elements may be included in one or more sensor processing units 705 and/or one or more machine learning units 706. Further, a memory, such as memory 704, may be implemented as a hierarchy of memory devices and/or technologies that may allow for various sizes and/or memory access speeds. Additionally, a memory, such as memory 704, for example, may store machine learning weighting parameters and/or other machine learning parameters, and/or may also store control signals.
In an embodiment, a control unit, such as control unit 703, may generate one or more output signals and/or states, such as one or more control signals, based at least in part on inputs obtained by the control unit. The control signal output generation may be a function of one or more inputs that may include, for example, a user identification parameter, a content type parameter, a context parameter, a task parameter, a sensor availability parameter, or a behavioral summary content designation parameter, or any combination of these. Of course, these are merely example types of inputs that may be obtained by a control unit (such as control unit 703), and the scope of the claimed subject matter is not limited to these particular examples.
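The description of control-signal generation as a function of the obtained inputs can be sketched as a pure function. The field names and mapping rules below are illustrative assumptions; the disclosure lists the input types but not any particular mapping.

```python
# Hypothetical sketch of control unit 703's output generation as a
# function of its inputs (user identification, content type, sensor
# availability). The rules here are invented for illustration.

def generate_control_signals(user_id, content_type, sensors_available):
    """Map control-unit inputs to a configuration for the sensor
    processing and machine learning stages."""
    return {
        "user_id": user_id,
        # Enable only the sensors reported as available (parameter 714).
        "enabled_sensors": sorted(sensors_available),
        # Choose a machine learning configuration by content type.
        "ml_config": "audio" if content_type == "music" else "general",
    }

print(generate_control_signals("user-42", "music", {"heart_rate", "pupil"}))
```

A real control unit would also fold in context, task, and behavioral-summary-designation parameters; the point of the sketch is only that the control outputs are a function of the obtained inputs.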
As described above, a control unit, such as control unit 703, may obtain user parameters 715 that may include user identifying content and/or other parameters describing a particular user. Additionally, in an embodiment, a control unit, such as control unit 703, for example, may obtain parameters describing content being consumed by the user (e.g., music, movies, games, digital books, etc.), parameters describing tasks being performed by the user, or parameters describing a context and/or environment, or any combination of these. In an embodiment, context and/or environment parameters 711 may be provided by and/or obtained from an external system (such as external system 750). Additionally, in an embodiment, content and/or task parameters 710 may be provided by and/or obtained from a decision-making system (such as decision-making system 740). For example, a parameter describing a type of content may indicate that a user is listening to music and/or otherwise consuming music, rather than participating in an interactive game. Additionally, for example, parameters describing the user/operator and/or the task may indicate the type of task being performed (e.g., flying, driving, performing surgery, etc.), and/or may indicate a particular user/operator. Additionally, for example, parameters describing a context and/or environment may indicate a particular context (e.g., location, time period, date, etc.), the presence of other individuals, or other contextual information, or any combination of these.
A control unit, such as control unit 703, for example, may also obtain a parameter, such as parameter 714, that may indicate sensor availability. Furthermore, a control unit, such as the control unit 703, may for example obtain parameters, such as the parameters 719, which may indicate one or more specific parameters and/or parameter types of the behavior profile (such as the behavior profile 725) to be generated by the machine learning stage 702, e.g. relatively preferentially. Additionally, one or more parameters 720 representing one or more aspects of the behavioral summary content 725 to be generated by the machine learning stage 702 may be provided to and/or obtained by a control unit (such as the control unit 703). For example, in one embodiment, the parameters 720 may include feedback to the control unit 703 that may affect the behavior processing unit operation.
As described above, a control unit, such as control unit 703, may generate one or more control signals based at least in part on input available from any one of a range of sources. For example, input obtained by control unit 703 may allow particular content from one or more memory elements (such as one or more of memories 704, 712, 713, and/or 716) to be selected for utilization in configuring sensor processing stage 701 and/or machine learning stage 702 for processing. For example, sensor processing stage 701 and/or machine learning stage 702 may be configured based on a particular user/operator, a particular task, or a particular context, or a combination thereof. By adjusting the processing in this manner, improved behavioral summary content may be generated, and/or efficiency may be improved (e.g., improved confidence in behavioral summary content while utilizing relatively fewer resources). Additionally, in an embodiment, the control unit 703 may direct the output of the sensor processing stage 701 (e.g., the intermediate results) to a particular machine learning unit 706 via control of the manipulation circuit 708, based at least in part on the input obtained by the control unit 703. Similarly, the control unit 703 may select outputs from one or more particular machine learning units 706 via control of the manipulation/selection circuitry 707 based at least in part on the obtained inputs. Additionally, a weighting of the input to the machine learning unit 706 may be determined based at least in part on the obtained inputs. For example, a control unit, such as control unit 703, may in an embodiment direct, select, and/or weight intermediate results (e.g., content generated by sensor processing stage 701) based on user/operator identification, content type, environmental context, or sensor availability, or any combination of these. Of course, the scope of the claimed subject matter is not limited in these respects.
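The steering and weighting of intermediate results described above might, purely as an illustration, be modeled in software as a routing table keyed by a control input such as content type; the routing table, unit identifiers, and weights below are invented for the sketch:

```python
# Hypothetical sketch of directing weighted intermediate results from the
# sensor processing stage to selected machine learning units, keyed here by
# a content-type control input. All ids and weights are illustrative.

ROUTING_TABLE = {
    # content_type -> {ml_unit_id: input_weight}
    "music":   {1: 1.0, 2: 0.5},
    "driving": {0: 1.0},
    "default": {0: 1.0},
}

def steer_intermediate_results(results, content_type):
    """Fan weighted copies of SPU outputs out to the selected ML units."""
    routing = ROUTING_TABLE.get(content_type, ROUTING_TABLE["default"])
    return {unit: [w * r for r in results] for unit, w in routing.items()}

routed = steer_intermediate_results([0.2, 0.8], "music")
print(routed)  # {1: [0.2, 0.8], 2: [0.1, 0.4]}
```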
Additionally, in an embodiment, resource allocation within a processor, such as behavior processing unit 700, may be based at least in part on behavioral profile content specifying parameters, such as parameter 719. In an embodiment, a control unit, such as control unit 703, may obtain behavior profile specifying parameters 719, which may indicate, for example, a relative priority of one or more behavior profile parameters, and may select a particular sensor processing unit 705 and/or a particular machine learning unit 706 based at least in part on the specified behavior profile specifying parameters. "Relative priority," in the context of behavioral summary content specifying parameters such as parameter 719, refers to one or more particular parameters being prioritized relative to other parameters. For example, the behavioral profile specifying parameter 719 may indicate an "anger" parameter. Resources (e.g., SPU 705, machine learning unit 706, memory, etc.) sufficient, for example, to process the "anger" parameter to a specified confidence level may be allocated, even at the expense of resources that might otherwise be allocated to generate other behavioral profile content parameters. The control unit 703 may select resources from the sensor processing stage 701 and/or the machine learning stage 702 via one or more control signals to generate behavioral summary content according to the specified parameters. In this manner, relatively prioritized content may be generated relatively more efficiently. Behavioral profile specifying parameters, such as parameter 719, may also indicate, in an embodiment, a relative priority with respect to a tradeoff between power consumption and generation of particular behavioral profile content. Additionally, in an embodiment, relatively prioritized content may be generated at the expense of other behavioral summary content.
For example, behavioral profile parameters indicative of anger and/or fatigue may be relatively prioritized relative to excitement and/or hunger parameters, and control unit 703 may configure sensor processing stage 701 and/or machine learning stage 702 accordingly. Additionally, in an embodiment, self-feedback and/or output monitoring content, such as content 720, may allow for control adjustments, such as selecting additional/different machine learning units and/or sensor processing units and/or otherwise adjusting resource utilization within behavior processing unit 700. Such adjustments may be made, for example, to meet specified relative priorities, specified confidence levels in the generated output, and so forth.
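One possible, purely illustrative model of this priority-driven allocation is a greedy assignment in which higher-priority behavioral profile parameters claim processing units first, potentially starving lower-priority parameters; the unit counts and parameter demands are invented:

```python
# Illustrative sketch of priority-driven resource allocation: parameters
# listed first (highest priority) claim sensor processing / machine
# learning units before lower-priority parameters. Demands are invented.

def allocate_units(priorities, demand_per_param, total_units):
    """priorities: parameter names ordered highest-priority first."""
    allocation, remaining = {}, total_units
    for param in priorities:
        granted = min(demand_per_param[param], remaining)
        allocation[param] = granted
        remaining -= granted
    return allocation

alloc = allocate_units(
    ["anger", "fatigue", "excitement", "hunger"],
    {"anger": 3, "fatigue": 2, "excitement": 2, "hunger": 1},
    total_units=6,
)
print(alloc)  # {'anger': 3, 'fatigue': 2, 'excitement': 1, 'hunger': 0}
```

Note how "excitement" and "hunger" receive fewer or no units once the prioritized "anger" and "fatigue" parameters are satisfied, mirroring the trade-off described above.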
Although some embodiments described herein refer to neural network techniques for machine learning, the scope of the subject matter is not limited in this respect. Other embodiments may incorporate other machine learning techniques currently existing or to be developed in the future. Additionally, for embodiments implementing a neural network, sensors may be removed from the system during offline pre-deployment training operations, for example, so that the neural network may determine appropriate weights for various sensor combinations. In another embodiment, during online operation, for example, a set of input biomarkers may be recorded and/or later used as training parameters, where predicted behavior processing unit outputs may be utilized, at least in part, to train one or more networks that may lack some subset of the initial inputs. For online inference, an appropriate neural network may be selected based at least in part on available sensor inputs. Such an arrangement may be advantageous in situations where an operator may remove one or more sensors from the system, device, and/or process. For example, during surgery, the surgeon may remove his or her glasses that may be tracking eye movements. In an embodiment, for example, different neural network configurations may be selected at least partially in response to such changes in available sensor inputs. For example, a control unit, such as control unit 703, may detect a change in sensor availability (e.g., represented by sensor availability input 714), and/or may reconfigure sensor processing unit 705 and/or machine learning unit 706 based at least in part on the detected change in sensor availability.
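The selection of a neural network matched to the currently available sensor inputs might, for illustration only, look like the following; the network names and sensor labels are hypothetical placeholders:

```python
# Hypothetical sketch: pick the pre-trained network whose training sensor
# subset best matches the sensors currently available (e.g., after a
# surgeon removes eye-tracking glasses). Names are illustrative only.

TRAINED_NETWORKS = {
    frozenset({"eye_tracker", "heart_rate", "gsr"}): "net_full",
    frozenset({"heart_rate", "gsr"}): "net_no_eyes",
    frozenset({"heart_rate"}): "net_minimal",
}

def select_network(available_sensors):
    """Select the network trained on the largest usable sensor subset."""
    candidates = [s for s in TRAINED_NETWORKS if s <= set(available_sensors)]
    if not candidates:
        raise ValueError("no network matches the available sensors")
    return TRAINED_NETWORKS[max(candidates, key=len)]

# Eye tracker removed mid-operation: fall back to a smaller network.
print(select_network({"heart_rate", "gsr"}))  # net_no_eyes
```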
FIG. 8 is an illustration of an embodiment 800 of an example process for generating behavioral summary content. Embodiments in accordance with claimed subject matter may include all of blocks 810-870, fewer than blocks 810-870, and/or more than blocks 810-870. Additionally, the order of blocks 810-870 is merely an example order, and the scope of the claimed subject matter is not limited in these respects.
In an embodiment, a behavior processing unit (such as behavior processing unit 700) may obtain one or more parameters describing a particular user/operator (e.g., a user ID), one or more parameters describing content consumed by the user, or one or more parameters indicating sensor availability, or any combination of these. In an embodiment, a behavior processing unit, such as behavior processing unit 700, may configure and/or reconfigure one or more aspects of the behavior processing unit based at least in part on the obtained parameters. For example, beginning at block 810, an example process for configuring a behavior processing unit (such as behavior processing unit 700) is described.
As indicated at block 815, it may be determined whether a particular user/operator is specified. As described above, a particular user may be identified via one or more descriptive parameters (such as parameter 715) obtained by a control unit (such as control unit 703). At least in part in response to determining that no particular user/operator is specified, default configuration parameters may be utilized, as indicated at block 825. In response, at least in part, to a determination that a particular user/operator is specified, a particular memory offset parameter for the particular user/operator may be obtained. For example, a control unit, such as control unit 703, may obtain from memory an offset amount specified for the identified user/operator.
In addition, as indicated at block 830, it may be determined whether a particular content type (e.g., the type of content being consumed by a particular user) is specified. In an embodiment, the particular content type may be identified via one or more parameters (such as parameter 709) obtained by a control unit (such as control unit 703). In response, at least in part, to determining that a particular content type is not specified, default configuration parameters related to the content type may be utilized, as indicated at block 835. In response, at least in part, to determining that a particular content type is specified, a particular memory offset parameter for the particular content type may be obtained. For example, a control unit, such as control unit 703, may obtain from memory an offset specified for the identified content type.
In an embodiment, one or more parameters indicative of sensor availability may be specified. For example, a control unit, such as control unit 703, may obtain one or more parameters 714 that indicate sensor availability. In an embodiment, a memory offset parameter may be obtained based on the indicated sensor availability. In an embodiment, the availability and/or unavailability of particular sensors and/or sensor types may result in different behavior processing unit configurations.
As indicated at block 860, for example, a control unit, such as control unit 703, may perform a read operation from one or more memory locations indicated by one or more offset parameters obtained in response to obtaining a user-specific parameter, a content type parameter, and/or a sensor availability parameter to obtain configuration parameters for a behavior processing unit (such as 700). In an embodiment, one or more control signals may be generated, such as indicated at block 870, to at least partially configure one or more aspects of a behavior processing unit (such as 700). For example, based at least in part on the generated control signals, one or more particular sensor processing units, such as 705, and/or one or more particular machine learning units, such as 706, may be selected for sensor content processing and/or behavior profile content generation operations. Additionally, for example, the control signals may direct sensor content to a particular sensor processing unit via the content manipulation circuitry, may direct intermediate results generated by the sensor processing unit to a particular machine learning unit, and/or may select an output of a particular machine learning unit.
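The configuration flow of FIG. 8 (blocks 815-870) might be walked through, as a sketch only, by resolving memory offsets for the user and content type (with defaults where unspecified) and reading configuration parameters at the combined location; all offsets and memory contents below are invented:

```python
# Illustrative walk-through of FIG. 8: combine per-user and per-content-type
# memory offsets, then read configuration parameters at that location.
# Offsets, addresses, and contents are hypothetical.

USER_OFFSETS = {"pilot_7": 0x100}      # block 820: user-specific offsets
CONTENT_OFFSETS = {"music": 0x10}      # block 840: content-type offsets
CONFIG_MEMORY = {
    0x110: {"spu": 1, "ml_unit": 2},   # pilot_7 + music configuration
    0x000: {"spu": 0, "ml_unit": 0},   # default configuration (blocks 825/835)
}

def resolve_config(user=None, content_type=None):
    offset = USER_OFFSETS.get(user, 0) + CONTENT_OFFSETS.get(content_type, 0)
    return CONFIG_MEMORY.get(offset, CONFIG_MEMORY[0x000])  # block 860 read

print(resolve_config("pilot_7", "music"))  # {'spu': 1, 'ml_unit': 2}
print(resolve_config())                    # {'spu': 0, 'ml_unit': 0}
```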
FIG. 9 is an illustration of an embodiment 900 of an example process for generating behavioral summary content. Embodiments in accordance with the claimed subject matter may include all of blocks 910-940, fewer than blocks 910-940, and/or more than blocks 910-940. Additionally, the order of blocks 910-940 is merely an example order, and the scope of the claimed subject matter is not limited in these respects.
In an embodiment, one or more parameters indicative of one or more particular output parameters (e.g., prioritized behavior profile content parameters) may be obtained by a behavior processing unit, such as behavior processing unit 700. In an embodiment, a behavior processing unit, such as 700, may configure and/or reconfigure one or more aspects of the behavior processing unit based, at least in part, on the obtained parameters. For example, beginning at block 910, an example process for configuring a behavior processing unit (such as behavior processing unit 700) is described.
As indicated at block 920, it may be determined whether a particular output parameter relative priority is specified. As described above, a particular behavioral profile parameter may be designated as relatively prioritized, e.g., via one or more parameters (such as parameter 719) obtained by a control unit (such as control unit 703). Additionally, at least in part in response to determining that a relatively prioritized output parameter is specified, it may additionally be determined whether the relatively prioritized output parameter fails to meet or exceed a specified confidence parameter, as indicated at block 930.
In an embodiment, as indicated at block 940, for example, additional resources, such as additional and/or different sensor processing units 705, additional and/or different machine learning units 706, or any combination of these, may be allocated to processing sensor content and/or generating specified behavioral summary content. In an embodiment, one or more control signals may be generated to at least partially reconfigure one or more aspects of a behavior processing unit, such as behavior processing unit 700. For example, based at least in part on the generated control signals, one or more particular additional and/or different sensor processing units, such as sensor processing unit 705, and/or one or more particular additional and/or different machine learning units, such as machine learning unit 706, may be assigned for sensor content processing and/or behavior profile generation.
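The reallocation loop of FIG. 9 might be sketched, for illustration, as assigning additional units until a prioritized output parameter's confidence meets the specified level; the confidence model below is an invented stand-in, not anything from the specification:

```python
# Illustrative sketch of blocks 930-940: allocate additional processing
# units while a prioritized parameter's confidence falls short of the
# specified confidence parameter. The confidence model is a stand-in.

def reallocate_until_confident(confidence_for_units, target, max_units):
    units = 1
    while confidence_for_units(units) < target and units < max_units:
        units += 1  # assign an additional/different SPU or ML unit
    return units

# Stand-in model: each extra unit adds 0.15 confidence, capped at 0.95.
model = lambda n: min(0.5 + 0.15 * (n - 1), 0.95)
print(reallocate_until_confident(model, target=0.9, max_units=8))  # 4
```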
FIG. 10 is an illustration of an embodiment 1000 of an example apparatus, system, and/or process for processing signals and/or states representative of behavioral content to transfer responsibility for control of one or more aspects of a particular machine from a particular operator to an automated apparatus, system, and/or process. In some cases, the transition may occur from a human-directed machine to human-machine cooperation in which a relatively large number of decisions are made and/or performed by the machine. Some machine-based decisions may depend, at least in part, on the operator's state and/or on the ability to respond substantially immediately and/or relatively quickly to changes in the operator's state. Moreover, more and more sensors, such as sensors 1030, may be capable of generating signals and/or states representative of various aspects of the operator's biological and/or behavioral state. Embodiments include, for example, a behavior processing unit 1010 that can coordinate content generated by sensors (such as sensors 1030) and/or can accelerate processing of sensor content to provide, relatively quickly, a relatively feature-rich and/or normalized state (e.g., behavior summary parameters 1015) of an operator to one or more external decision-making systems, such as decision-making system 1020. In an embodiment, by integrating and/or isolating processing of sensor content into a specialized device (such as BPU 1010), for example, the availability of resources sufficient to determine the behavioral and/or biological state of an operator may be improved and/or ensured. Additionally, embodiments may provide, for example, accelerated processing of real-time biosensor content and/or machine learning for inferring the behavioral and/or biological state of an operator from processed sensor content.
Some embodiments may be relevant to any field that may benefit from substantially real-time determination of operator status, including for example, technology-assisted driving and/or flight in military and/or commercial scenarios. Embodiments may be particularly relevant in situations where the allocation of responsibility may be transferred based at least in part on operator status. For example, to generate behavior summary content, such as behavior summary content 1015, for a particular operator, such as operator 1040, a processor, such as behavior processing unit 1010, may obtain signals and/or states representing the content from one or more sensors, such as one or more of sensors 1030. Additionally, in an embodiment, a processor, such as behavior processing unit 1010, may process sensor content, such as content from one or more of sensors 1030, to generate behavior summary content, such as behavior summary content 1015, for a particular operator, such as operator 1040.
In an embodiment, a computing device-based decision making system, device, and/or process, such as system 1020, for example, may include machine learning and/or artificial intelligence techniques. Behavioral summary content, such as behavioral summary content 1015, may include a number of parameters representing focus, excitement, anger, fear, fatigue, dehydration, confusion, and/or concentration/distraction, for example. In an embodiment, various aspects of the behavior summary content may include vectors of parameters alone. For example, an "anger" vector may include, in one embodiment, a parameter indicating a particular state, a score associated with the state, a confidence parameter, and/or a direction and/or trend parameter. Additionally, in an embodiment, a decision-making system, device, and/or process based on a computing device, such as system 1020, may provide calibration and/or hint parameters, such as parameter 1025, to a processor, such as behavior processing unit 1010, although, again, the scope of the subject matter is not limited in these respects.
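One possible in-memory shape for a behavioral summary "vector" such as the "anger" example above is sketched below; the field names are illustrative choices, not terms from the specification:

```python
# Illustrative data shape for one behavioral summary vector: a state
# label, a score, a confidence parameter, and a direction/trend parameter.
from dataclasses import dataclass

@dataclass
class BehaviorVector:
    state: str         # e.g. "anger", "fatigue", "focus"
    score: float       # magnitude of the state, 0.0-1.0
    confidence: float  # confidence in the estimate, 0.0-1.0
    trend: float       # signed direction of change per interval

anger = BehaviorVector(state="anger", score=0.72, confidence=0.88, trend=+0.05)
print(anger.state, anger.trend)  # anger 0.05
```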
Additionally, in an embodiment, a processor, such as behavior processing unit 1010, may repeatedly and/or continuously obtain sensor content and/or may repeatedly and/or continuously generate behavior summary content for a particular operator. For example, sensor content may be collected and/or otherwise obtained at regular intervals, and/or behavior summary content may be generated at regular intervals. In an embodiment, a computing device based decision making system, device, and/or process, such as system 1020, may track behavioral summary content, for example, over a period of time, to detect changes in the behavioral summary content.
As described above, embodiments may include processing of signals and/or states representing sensor content. In at least some embodiments, the sensor content can include analog signals and/or digital signals, or a combination thereof. Additionally, although digital processing circuits may be described in connection with various example embodiments, the subject matter is not limited to digital implementations. For example, embodiments may implement analog circuitry for processing sensor content. Similarly, signals and/or conditions that may be generated to control the operation of a machine may include, for example, digital and/or analog signals and/or conditions, or a combination thereof. In an embodiment, an analog hysteresis "winner-takes-all" circuit, for example, may be implemented at least in part to improve noise robustness and/or at least partially mitigate timing differences between sensor input streams. Of course, the scope of the subject matter is not limited in these respects.
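A software analogue of the hysteretic "winner-takes-all" behavior mentioned above is sketched below for illustration: the current winner holds its position unless a challenger exceeds it by a hysteresis margin, which improves robustness to noise. This models the idea only; it is not the analog circuit described:

```python
# Illustrative hysteretic winner-take-all: a challenger must beat the
# previous winner by a margin before the selected state changes.

def winner_take_all(scores, previous_winner=None, hysteresis=0.1):
    challenger = max(scores, key=scores.get)
    if previous_winner is not None and challenger != previous_winner:
        if scores[challenger] < scores[previous_winner] + hysteresis:
            return previous_winner  # difference is noise-level: hold
    return challenger

print(winner_take_all({"calm": 0.50, "anger": 0.55}, "calm"))  # calm
print(winner_take_all({"calm": 0.50, "anger": 0.65}, "calm"))  # anger
```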
As described above, embodiments may be utilized in situations (including, for example, technology-assisted driving and/or flight in commercial and/or military scenarios) that may benefit from substantially real-time determinations of biological and/or behavioral states of a particular operator. Of course, the subject matter is not limited to these particular examples. In an embodiment, responsibility for operating one or more aspects of a particular machine may be transferred from an operator to a computing device, for example, depending at least in part on a substantially current biological and/or behavioral state of a particular operator. For example, FIG. 10 depicts an area of responsibility, illustrated as area 1054, under control of a computing device decision-making system, device, and/or process (such as 1020). FIG. 10 also depicts the area of responsibility under the control of a particular operator, such as operator 1040, illustrated as area 1052. Also depicted in FIG. 10 is region 1050, which illustrates a region of shared responsibility between a computing device-based decision making system, device, and/or process (such as 1020), and operator-based control (such as control by operator 1040). In an embodiment, whether a particular operator, such as operator 1040, and/or a computing device-based decision making system, device, and/or process, such as system 1020, performs a task illustrated by region 1050 may depend, at least in part, on the substantially current biological and/or behavioral state of the particular operator. In an embodiment, such a determination may be made, for example, at least in part by a computing device-based decision making system, device, and/or process (such as system 1020) based, at least in part, on behavior summary content (such as behavior summary content 1015) obtained from a processor (such as behavior processing unit 1010).
Returning to the example of an operator (such as a pilot) flying an aircraft, the operator may control some aspects of the aircraft, while decision-making systems, devices, and/or processes (such as system 1020) based on the computing device control other aspects of the aircraft. For the present example, a region of variable responsibility, such as region 1050 of variable responsibility, may represent one or more aspects of aircraft operation that may be at least partially transferred from pilot control to a computing device depending at least in part on the substantially current biological and/or behavioral state of the pilot. For example, if a processor, such as behavior processing unit 1010, detects, for example, via sensor content, that a pilot is fatigued, distracted, angry, etc., beyond a specified threshold, responsibility and/or operation of flight controls (e.g., aircraft rudders, elevators, etc.) may be transferred from pilot control to control by a computing device-based decision making system, device, and/or process (such as system 1020).
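A minimal, hypothetical sketch of this threshold-based transfer of responsibility follows; the monitored states and threshold values are invented for illustration:

```python
# Illustrative sketch: flight controls transfer from pilot to the
# decision-making system when any monitored behavioral state in the
# summary content exceeds its specified threshold. Thresholds invented.

THRESHOLDS = {"fatigue": 0.8, "distraction": 0.7, "anger": 0.75}

def controls_owner(summary):
    """Return 'machine' if any monitored state exceeds its threshold."""
    for state, limit in THRESHOLDS.items():
        if summary.get(state, 0.0) > limit:
            return "machine"
    return "pilot"

print(controls_owner({"fatigue": 0.85, "distraction": 0.2}))  # machine
print(controls_owner({"fatigue": 0.30, "distraction": 0.2}))  # pilot
```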
In another example, the shared area of responsibility 1050 may represent a vehicle brake system that may be under at least partial control of a particular operator under normal conditions. A decision-making system, device, and/or process based on a computing device, such as system 1020, may assume control of a braking system at least partially in response to a processor, such as behavior processing unit 1010, generating behavior profile content for a particular operator that indicates a change in a substantially current biological and/or behavioral state of the particular operator, and the change may indicate a hazardous condition to the operator and/or to others. Of course, the subject matter is not limited to these particular examples.
In another example, anesthesia during a surgical procedure may be administered to a patient under the control of a computing device-based decision making system, device, and/or process, such as system 1020. In one embodiment, the surgeon may constitute an "operator" in that the surgeon may be monitored via one or more sensors. Administration of anesthesia may initially be based at least in part on the expected duration of the surgical procedure. A processor, such as behavior processing unit 1010, may generate behavior summary content for the surgeon, and a computing device-based decision making system, device, and/or process, such as system 1020, may determine whether administration of anesthesia should be altered based at least in part on the current biological and/or behavioral state of the surgeon. For example, the behavioral summary content may indicate an increase in the level of stress experienced by the surgeon, and/or may indicate the surgeon's recognition of an error during the procedure. A computing device-based decision making system, device, and/or process, such as system 1020, may, for example, alter administration of anesthesia based on an expected increase in the duration of the surgical procedure due, at least in part, to the increased stress level and/or the detected recognition of an error.
In another example involving surgery, a surgeon may utilize a robotic surgical device, which may be operated manually, may be operated automatically, and/or may share operational aspects between the robotic surgical device and the surgeon (e.g., a machine-assisted mode of operation). In an embodiment, a computing device-based decision making system, device, and/or process, such as system 1020, may initiate control of one or more aspects of the surgical procedure based at least in part on the behavioral summary content to transfer control of the surgical procedure, at least in part, from the surgeon to the robotic surgical device.
As another example, law enforcement personnel may carry one or more weapons. A processor, such as behavior processing unit 1010, may generate behavior summary content for a law enforcement officer, and a computing device-based decision making system, device, and/or process, such as system 1020, may determine whether some aspect of the functionality of the law enforcement officer's weapon should be altered based at least in part on the current biological and/or behavioral state of the law enforcement officer. For example, the behavioral summary content may indicate an increase in the level of irritation experienced by the law enforcement officer, and/or may indicate a degree of injury with respect to the law enforcement officer. A computing device-based decision making system, device, and/or process, such as system 1020, may, for example, determine to lock a safety on the weapon to prevent use of the weapon. Alternatively, for example, the behavioral summary content may indicate to a decision making system, device, and/or process, such as system 1020, that additional assistance in aiming the weapon should be provided. For example, the weapon may be transitioned from a manual aiming mode to an assisted aiming mode based at least in part on the behavioral summary content.
FIG. 11 is an illustration of an embodiment 1100 of an example process for processing signals and/or states representing behavior content. Embodiments in accordance with the claimed subject matter can include all of blocks 1110-1120, fewer than blocks 1110-1120, and/or more than blocks 1110-1120. Additionally, the order of blocks 1110-1120 is merely an example order, and the scope of the claimed subject matter is not limited in these respects.
As indicated at block 1110, content may be obtained from one or more sensors, such as sensors 730. The sensor content may be processed, for example, to generate behavioral summary content for at least one particular operator, as indicated at block 1120. As described above, sensor content, such as signals and/or states obtained from sensors 730, may be processed by behavior processing units (such as behavior processing unit 700), which may include, for example, one or more sensor processing units, such as SPU 705, and/or one or more machine learning units, such as machine learning unit 706. In an embodiment, sensor content may be processed, for example, at least in part, by a behavior profile content processor (such as behavior processing unit 700) that may include machine learning acceleration circuitry. For example, a behavior processing unit, such as behavior processing unit 700, may include one or more machine learning units, such as machine learning unit 706, as described above. Additionally, in an embodiment, the behavior content processor and/or the machine learning acceleration circuit may, for example, perform one or more particular operations to generate the behavior summary content. In an embodiment, the one or more particular operations performed at least in part by a machine learning unit (such as machine learning unit 706) may include, for example, multiplication, squaring/exponentiation, multiplicative inverse, and/or partial product operations, or any combination of these, that may be performed on a set of parameters according to one or more of a variety of possible machine learning techniques.
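The primitive operations listed above (multiplication, squaring, multiplicative inverse, partial products) compose into common machine learning kernels; the toy neuron below is an invented illustration built only from those operations, not a kernel from the specification:

```python
# Illustrative toy kernel using only multiply, square, and multiplicative
# inverse: a dot product via accumulated partial products, followed by an
# x * 1/(1 + x^2) squashing step.

def neuron(inputs, weights):
    acc = 0.0
    for x, w in zip(inputs, weights):
        acc += x * w                      # multiplication / partial products
    return acc * (1.0 / (1.0 + acc * acc))  # square + multiplicative inverse

print(round(neuron([1.0, 2.0], [0.5, 0.25]), 3))  # 0.5
```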
FIG. 12 is an illustration of an embodiment 1200 of an example process for processing signals and/or states representing behavior content. Embodiments in accordance with the claimed subject matter may include all of blocks 1210-1240, fewer than blocks 1210-1240, and/or more than blocks 1210-1240. Additionally, the order of blocks 1210-1240 is merely an example order, and the scope of the claimed subject matter is not limited in these respects.
In an embodiment, content related to particular operator and/or environment related content may be obtained from one or more sensors, as indicated at block 1210. As also indicated at block 1220, a sensor fusion operation may be performed on the sensor content. In an embodiment, the sensor fusion operations may include combining, normalizing, reducing, and/or otherwise processing sensor content in preparation for further processing of machine learning operations, e.g., as discussed above in connection with sensor 730 and/or SPU 705. In an embodiment, machine learning operations, such as discussed above in connection with the machine learning unit 706, may be performed on the sensor content, for example, to generate behavior summary content for a particular operator, as indicated at block 1230. In an embodiment, the machine learning operation may include, for example, one or more particular operations that may be performed by machine learning acceleration circuitry (such as machine learning unit 706) of a behavior processing unit (such as behavior processing unit 700). In an embodiment, the particular operation may include performing a computation of a multiplication, a square/power, a multiplicative inverse, and/or a partial product operation, or any combination of these.
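An illustrative sensor-fusion step of the kind described at block 1220 is sketched below: readings from multiple sensors are normalized to a common 0-1 range before the machine learning stage. The sensor names and operating ranges are invented:

```python
# Illustrative sensor-fusion sketch: normalize each sensor stream into
# [0, 1] by its known operating range prior to machine learning.
# Sensor names and ranges are hypothetical.

SENSOR_RANGES = {"heart_rate": (40.0, 180.0), "gsr": (0.0, 20.0)}

def fuse(samples):
    """Normalize each sensor reading into [0, 1] by its known range."""
    fused = {}
    for name, value in samples.items():
        lo, hi = SENSOR_RANGES[name]
        fused[name] = (value - lo) / (hi - lo)
    return fused

print(fuse({"heart_rate": 110.0, "gsr": 5.0}))  # {'heart_rate': 0.5, 'gsr': 0.25}
```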
In an embodiment, behavior summary content, such as behavior summary content 725, for example, may represent a substantially current behavior and/or biological state of a particular operator, as also indicated at block 1230. Additionally, as depicted at block 1240, a decision-making operation based at least in part on behavioral summary content and/or parameters representative of one or more external factors may be performed, such as discussed above in connection with decision-making system 740.
In an embodiment, a behavior processing unit, such as behavior processing unit 700, for example, may represent an improvement over other approaches, such as may include the use of a general purpose processing device. For example, specialized and/or dedicated circuitry, such as control circuitry 703, SPU 705, machine learning unit 706, etc., may more efficiently generate content, such as behavior summary content, based at least in part on sensor content, such as sensor content 730. In an embodiment, dedicated and/or specialized circuitry, such as behavior processing unit 700, for example, may consume relatively less power and/or less energy, may be implemented in a relatively smaller area on a semiconductor die, may be more responsive to changes in sensor content, and may generate behavior summary content more quickly, accurately, and/or reliably. For example, specialized hardware, such as the example embodiments described herein, may support the generation of relatively more accurate behavioral summary content than is possible with general-purpose hardware.
FIG. 13 is an illustration of an embodiment 1300 of an example process for processing signals and/or states representing behavioral summary content. Embodiments in accordance with the claimed subject matter may include all of blocks 1310-1320, less than blocks 1310-1320, and/or more than blocks 1310-1320. Additionally, the order of blocks 1310-1320 is merely an example order, and the scope of the claimed subject matter is not limited in these respects.
As indicated at block 1310, one or more signals and/or states representing behavioral summary content of a particular user may be obtained via at least one processor of at least one computing device, such as processor 110, where the behavioral summary content may include a plurality of parameters, such as parameter 240, representing a substantially current behavioral state or biological state, or a combination thereof, of the particular user. Additionally, one or more recommendations for a particular user, as indicated at 1320, may be generated via at least one processor, such as processor 110, based at least in part on the behavioral summary content and/or based at least in part on one or more parameters representing external factors, or a combination of these, where the one or more recommendations may be substantially directed to improvements in the future state of the particular user. Additionally, as described above, embodiments may include generating behavioral summary content and/or recommendations intended to be communicated to one or more other individuals. For example, as described above, embodiments directed to collaborative mental health management may include, for example, sharing behavioral summary content and/or recommendations among personal computing devices within a group of individuals.
In an embodiment, generating one or more recommendations for a particular user may be generated, for example, at least in part by a behavior processing unit (such as BPU 700). That is, in embodiments, a behavior processing unit, such as BPU 700, may generate behavior summary content and/or may also perform machine learning operations on the behavior summary content and/or one or more parameters representing external factors to generate one or more recommendations for a particular user and/or perform other decision-making operations.
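The recommendation generation of blocks 1310-1320 can be illustrated with a minimal sketch. The parameter names, thresholds, and recommendation strings below are assumptions for illustration only and are not drawn from the specification.

```python
# Illustrative sketch of blocks 1310-1320: deriving recommendations from
# behavioral summary parameters plus parameters representing external
# factors. All names and thresholds are hypothetical.
def generate_recommendations(summary_params, external_factors):
    recs = []
    # A low sleep parameter yields a rest-related recommendation.
    if summary_params.get("sleep_hours", 8.0) < 6.0:
        recs.append("earlier bedtime tonight")
    # A high stress parameter, combined with an external factor such as
    # weather, yields an activity recommendation.
    if summary_params.get("stress_level", 0.0) > 0.7 and \
            external_factors.get("weather") == "clear":
        recs.append("short outdoor walk")
    return recs

recs = generate_recommendations(
    {"sleep_hours": 5.0, "stress_level": 0.8},
    {"weather": "clear"},
)
```

In an embodiment such logic could itself be a machine learning operation performed by a behavior processing unit rather than fixed rules; the rules here simply make the data flow concrete.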
FIG. 14 is an illustration of an embodiment 1400 of an example process for tracking signals and/or states representing behavioral summary content. Embodiments in accordance with the claimed subject matter can include all of blocks 1410-1430, less than blocks 1410-1430, and/or more than blocks 1410-1430. Additionally, the order of blocks 1410-1430 is merely an example order, and the scope of the claimed subject matter is not limited in these respects.
As indicated at block 1410, one or more signals and/or states representing behavioral summary content of a particular user may be tracked via at least one processor of at least one computing device, wherein the behavioral summary content may include a plurality of parameters representing a substantially current behavioral state or biological state, or a combination thereof, of the particular user. Additionally, as indicated at 1420, signals and/or states representing tracked behavioral summary content may be stored in at least one memory of the computing device.
Additionally, as indicated at block 1430, one or more relationships between the tracked behavioral summary content and the bioavailability and/or balance of one or more particular substances within the body of the particular user may be determined, at least in part, via the at least one processor executing one or more machine learning operations. In an embodiment, one or more recommendations for the particular user may be generated via the at least one processor based at least in part on the behavioral summary content and/or based at least in part on one or more parameters representing external factors, or a combination of these, wherein the one or more recommendations may be substantially directed to improvements in the future state of the particular user.
In an embodiment, a decision-making apparatus, system, and/or process, such as decision-making system 740, may perform, at least in part via at least one processor, one or more machine learning operations to determine one or more relationships between the tracked behavioral summary content (such as behavioral summary content 725) and the bioavailability and/or balance of one or more particular substances within the body of a particular user, or a combination thereof. Embodiments may also include a decision-making device, system, and/or process, such as decision-making system 740, to generate one or more recommendations for supplements related to a particular substance or substances for a particular user, where the one or more recommendations may be directed to improvements in a subsequent state of the particular user.
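One way to picture the relationship determination of block 1430 is a simple statistical association between a tracked substance-intake parameter and a behavioral summary parameter. Pearson correlation here merely stands in for the machine learning operations of the specification, and the substance, values, and threshold are invented for illustration.

```python
# Hedged sketch of block 1430: estimating a relationship between tracked
# behavioral summary content and a particular substance, with Pearson
# correlation as a stand-in for the machine learning operations.
from math import sqrt

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Tracked daily parameters: hypothetical magnesium intake (mg) vs. a
# sleep-quality score from the behavioral summary content.
intake = [200, 250, 300, 350, 400]
quality = [0.55, 0.60, 0.70, 0.72, 0.80]
r = pearson(intake, quality)

recommendation = None
if r > 0.8:  # illustrative threshold for a strong positive relationship
    recommendation = "consider a magnesium supplement"
```

A decision-making system such as decision-making system 740 could then surface the recommendation directed to improvement of the user's subsequent state.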
In the context of this patent application, the term "connected," the term "component," and/or similar terms are intended to be physical, but not necessarily always tangible. Whether or not these terms refer to tangible subject matter may vary in a particular context of use. As an example, the tangible connection and/or the tangible connection path may be made, such as by a tangible electrical connection (such as a conductive path comprising a metal or other conductor) capable of conducting electrical current between two tangible components. Similarly, the tangible connection path may be at least partially affected and/or controlled such that, as is typically the case, the tangible connection path may be opened or closed, sometimes due to the effect of one or more externally derived signals (such as external currents and/or voltages, such as for electrical switches). Non-limiting examples of electrical switches include transistors, diodes, and the like. However, "connected" and/or "components" similarly, although physical in a particular context of use, may also be non-tangible, such as a connection between a client and a server over a network, which generally refers to the ability of the client and server to send, receive, and/or exchange communications, as discussed in more detail later. Additionally, the term "connection" may be utilized in the context of a neural network model, and may refer in an embodiment to a parameter passed between nodes, for example, which may include a parameter and/or a set of parameters representing an output value. Additionally, in an embodiment, the connections between nodes may include a weight parameter. For example, one or more weight parameters may, in an embodiment, operate on one or more parameters representing one or more output values, e.g., in a specified manner, to produce a connection, such as a connection between a node of a first layer and a node of a second layer.
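The neural-network sense of "connection" described above, in which a weight parameter operates on an output value passed between a node of a first layer and a node of a second layer, can be sketched as follows. The weights, inputs, and choice of ReLU activation are illustrative assumptions.

```python
# Minimal sketch of a weighted "connection": each connection multiplies a
# first-layer output value by its weight parameter, and the second-layer
# node sums the weighted connections.
def forward(first_layer_outputs, weights, bias=0.0):
    total = bias + sum(w * o for w, o in zip(weights, first_layer_outputs))
    return max(0.0, total)  # ReLU stands in for the node's activation

# Three connections from a first layer into one second-layer node.
node_out = forward([0.5, -1.0, 2.0], [0.2, 0.4, 0.3])
```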
Thus, the terms "coupled" and "connected" are used in a manner such that, in a particular context of use, such as a particular context of discussing tangible components, the terms are not synonymous. Similar terms may also be used in a manner to exhibit similar intent. Thus, "connected" is used to indicate that two or more tangible components or the like, for example, are in direct physical contact. Thus, with the previous example, two tangible components of the electrical connection are physically connected via a tangible electrical connection, as previously described. However, "coupled" is used to mean that two or more tangible components are potentially in direct physical and tangible contact. However, it is also used to mean that two or more tangible components or the like are not necessarily physically in direct contact, but are capable of cooperating, remaining in contact, and/or interacting, such as by being "optically coupled". Similarly, the term "coupled" is also understood to mean indirectly connected. It is also noted that in the context of this patent application, since memories, such as memory components and/or memory states, are intended to be non-transitory, the term physical, at least where used in connection with a memory, necessarily means that such memory components and/or memory states (to continue the example) are tangible.
Further, in this patent application, in a particular context of use, such as where tangible components (and/or similarly, tangible materials) are discussed, a distinction is made between being "on" and being "over". By way of example, placing a substance "on" a substrate refers to placement involving direct physical and tangible contact, without a medium, such as an intervening substance, between the placed substance and the substrate; however, "over" a substrate, although understood to potentially include being placed "on" a substrate (as being "on" may also be accurately described as being "over"), is understood to include situations where one or more media (such as one or more intermediate substances) are present between the placed substance and the substrate, such that the placed substance does not necessarily come into direct physical and tangible contact with the substrate.
A similar distinction is made between "beneath" and "under" in the appropriate specific use context (such as the context in which tangible materials and/or tangible components are discussed). "Beneath" is intended in this particular context of use to necessarily imply direct physical and tangible contact (similar to "on", just described), whereas "under" may include instances where direct physical and tangible contact exists, but does not necessarily imply direct physical and tangible contact, such as if one or more media (e.g., one or more intervening substances) are present. Thus, "on" is to be understood as meaning "immediately over", and "beneath" is to be understood as meaning "immediately under".
It is similarly to be understood that terms such as "over" and "under" are to be interpreted in a manner similar to the previously mentioned terms "upper", "lower", "top", "bottom", and the like. These terms may be used to facilitate discussion, but are not intended to necessarily limit the scope of the claimed subject matter. For example, the term "over", as an example, is not intended to imply that the scope of the claims is limited to embodiments that are right-side-up, e.g., as compared to an upside-down embodiment. One example includes a flip chip as one illustration, where, for example, the orientation at various times (e.g., during manufacturing) may not necessarily correspond to the orientation of the final product. Thus, if an object is within the scope of the applicable claims in a particular orientation, such as an upside-down orientation as one example, then it is similarly intended that the object also be construed as being included within the scope of the applicable claims in another orientation, such as right-side-up as also an example, and vice versa, even if the applicable literal claim language could potentially be interpreted otherwise. Of course, as is also the case throughout the specification of the patent application, the particular context of description and/or use provides helpful guidance as to reasonable inferences to be drawn.
Unless otherwise indicated, in the context of the present patent application, the term "or", if used to associate a list, such as A, B, or C, is intended to mean A, B, and C, here used in the inclusive sense, as well as A, B, or C, here used in the exclusive sense. With this understanding, "and" is used in the inclusive sense and is intended to mean A, B, and C; whereas "and/or" may be used in an abundance of caution to make clear that all of the foregoing meanings are intended, although such use is not required. Furthermore, the term "one or more" and/or similar terms are used to describe any feature, structure, characteristic, or the like in the singular, and "and/or" is also used to describe a plurality of features, structures, characteristics, or the like and/or some other combination of features, structures, characteristics, or the like. Similarly, the term "based on" and/or similar terms are to be understood as not necessarily intending to express an exhaustive list of factors, but rather to allow for the presence of additional factors not necessarily explicitly described.
Further, it is intended that situations involving implementation of the claimed subject matter and subject to testing, measurement, and/or specification with respect to degree be understood in the following manner. As an example, in a given situation, it is assumed that a value of a physical property is to be measured. If reasonable alternative approaches to testing, measurement, and/or specification with respect to degree, at least with respect to the property (continuing the example), are reasonably likely to occur to one of ordinary skill in the art, at least for implementation purposes, the claimed subject matter is intended to cover those reasonable alternative approaches unless explicitly indicated otherwise. By way of example, if a graphical representation of a measurement over a region is generated and implementations of claimed subject matter refer to taking a measurement of a slope over the region, but a variety of reasonable alternative techniques exist for estimating the slope over the region, claimed subject matter is intended to cover such reasonable alternative techniques unless explicitly indicated otherwise.
To the extent that the claimed subject matter is related to one or more particular measurements, such as with respect to a physical manifestation capable of being physically measured, e.g., without limitation, temperature, pressure, voltage, current, electromagnetic radiation, etc., it is asserted that the claimed subject matter does not fall within the abstract-idea judicial exception to statutory subject matter. Rather, it is asserted that physical measurements are not mental steps and, similarly, are not abstract ideas.
However, it is noted that a typical measurement model employed is one in which one or more measurements may each comprise a sum of at least two components. Thus, for a given measurement, for example, one component may comprise a deterministic component, which in an ideal sense may comprise a physical value (e.g., sought via one or more measurements), often in the form of one or more signals, signal samples, and/or states, while one component may comprise a random component, which may have a variety of sources that may be challenging to quantify. Sometimes, for example, a lack of measurement accuracy may affect a given measurement. Thus, for the claimed subject matter, statistical or stochastic models can be used in addition to deterministic models as a solution to identification and/or prediction of one or more measurements that may relate to the claimed subject matter.
For example, a relatively large number of measurements may be collected to better estimate the deterministic component. Similarly, if the measurement changes, which can typically occur, some portion of the possible variance can be described as deterministic components and some portion of the variance can be described as random components. In general, it is desirable to have the random variance associated with the measurements relatively small, if feasible. That is, in general, it may be more desirable to be able to interpret a reasonable portion of the measurement variation in a deterministic manner (rather than a random manner) as an aid to recognition and/or predictability.
In accordance with these principles, a variety of techniques have begun to be used so that one or more measurements can be processed to better estimate the underlying deterministic components, as well as to estimate the components that may be random. These techniques may of course vary with the details surrounding a given situation. In general, however, more complex issues may involve the use of more complex techniques. In this regard, as mentioned above, one or more measurements of the physical manifestation may be modeled deterministically and/or stochastically. Employing a model allows collected measurements to be potentially identified and/or processed and/or may allow, for example, estimation and/or prediction of underlying deterministic components for later measurements to be taken. The given estimate may not be a perfect estimate; however, in general, it is expected that on average one or more estimates may better reflect the underlying deterministic component, e.g., if a random component is considered that may be included in one or more obtained measurements. Of course, in practice it is desirable to be able to generate a physically meaningful model of the process that influences the measurements to be taken, for example by an estimation scheme.
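The measurement model described above, in which each measurement comprises the sum of a deterministic component and a random component, and averaging a relatively large number of measurements better estimates the deterministic component, can be sketched with synthetic values. The underlying value, noise level, and sample count here are illustrative assumptions.

```python
# Sketch of the measurement model: measurement = deterministic + random.
# Averaging many measurements better estimates the deterministic component.
import random

random.seed(42)  # fixed seed so the sketch is reproducible
deterministic = 3.30  # underlying physical value sought via measurement
one_measurement = deterministic + random.gauss(0.0, 0.5)
many = [deterministic + random.gauss(0.0, 0.5) for _ in range(10_000)]
estimate = sum(many) / len(many)
# The averaged estimate is expected to lie much closer to the
# deterministic component than a single noisy measurement typically does.
```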
However, in some cases, as indicated, the potential influences may be complex. Therefore, attempting to understand the appropriate factors to consider may be particularly challenging. In such cases, it is therefore not uncommon to employ heuristics for generating one or more estimates. Heuristics refer to the use of empirically related approaches that may reflect the process implemented and/or the results of the implementation, such as through the use of historical measurements, for example. Heuristics, for example, may be used in situations where more analytical approaches may be too complex and/or nearly intractable. Thus, with respect to the claimed subject matter, novel features may, in example embodiments, include heuristics that may, for example, be used to estimate and/or predict one or more measurements.
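As one illustration of an empirically related scheme using historical measurements, an exponential moving average is a common heuristic for predicting a next measurement when no analytic model is available. The choice of this particular heuristic, the smoothing factor, and the sample values are assumptions for illustration.

```python
# Hedged sketch of a heuristic estimate from historical measurements:
# an exponential moving average, one common empirically related scheme.
def ema_predict(history, alpha=0.3):
    estimate = history[0]
    for value in history[1:]:
        # Blend each new historical measurement into the running estimate.
        estimate = alpha * value + (1 - alpha) * estimate
    return estimate

pred = ema_predict([10.0, 12.0, 11.0, 13.0])
```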
It is also noted that the terms "type" and/or "like", if used, such as in connection with a feature, structure, characteristic, or the like, using "optical" or "electrical" as simple examples, mean that the feature, structure, characteristic, or the like is at least partially of and/or related to the feature, structure, characteristic, or the like in such a manner that the presence of minor variations, even those that may not otherwise be considered fully consistent with the feature, structure, characteristic, or the like, generally does not prevent the feature, structure, characteristic, or the like from being of a certain "type" and/or "like" (e.g., "optical-type" or "optical-like"), if these minor variations are sufficiently minor that the feature, structure, characteristic, or the like would still be considered substantially present were such variations also present. Thus, continuing with this example, the terms optical-type and/or optical-like properties are necessarily intended to include optical properties. Similarly, the terms electrical-type and/or electrical-like properties are necessarily intended to include electrical properties, as another example. It should be noted that the description of the present patent application provides one or more illustrative examples only, and the claimed subject matter is not intended to be limited to one or more illustrative examples; however, as has been the case with the description of the patent application, the particular context of description and/or use provides helpful guidance as to reasonable inferences to be drawn.
In the context of this patent application, the term network device refers to any device capable of communicating via and/or as part of a network and may include a computing device. Although network devices may be capable of communicating signals (e.g., signal packets and/or frames), such as via wired and/or wireless networks, they may also be capable of performing operations associated with the computing device, such as arithmetic and/or logical operations, processing and/or storing operations (e.g., storing signal samples), such as in memory as tangible, physical memory states, and/or in various embodiments may operate, for example, as a server device and/or a client device. A network device capable of operating as a server device, a client device, and/or otherwise may include, for example, a dedicated rack-mounted server, a desktop computer, a laptop computer, a set-top box, a tablet, a netbook, a smartphone, a wearable device, an integrated device combining two or more features of the foregoing devices, etc., or any combination of these. As described above, signal packets and/or frames may be exchanged, for example, such as between server devices and/or client devices, among other types of devices, including between wired and/or wireless devices coupled via a wired and/or wireless network, for example, or any combination of these. Note that the terms server, server device, server computing platform, and/or the like are used interchangeably. Similarly, the terms client, client device, client computing platform, and/or the like may also be used interchangeably. Although in some instances, for ease of description, these terms may be used in the singular, such as by reference to a "client device" or a "server device," the description is intended to encompass one or more client devices and/or one or more server devices, as appropriate. 
In accordance with similar principles, reference to a "database" is understood to mean one or more databases and/or portions thereof, as appropriate.
It should be understood that for ease of description, network devices (also referred to as networked devices) may be implemented and/or described in terms of computing devices, and vice versa. However, it should also be understood that this description should in no way be construed as limiting the claimed subject matter to one embodiment, such as only computing devices and/or only network devices, but rather may be implemented as a variety of devices or combinations thereof, including for example one or more illustrative examples.
The network may also include arrangements, derivations and/or improvements now known and/or later developed that, for example, include past, present and/or future mass storage, such as Network Attached Storage (NAS), Storage Area Network (SAN) and/or other forms of device-readable media. The network may include a portion of the internet, one or more Local Area Networks (LANs), one or more Wide Area Networks (WANs), wired connections, wireless connections, other connections, or any combination of these. Thus, the network may be worldwide in scope and/or breadth. Similarly, sub-networks, such as may employ different architectures and/or may be substantially compliant and/or substantially compatible with different protocols, such as network computing and/or communication protocols (e.g., network protocols), may interoperate within a larger network.
The term electronic file and/or the term electronic document are used throughout this document to refer to a set of stored memory states and/or a set of physical signals that are associated in a manner to thereby at least logically form a file (e.g., electronic) and/or an electronic document. That is, it is not intended to implicitly refer to a particular syntax, format, and/or scheme, e.g., used with respect to a set of associated memory states and/or a set of associated physical signals. It is explicitly mentioned if, for example, a specific type of file storage format and/or syntax is desired. It is also noted that the association of memory states, for example, may be in a logical sense, and not necessarily in a physical, tangible sense. Thus, while the signal and/or state components of a file and/or electronic document, for example, are to be logically associated, they may be stored in one embodiment in one or more different places, for example, in tangible physical memory.
In the context of the present patent application, the terms "item," "electronic item," "document," "electronic document," "content," "digital content," "item," and/or similar terms are intended to refer to a signal and/or state in a physical format (such as a digital signal and/or digital state format) that may be perceived by a user, for example if displayed, played, tangibly generated, etc., and/or otherwise executed by a device (such as a digital device, including, for example, a computing device), but may not necessarily be readily perceived by a human being (e.g., if in a digital format). Similarly, in the context of the present patent application, digital content provided to a user in a form that enables the user to readily perceive the underlying content itself (e.g., content presented in a human-consumable form such as, for example, listening to audio, perceiving a tactile sensation, and/or seeing an image) is referred to as "consuming" digital content, "consuming" of digital content, consumable "digital content, and/or similar terms for the user. For one or more embodiments, the electronic document and/or electronic file may, for example, comprise a web page of code (e.g., computer instructions) in a markup language that is or will be executed by the computing and/or networked device. In another embodiment, the electronic document and/or electronic file may comprise a portion and/or area of a web page. However, claimed subject matter is not intended to be limited in these respects.
In addition, for one or more embodiments, an electronic document and/or electronic file may include several components. As indicated previously, in the context of the present patent application, a component is physical, but not necessarily tangible. By way of example, components relating to electronic documents and/or electronic files may, in one or more embodiments, include text, for example, in the form of physical signals and/or physical states (e.g., capable of being physically displayed). Typically, a memory state includes, for example, tangible components, while a physical signal is not necessarily tangible, although it is not uncommon for a signal to become tangible (e.g., caused to be) if, for example, it appears on a tangible display. Additionally, for one or more embodiments, a component pertaining to an electronic document and/or electronic file may include a graphical object (e.g., an image, such as a digital image), and/or a sub-object, including attributes thereof, which again include a physical signal and/or a physical state (e.g., capable of being tangibly displayed). In an embodiment, for example, digital content may include, for example, text, images, audio, video, and/or other types of electronic documents and/or electronic files, including portions thereof.
Further, in the context of the present patent application, the term parameter (e.g., one or more parameters) refers to material that describes a collection of signal samples, such as one or more electronic documents and/or electronic files, and that exists in the form of physical signals and/or physical states (such as memory states). For example, the one or more parameters (such as relating to an electronic document and/or electronic file that includes an image) may include, for example, a time period during which the image was captured, a latitude and longitude of an image capture device (such as a camera), and so forth. In another example, the one or more parameters related to the digital content (such as, by way of example, digital content including technical articles) may, for example, include one or more authors. The claimed subject matter is not intended to be a meaningful descriptive parameter around any format so long as the one or more parameters include physical signals and/or states, which may include, as examples of parameters, collection names (e.g., electronic file and/or electronic document identifier names), techniques of creation, purposes of creation, times and dates of creation, logical paths (if stored), encoding formats (e.g., types of computer instructions such as a markup language), and/or standards and/or specifications used in order to comply with a protocol (e.g., meaning substantially compliant and/or substantially compatible) for one or more uses, etc.
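The notion of parameters describing an electronic file can be made concrete with a small example. The field names and values below are hypothetical, shown simply as a plain mapping.

```python
# Illustrative example of "parameters" describing an electronic file:
# descriptive material existing, in an implementation, as physical
# signals and/or memory states. All field names and values are invented.
image_file_parameters = {
    "collection_name": "trip_photos_001.jpg",  # electronic file identifier name
    "capture_time": "2019-03-14T09:26:53Z",    # time period of image capture
    "latitude": 51.5074,                       # capture-device location
    "longitude": -0.1278,
    "encoding_format": "JPEG",                 # format complied with
}

# Parameters relating to digital content such as a technical article
# may include, for example, one or more authors.
article_parameters = {"authors": ["A. Author", "B. Author"]}
```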
Signal packet communications and/or signal frame communications, also referred to as signal packet transmissions and/or signal frame transmissions (or simply "signal packets" or "signal frames"), may be communicated between nodes of a network, where the nodes may include, for example, one or more network devices and/or one or more computing devices. As an illustrative example, without limitation, a node may include one or more sites that employ local network addresses, such as in a local network address space. Similarly, a device, such as a network device and/or a computing device, may be associated with the node. It is also noted that in the context of the present patent application, the term "transmission" is intended as another term for some type of signal communication that may occur in any of a variety of situations. Thus, no specific directionality of communication and/or a specific originating end of a communication path is intended to be implied with respect to "transporting" a communication. For example, mere use of the term itself is not intended in the context of the present patent application to have a particular implication for the signal or signals being communicated, e.g., whether a signal is communicated "to" a particular device, whether a signal is communicated "from" a particular device, and/or as to which end of a communication path may be initiating communication, e.g., in a "push type" signaling or in a "pull type" signaling. In the context of the present patent application, push and/or pull type signaling is distinguished by which end of the communication path initiates signaling.
Thus, signal packets and/or frames may be communicated from a site coupled to the internet via an access node, or vice versa, via communication channels and/or communication paths (such as including a portion of the internet and/or the Web), as examples. Similarly, signal packets and/or frames may be forwarded to a target station coupled to a local network, e.g., via a network node. Signal packets and/or frames communicated via the internet and/or the Web may be routed, such as "pushed" or "pulled," for example, via a path comprising one or more gateways, servers, etc., which may route the signal packets and/or frames, for example, substantially in accordance with the availability of the destination and/or destination address and a network path to the network node of the destination and/or destination address. While the Internet and/or the Web include networks of interoperable networks, not all of these interoperable networks are necessarily available and/or publicly accessible.
In the context of the present patent application, a network protocol (such as for communicating between devices of a network) may be characterized at least in part substantially in accordance with a layered description, such as the so-called Open Systems Interconnection (OSI) seven-layer scheme and/or description. Network computing and/or communication protocols (also referred to as network protocols) refer to a set of signaling conventions, such as those used for communication transmissions, that may occur between devices in a network, for example. In the context of this patent application, the term "between" and/or similar terms are understood to include "among", and vice versa, if appropriate for the particular use. Similarly, in the context of the present patent application, the terms "compatible with", "complying with", and/or similar terms are to be understood as including substantially compatible with and/or substantially complying with, respectively.
A network protocol, such as the protocol substantially characterized in accordance with the OSI description above, has several layers. These layers are referred to as the network stack. Various types of communications (e.g., transmissions), such as network communications, may occur across various layers. The lowest level layers in the network stack, such as the so-called physical layer, may characterize how symbols (e.g., bits and/or bytes) may be communicated as one or more signals (and/or signal samples) via a physical medium (e.g., twisted copper pairs, coaxial cable, fiber optic cable, wireless air interface, combinations of these, and so on). Proceeding to higher level layers in the network protocol stack, additional operations and/or features may be available via participation in communications at these higher level layers that are substantially compatible and/or substantially compliant with a particular network protocol. For example, higher level layers of a network protocol may affect device permissions, user permissions, and so forth, for example.
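As a minimal illustration of the layered scheme just described, the seven layers of the OSI reference model may be sketched as follows (the `layer_name` helper is a hypothetical example for illustration only, not part of any embodiment):

```python
# The seven layers of the OSI reference model, lowest (physical) first.
OSI_LAYERS = [
    (1, "Physical"),
    (2, "Data Link"),
    (3, "Network"),
    (4, "Transport"),
    (5, "Session"),
    (6, "Presentation"),
    (7, "Application"),
]

def layer_name(number):
    """Return the OSI layer name for a 1-based layer number."""
    for num, name in OSI_LAYERS:
        if num == number:
            return name
    raise ValueError("OSI layers are numbered 1 through 7")
```

The physical layer (layer 1) corresponds to the symbol-level communication over a physical medium described above; the higher-numbered layers correspond to the higher-level operations and/or features, such as permissions.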
The networks and/or sub-networks may, in an embodiment, communicate via signal packets and/or signal frames, such as via participating digital devices, and may be substantially compliant and/or substantially compatible with, but not limited to, now known and/or to-be-developed versions of any of the following network protocol stacks: ARCNET, AppleTalk, ATM, Bluetooth, DECnet, Ethernet, FDDI, Frame Relay, HIPPI, IEEE 1394, IEEE 802.11, IEEE-488, Internet protocol suite, IPX, Myrinet, OSI protocol suite, QsNet, RS-232, SPX, Systems Network Architecture, Token Ring, USB, and/or X.25. The networks and/or sub-networks may employ presently known and/or later developed versions of, for example: TCP/IP, UDP, DECnet, NetBEUI, IPX, AppleTalk, and the like. Versions of the Internet Protocol (IP) may include IPv4, IPv6, and/or other versions to be developed in the future.
With respect to aspects related to networks, including communication and/or computing networks, a wireless network may couple devices, including client devices, with a network. The wireless network may employ a stand-alone ad-hoc network, a mesh network, a Wireless LAN (WLAN) network, a cellular network, and so forth. A wireless network may also include a system of terminals, gateways, routers, etc. coupled by wireless radio links, etc., that may move freely, randomly, and/or organize themselves arbitrarily such that the network topology may change, sometimes even rapidly. A wireless network may also employ a number of network access technologies, including some version of Long Term Evolution (LTE), WLAN, Wireless Router (WR) mesh, 2nd, 3rd, or 4th generation (2G, 3G, or 4G) cellular technologies, and so forth, whether currently known and/or later developed. Network access technologies may, for example, enable wide area coverage for devices with varying degrees of mobility, such as computing devices and/or network devices.
The network may enable radio frequency and/or other wireless type communications via wireless network access technologies and/or air interfaces, such as Global System for Mobile Communications (GSM), Universal Mobile Telecommunications System (UMTS), General Packet Radio Service (GPRS), Enhanced Data GSM Environment (EDGE), 3GPP Long Term Evolution (LTE), LTE Advanced, Wideband Code Division Multiple Access (WCDMA), Bluetooth, ultra-wideband (UWB), 802.11b/g/n, and so on. The wireless network may include virtually any type of wireless communication mechanism and/or wireless communication protocol (including, of course, the foregoing) now known and/or later developed by which signals may be communicated between devices, between networks, within a network, and so forth.
In an example embodiment, as shown in fig. 15, a system embodiment may include a local network (e.g., the device 1504 and the media 1540) and/or another type of network, such as a computing and/or communication network. Thus, for purposes of illustration, fig. 15 shows an embodiment 1500 of a system that can be employed to implement either or both types of networks. The network 1508 may include one or more network connections, links, processes, services, applications, and/or resources to facilitate and/or support communication, such as exchange of communication signals, for example, between a computing device (such as 1502) and another computing device (such as 1506), which may include, for example, one or more client computing devices and/or one or more server computing devices. By way of example, and not limitation, the network 1508 may include a wireless and/or wired communication link, a telephone and/or telecommunications system, a Wi-Fi network, a Wi-MAX network, the Internet, a Local Area Network (LAN), a Wide Area Network (WAN), or any combination of these.
The example device of fig. 15 may include, for example, features of a client computing device and/or a server computing device in an embodiment. It is also noted that the term computing device, whether used as a client, a server, or otherwise, generally refers to at least a processor and memory connected by a communication bus. Similarly, at least in the context of the present patent application, this is understood to refer to sufficient structure within the meaning of 35 USC § 112(f), such that it is specifically intended that 35 USC § 112(f) not be implicated by use of the term "computing device" and/or similar terms; however, if it is determined, for some reason not immediately apparent, that the foregoing understanding cannot stand and that 35 USC § 112(f) therefore must be implicated by use of the term "computing device" and/or similar terms, then, in accordance with that statutory section, it is intended that corresponding structures, materials, and/or acts for performing one or more functions be understood and interpreted as being described at least in the text associated with the foregoing figure(s) of the present patent application, for example, figs. 1-14.
Embodiments in accordance with the claimed subject matter can include a method of executing computer instructions on at least one computing device without further human intervention, wherein the at least one computing device comprises at least one processor and at least one memory. An embodiment may include retrieving computer instructions from at least one memory of at least one computing device for execution on at least one processor of the at least one computing device, executing the retrieved computer instructions on the at least one processor of the at least one computing device, and storing in the at least one memory of the at least one computing device any results of the retrieved computer instructions being executed on the at least one processor of the at least one computing device. In an embodiment, the computer instructions to be executed comprise instructions for processing content representing a behavioral and/or biological state of a particular user, wherein executing the retrieved instructions further comprises obtaining, via at least one processor of the at least one computing device, one or more signals and/or states representing behavioral summary content of the particular user, wherein the behavioral summary content comprises a plurality of parameters representing a current behavioral state or biological state of the particular user, or a combination thereof, and generating, via the at least one processor, one or more recommendations for the particular user based at least in part on the behavioral summary content, or based at least in part on one or more parameters representing external factors, or a combination thereof, wherein the one or more recommendations are substantially directed to an improvement in a future state of the particular user.
Additionally, in an embodiment, obtaining one or more signals and/or states representative of the behavioral summary content may include iteratively obtaining updated behavioral summary content. For example, in an embodiment, updated behavioral summary content may be obtained periodically and/or at specified intervals.
In one embodiment, the behavioral summary content for a particular user may include a plurality of parameters representing, for example: focus, excitement, anger, fear, fatigue, dehydration, concentration/distraction, pre-breakthrough, quiet love, regret/acknowledgement of error, hunger, carelessness/precision, empathy, or social engagement level, or any combination of these, although the scope of claimed subject matter is not limited in this respect. Additionally, the one or more parameters indicative of the external factor may, for example, include one or more parameters indicative of: location, time period, presence and/or identity of an external individual, or general mood, or a combination of these.
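The behavioral summary content described above might be sketched, for example, as a simple container of named parameters (the class, the user identifier, and the normalized [0, 1] score range are illustrative assumptions; embodiments are not limited to any particular representation):

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class BehavioralSummary:
    """Hypothetical container for a user's behavioral summary content.

    Parameter names follow the lists in the text (e.g., "fatigue",
    "hunger", "focus"); values are illustrative normalized scores.
    """
    user_id: str
    parameters: Dict[str, float] = field(default_factory=dict)

    def set_parameter(self, name: str, value: float) -> None:
        if not 0.0 <= value <= 1.0:
            raise ValueError("scores are normalized to [0, 1]")
        self.parameters[name] = value

# Illustrative usage: record two of the behavioral parameters named above.
summary = BehavioralSummary(user_id="user-1")
summary.set_parameter("fatigue", 0.8)
summary.set_parameter("hunger", 0.3)
```

Iteratively obtaining updated behavioral summary content, as described above, would then amount to repeatedly calling `set_parameter` as new sensor-derived estimates arrive.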
In an embodiment, generating the one or more recommendations may include performing, via at least one processor, one or more machine learning operations to determine one or more associations between external factors and/or behavioral summary content. Additionally, in an embodiment, the one or more parameters indicative of the external factors may include one or more parameters indicative of content currently consumed by a particular user. Further, for example, performing one or more machine learning operations may determine one or more associations between content currently consumed by a particular user and behavioral summary content to identify a silent favorites for the particular user.
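A minimal, frequency-based stand-in for the association step just described might look like the following (the scoring scheme and sample data are hypothetical; the text leaves the particular machine learning operations unspecified):

```python
from collections import defaultdict

def quiet_favorite(observations):
    """Return the content item with the highest mean concurrent score.

    observations: iterable of (content_id, score) pairs, where score is
    a behavioral-summary parameter (e.g., excitement) sampled while the
    content was being consumed. Returns None for no observations.
    """
    totals = defaultdict(float)
    counts = defaultdict(int)
    for content_id, score in observations:
        totals[content_id] += score
        counts[content_id] += 1
    if not counts:
        return None
    return max(counts, key=lambda c: totals[c] / counts[c])

# Illustrative data: "song-a" consistently co-occurs with higher scores.
obs = [("song-a", 0.9), ("song-b", 0.4), ("song-a", 0.8), ("song-b", 0.5)]
```

In this sketch, the item whose consumption most consistently coincides with an elevated behavioral parameter is surfaced as a candidate "silent favorite", even if the user never explicitly rated it.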
In an embodiment, the one or more parameters representative of the external factors may include one or more parameters representative of content currently consumed by the particular user, and/or performing one or more machine learning operations may determine one or more associations between content currently consumed by the particular user and behavioral summary content to select subsequent content to be presented to the particular user. Additionally, the one or more parameters representative of the external factors may include one or more parameters representative of content currently consumed by the particular user, and/or generating one or more recommendations for the particular user may include performing one or more machine learning operations to determine one or more associations between content currently consumed by the particular user and behavioral summary content, wherein the one or more recommendations for the particular user may include one or more actions related to: dehydration, hunger, or fatigue, or a combination thereof.
In an embodiment, an apparatus may comprise at least one computing device comprising at least one processor and at least one memory, the at least one computing device to execute computer instructions on the at least one processor without further human intervention. In an embodiment, the computer instructions to be executed may be fetched from the at least one memory for execution on the at least one processor, and the at least one computing device may store in the at least one memory of the at least one computing device any results that would be generated from the execution of the computer instructions to be executed on the at least one processor. In an embodiment, the computer instructions to be executed may include instructions to process content representing a behavior and/or a biological state of a particular user. In an embodiment, at least one processor may obtain a signal and/or state representing behavioral summary content of a particular user, the behavioral summary content including a plurality of parameters representing a current behavioral state or a biological state of the particular user, or a combination thereof. Additionally, in an embodiment, the at least one memory may store signals and/or states representing behavior content, and/or the at least one processor may generate one or more recommendations for a particular user based at least in part on the behavior summary content and/or based at least in part on one or more parameters representing external factors, and/or combinations thereof, the one or more recommendations being for improvements in a future state of the particular user.
In an embodiment, the at least one processor may obtain behavioral summary content for a particular user at least in part from the behavioral content processor. Additionally, in an embodiment, the at least one processor may repeatedly obtain updated behavioral summary content. For example, the processor may obtain updated behavioral summary content periodically, and/or at specified intervals. Further, in an embodiment, the behavioral summary content of a particular user may include, for example, a plurality of parameters representing: focus, excitement, anger, fear, fatigue, dehydration, or concentration/distraction, and/or any combination of these. In an embodiment, the behavioral profile of the particular user may also include one or more parameters representing, for example: pre-breakthrough, quiet love, regret/acknowledgement of error, hunger, carelessness/precision, empathy, or social engagement level, or any combination of these.
In an embodiment, the one or more parameters representative of the external factors may include, for example, one or more parameters representative of: location, time period, presence and/or identity of an external individual, or general mood, or a combination of these. Additionally, the at least one processor may, for example, perform one or more machine learning operations to determine one or more associations between external factors and/or behavioral summary content. In an embodiment, the at least one processor may generate one or more recommendations for a particular user based at least in part on the determined one or more associations between the external factors and/or the behavioral summary content. Additionally, in an embodiment, the one or more parameters indicative of the external factors may include one or more parameters indicative of content to be consumed by the particular user, and/or the at least one processor may perform one or more machine learning operations to determine one or more associations between content to be consumed by the particular user and/or behavioral summary content to identify the silent favorites for the particular user.
In an embodiment, the one or more parameters representative of the external factors may include one or more parameters representative of content to be consumed by the particular user, and/or the at least one processor may perform one or more machine learning operations to determine one or more associations between the content to be consumed by the particular user and the behavioral summary content to select subsequent content to be presented to the particular user. Additionally, in an embodiment, the one or more parameters representative of the external factors may include, for example, one or more parameters representative of content to be consumed by the particular user, and/or, to generate one or more recommendations for the particular user, the at least one processor may perform one or more machine learning operations to determine one or more associations between content to be consumed by the particular user and the behavioral summary content, wherein the one or more recommendations for the particular user may include one or more actions related to: dehydration, hunger, or fatigue, or a combination thereof.
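The recommendation step described above could be sketched, under the assumption of simple fixed thresholds, as follows (the threshold value and action strings are illustrative; an embodiment might instead derive them from learned associations):

```python
# Hypothetical normalized-score threshold above which a behavioral
# parameter triggers a recommended action.
THRESHOLD = 0.7

# Illustrative mapping from the parameters named in the text to actions.
ACTIONS = {
    "dehydration": "drink water",
    "hunger": "eat a snack",
    "fatigue": "take a rest",
}

def recommend(parameters):
    """Return recommended actions for parameters exceeding the threshold.

    parameters: dict mapping behavioral-summary parameter names to
    normalized [0, 1] scores.
    """
    return [action for name, action in ACTIONS.items()
            if parameters.get(name, 0.0) > THRESHOLD]
```

For example, a behavioral summary with a high fatigue score but a low hunger score would yield only the rest-related action, directed at improving the user's future state.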
Referring now again to fig. 15, in an embodiment, the first and third devices 1502 and 1506 can be capable of rendering a Graphical User Interface (GUI) for a network device and/or a computing device, for example, such that a user operator can participate in system use. Device 1504 may perform a similar function in this illustration. Similarly, in fig. 15, a computing device 1502 (the "first device" in the figure) may interface with a computing device 1504 (the "second device" in the figure), which computing device 1504 may also include features of a client computing device and/or a server computing device, for example, in an embodiment. A processor (e.g., processing device) 1520 and memory 1522, which can include a main memory 1524 and a secondary memory 1526, can communicate, for example, via a communication bus 1515. The term "computing device" in the context of this patent application refers to a system and/or device, such as a computing device, that includes the capability to process (e.g., perform computations) and/or store digital content in the form of signals and/or states, such as electronic files, electronic documents, parameters, measurements, text, images, video, audio, and so forth. Thus, a computing device, in the context of this patent application, may include hardware, firmware, software, or any combination of these (other than software per se). Computing device 1504 as depicted in fig. 15 is but one example, and the scope of claimed subject matter is not limited to this particular example.
As described above, for one or more embodiments, a computing device may comprise, for example, any of a wide variety of digital electronic devices, including but not limited to desktop and/or notebook computers, high definition televisions, Digital Versatile Disks (DVDs) and/or other optical disk players and/or recorders, gaming consoles, satellite television receivers, cellular telephones, tablet devices, wearable devices, personal digital assistants, mobile audio and/or video playback and/or recording devices, or any combination of the foregoing. Additionally, processes such as those described with reference to the figures and/or otherwise may also be performed and/or affected, in whole or in part, by computing devices and/or network devices unless specifically stated otherwise. Devices, such as computing devices and/or network devices, may vary in terms of capabilities and/or features. The claimed subject matter is intended to cover various possible variations. For example, an apparatus may include a numeric keypad and/or other display of limited functionality, such as a Liquid Crystal Display (LCD) for displaying text. In contrast, however, as another example, a web-enabled device may include a physical and/or virtual keyboard, mass storage, one or more accelerometers, one or more gyroscopes, Global Positioning System (GPS) and/or other location-recognition type capabilities, and/or a display with a higher degree of functionality, such as a touch-sensitive color 2D or 3D display.
As previously suggested, communication between the computing device and/or the network device and the wireless network may be in accordance with known and/or yet to be developed network protocols including, for example, global system for mobile communications (GSM), enhanced data rates for GSM evolution (EDGE), 802.11b/g/n/h, etc., and/or Worldwide Interoperability for Microwave Access (WiMAX). The computing device and/or networked device may also have a Subscriber Identity Module (SIM) card, which may include, for example, a detachable or embedded smart card capable of storing the user's subscription content, and/or also capable of storing a contact list. The user may own the computing device and/or the network device or may otherwise be the user, such as a primary user. The device may be assigned an address by a wireless network operator, a wired network operator, and/or an Internet Service Provider (ISP). For example, the address may include a national or international telephone number, an Internet Protocol (IP) address, and/or one or more other identifiers. In other embodiments, the computing and/or communication network may be implemented as a wired network, a wireless network, or any combination of these.
The computing and/or networking devices may include and/or may execute a variety of now known and/or later developed operating systems, derivatives and/or versions thereof, including computer operating systems such as Windows, iOS, and Linux, and mobile operating systems such as iOS, Android, Windows Mobile, and the like. The computing device and/or network device may include and/or may execute a variety of possible applications, such as a client software application that enables communication with other devices. For example, one or more messages (e.g., content) may be communicated over one or more protocols suitable for communication of e-mail, Short Message Service (SMS), and/or Multimedia Message Service (MMS), as now known and/or later developed, including over a network formed at least in part by a portion of a computing and/or communication network, such as a social network, including but not limited to Facebook, LinkedIn, Twitter, Flickr, and/or Google+, to provide just a few examples. The computing and/or network devices may also include executable computer instructions to process and/or communicate digital content, such as textual content, digital multimedia content, and so forth. The computing and/or network devices may also include executable computer instructions to perform a variety of possible tasks, such as browsing, searching, playing various forms of digital content, including locally stored and/or streamed video, and/or gaming, such as, but not limited to, fantasy sports leagues. The foregoing is provided merely to illustrate that the claimed subject matter is intended to include a wide variety of possible features and/or capabilities.
In fig. 15, computing device 1502 may provide one or more sources of executable computer instructions, for example, in the form of physical states and/or signals (e.g., stored in memory states). The computing device 1502 may communicate with the computing device 1504 via a network connection, such as over a network 1508. As previously mentioned, a connection, while physical, may not necessarily be tangible. Although computing device 1504 of fig. 15 shows various tangible physical components, the claimed subject matter is not limited to computing devices having only these tangible components, as other implementations and/or embodiments may include alternative arrangements that may include additional or fewer tangible components that, for example, operate differently to achieve similar results. Rather, examples are provided for illustration only. The claimed subject matter is not intended to be limited in scope by the illustrative examples.
The memory 1522 may include any non-transitory storage mechanism. The memory 1522 may include, for example, a main memory 1524 and a secondary memory 1526, where additional memory circuits, mechanisms, or combinations thereof may be used. The memory 1522 may include, for example, random access memory, read only memory, and the like, such as in the form of one or more storage devices and/or systems, e.g., disk drives including optical disk drives, magnetic tape drives, solid state memory drives, and the like, to name a few.
The memory 1522 may be utilized to store programs of executable computer instructions. For example, processor 1520 may retrieve executable instructions from memory and then execute the retrieved instructions. Memory 1522 may also include a memory controller for accessing device-readable media 1540, which device-readable media 1540 may carry and/or may make accessible digital content, which may include, for example, code and/or instructions executable by processor 1520 and/or some other device capable of executing computer instructions (such as a controller, as one example). Under the direction of the processor 1520, a non-transitory memory, such as a memory unit storing a physical state (e.g., memory state), e.g., a program comprising executable computer instructions, may be executed by the processor 1520 and capable of generating signals to communicate via a network, for example, as previously described. The generated signal may also be stored in a memory, as also previously proposed.
The memory 1522 may store electronic files and/or electronic documents, such as those associated with one or more users, and may also include a computer-readable medium that may carry and/or be made accessible by content including code and/or instructions, such as executable by the processor 1520 and/or some other device capable of executing computer instructions (such as a controller, as one example). As previously mentioned, the term electronic file and/or the term electronic document is used throughout this document to refer to a set of stored memory states and/or a set of physical signals that are associated in a manner to thereby form an electronic file and/or an electronic document. That is, it is not intended to implicitly refer to a particular syntax, format, and/or scheme, e.g., used with respect to a set of associated memory states and/or a set of associated physical signals. It is also noted that the association of memory states, for example, may be in a logical sense, and not necessarily in a physical, tangible sense. Thus, while the signal and/or state components of an electronic file and/or electronic document are to be logically associated, they may be stored in one embodiment in one or more different places, such as may be present in tangible physical memory.
Algorithmic descriptions and/or symbolic representations are examples of signal processing and/or techniques used by those of ordinary skill in the relevant art to convey the substance of their work to others of ordinary skill in the art. An algorithm is, in the context of this patent application and generally, considered to be a self-consistent sequence of operations and/or similar signal processing leading to a desired result. In the context of this patent application, operations and/or processing involve physical manipulation of physical quantities. Typically, although not necessarily, such quantities may take the form of electrical and/or magnetic signals and/or states capable of being stored, transferred, combined, compared, processed and/or otherwise manipulated, such as electronic signals and/or states that are components of digital content in various forms, such as signal measurements, text, images, video, audio, and so forth.
It has proven convenient at times, principally for reasons of common usage, to refer to such physical signals and/or physical states as bits, values, elements, parameters, symbols, characters, terms, numbers, values, measurements, contents, or the like. It should be understood, however, that all of these and/or similar terms are to be associated with the appropriate physical quantities and are merely convenient labels. Unless specifically stated otherwise, as apparent from the preceding discussion, it is appreciated that throughout this specification, discussions utilizing terms such as "processing," "computing," "calculating," "determining," "establishing," "obtaining," "identifying," "selecting," "generating," or the like, can refer to the action and/or processes of a particular apparatus, such as a special purpose computer and/or a similar special purpose computing and/or networking apparatus. Thus, in the context of this specification, a special purpose computer and/or similar special purpose computing and/or networking device is capable of processing, manipulating and/or transforming signals and/or states, typically in the form of physical electronic and/or magnetic quantities, within the memories, registers and/or other storage, processing and/or display devices of the special purpose computer and/or similar special purpose computing and/or networking device. In the context of this particular patent application, the term "particular device" therefore includes general purpose computing and/or network devices, such as a general purpose computer (once it is programmed to perform particular functions, such as executing according to program software instructions), as described above.
In some cases, the operation of a memory device, such as a change in state from a binary one to a binary zero (or vice versa), may include a transformation, such as a physical transformation, for example. For a particular type of memory device, such a physical transformation may include a physical transformation of an article to another, different state or thing. For example, and without limitation, for certain types of memory devices, a change in state may involve an accumulation and/or storage of charge or a release of stored charge. Similarly, in other memory devices, a change in state may include a physical change, such as a change in magnetic orientation. Likewise, a physical change may include a change in molecular structure, such as from a crystalline form to an amorphous form, or vice versa. In still other memory devices, a change in physical state may involve quantum mechanical phenomena, such as superposition, entanglement, and so forth, which may involve, for example, a quantum bit (qubit). The foregoing is not intended as an exhaustive list of all examples in which a change in state from a binary one to a binary zero (or vice versa) in a memory device may include a transformation, such as a physical but non-transitory transformation. Rather, the foregoing is intended as illustrative examples.
Referring again to fig. 15, the processor 1520 may include one or more circuits, such as digital circuits, to perform at least a portion of the computing procedure and/or process. By way of example, and not limitation, the processor 1520 may include one or more processors, such as a controller, microprocessor, microcontroller, application specific integrated circuit, digital signal processor, programmable logic device, field programmable gate array, or the like, or any combination of these. In various implementations and/or embodiments, processor 1520 may perform signal processing, typically substantially in accordance with executable computer instructions retrieved, to, for example, manipulate signals and/or states, construct signals and/or states, etc., wherein the signals and/or states generated in such a manner are, for example, communicated to and/or stored in a memory.
Fig. 15 also illustrates the device 1504 as including a component 1532 that is operable in conjunction with an input/output device, for example, such that signals and/or states may be suitably communicated between devices, such as the device 1504 and the input device and/or the device 1504 and the output device. A user may utilize an input device such as a computer mouse, a stylus, a trackball, a keyboard, and/or any other similar device capable of receiving user actions and/or motions as input signals. Similarly, the user may utilize an output device, such as a display, printer, etc., and/or any other device capable of providing signals to the user and/or generating stimuli (such as visual stimuli, audio stimuli, and/or other similar stimuli).
In an embodiment, an apparatus may comprise: at least one processor for tracking signals and/or states representing behavioral summary content of a particular user, the behavioral summary content comprising a plurality of parameters representing a current behavioral state or biological state, or a combination thereof, of the particular user; at least one memory for storing tracked signals and/or states representing behavior content; wherein the at least one processor is configured to perform one or more machine learning operations to determine one or more relationships between the tracked behavioral summary content and a bioavailability or balance of one or more particular substances within a body of a particular user, or a combination thereof. In particular implementations, the at least one processor is further configured to generate one or more recommendations for supplements related to a particular one or more substances for a particular user, the one or more recommendations being for improvements in a subsequent state of the particular user, wherein the at least one processor is configured to generate the one or more recommendations based at least in part on current behavioral summary content of the particular user or based at least in part on one or more determined relationships, or a combination of these. In particular implementations, the behavioral summary content may include one or more parameters that represent eye movements of a particular user. In particular implementations, to perform one or more machine learning operations, at least one processor is configured to perform one or more training operations based at least in part on input to be obtained from one or more users, wherein the input includes content representative of ingestion of supplements related to a particular one or more substances by a particular user. 
In particular implementations, the one or more particular substances within the body of a particular user include gamma-aminobutyric acid (GABA) or glutamate, or a combination thereof. In particular implementations, the at least one processor is configured to perform one or more machine learning operations to determine one or more relationships between one or more parameters representative of eye movement and a balance between GABA and glutamate within a body of a particular user. In particular implementations, the one or more particular substances within the body of a particular user include 5-methyltetrahydrofolate (5-MTHF). In particular implementations, the at least one processor is configured to perform one or more machine learning operations to determine one or more relationships between one or more parameters representative of eye movement and bioavailability of 5-MTHF within a body of a particular user. In particular implementations, the behavioral summary content includes one or more parameters representing a voice tone, an emotional analysis, a volume, a frequency, a pitch, or a timbre of a particular user, or any combination of these, and wherein the one or more particular substances within the body of the particular user include 5-MTHF. In particular implementations, the at least one processor is further configured to track signals and/or states representing environmental noise associated with a particular user, and the at least one processor is configured to perform one or more additional machine learning operations to determine one or more relationships between the tracked environmental noise and changes in balance of one or more hormones, including testosterone, estrogen, or progestin, or a combination thereof.
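One such machine learning operation may be illustrated, purely as a hypothetical sketch, by an ordinary least-squares fit relating a single eye-movement parameter (blink rate) to a measured GABA/glutamate balance; the function name and data values are invented for illustration:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b over one eye-movement feature."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return slope, mean_y - slope * mean_x

# Hypothetical training input: blink rate (Hz) vs. measured GABA/glutamate ratio.
blink_rate = [0.2, 0.3, 0.4, 0.5, 0.6]
balance    = [1.10, 1.05, 1.00, 0.95, 0.90]

slope, intercept = fit_line(blink_rate, balance)
# A negative slope indicates a learned inverse relationship between the
# eye-movement parameter and the substance balance for this (invented) data.
assert slope < 0
```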
An embodiment may include a method comprising: tracking, via at least one processor, a signal and/or state representing behavioral summary content of a particular user, wherein the behavioral summary content includes a plurality of parameters representing a current behavioral state or a biological state, or a combination thereof, of the particular user; storing the tracked signals and/or states representing the behavioral summary content in at least one memory; and performing, at least in part via the at least one processor, one or more machine learning operations to determine one or more relationships between the tracked behavioral summary content and a bioavailability or balance of one or more particular substances within a body of a particular user, or a combination thereof. In particular implementations, the method may further include generating, via at least one processor, one or more recommendations for a particular user for a supplement related to a particular one or more substances based at least in part on current behavioral summary content of the particular user or based at least in part on one or more determined relationships, or a combination thereof, wherein the one or more recommendations are for an improvement in a subsequent state of the particular user. In particular implementations, the behavioral summary content includes one or more parameters that represent eye movements of a particular user. In particular implementations, performing one or more machine learning operations includes performing one or more training operations based at least in part on input obtained from one or more users, where the input obtained from the one or more users includes content representing ingestion of supplements related to a particular one or more substances by a particular user. In particular implementations, the one or more particular substances within the body of a particular user may include gamma-aminobutyric acid (GABA) or glutamate, or a combination thereof.
In particular implementations, determining one or more relationships between the tracked behavioral summary content and the bioavailability or balance of one or more particular substances within the body of the particular user, or a combination thereof, may include determining one or more relationships between one or more parameters indicative of eye movement and the balance between GABA and glutamate within the body of the particular user. In particular implementations, one or more particular substances within the body of a particular user may include 5-methyltetrahydrofolate. In particular implementations, determining one or more relationships between the tracked behavioral summary content and the bioavailability or balance, or a combination thereof, of one or more particular substances within the body of the particular user includes determining one or more relationships between one or more parameters indicative of eye movement and the bioavailability of 5-methyltetrahydrofolate within the body of the particular user. In particular implementations, the behavioral summary content includes one or more parameters representing a voice tone, emotional analysis, volume, frequency, pitch, or timbre of a particular user, or any combination of these, wherein the one or more particular substances within the body of the particular user include 5-MTHF. 
In particular implementations, the behavioral summary content includes one or more parameters indicative of one or more locations visited by the particular user, wherein determining one or more relationships between the tracked behavioral summary content and the bioavailability or balance of one or more particular substances within the body of the particular user, or a combination thereof, includes identifying one or more locations at which the glutamate/GABA balance is affected, at least in part by identifying one or more changes in the glutamate/GABA balance and identifying the one or more locations at which those changes occur.
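A hypothetical sketch of the location identification described above: detect changes in the glutamate/GABA balance across a chronological visit log and flag the locations at which the changes are observed (names, data, and the change threshold are illustrative assumptions only):

```python
def locations_with_balance_shift(samples, threshold=0.1):
    """Identify locations at which successive glutamate/GABA balance
    readings changed by more than `threshold`.

    `samples` is a chronological list of (location, balance) pairs."""
    flagged = set()
    for (loc_a, bal_a), (loc_b, bal_b) in zip(samples, samples[1:]):
        if abs(bal_b - bal_a) > threshold:
            # Flag the location where the change in balance was observed.
            flagged.add(loc_b)
    return flagged

# Hypothetical visit log: (location visited, measured glutamate/GABA balance).
log = [("home", 1.00), ("office", 1.02), ("arcade", 1.30), ("home", 1.05)]
assert locations_with_balance_shift(log) == {"arcade", "home"}
```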
In an embodiment, an apparatus may comprise: at least one memory for storing sensor content including signals and/or states obtained from one or more sensors; and behavioral content processing circuitry including machine learning circuitry to perform one or more specific machine learning operations to process the sensor content to generate behavioral summary content for at least one specific operator. In particular implementations, the sensor content processing circuitry may include a plurality of configurable sensor content processing units to individually perform particular sensor content processing operations. In particular implementations, the machine learning circuit may include a plurality of configurable machine learning units to individually perform particular machine learning techniques. In particular implementations, individual sensor content processing units or individual machine learning units, or a combination thereof, may be configurable at least in part according to one or more control signals generated by control circuitry at least in part in response to one or more sensor availability parameters, one or more sensor type parameters, one or more parameters describing a particular user, one or more environmental parameters, one or more behavioral profile designation parameters, or one or more parameters to be obtained from a decision-making system, or any combination thereof. In particular implementations, individual sensor content processing units may combine, adjust timing, reduce noise, convert from digital to analog, convert from analog to digital, or normalize one or more of the signals and/or states obtained from one or more sensors, or any combination of these. In particular implementations, the behavioral summary content of the at least one particular operator may include a plurality of parameters representing a substantially current behavioral state or biological state, or a combination thereof, of the at least one particular operator. 
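The configurable sensor content processing units described above might be sketched, under purely illustrative assumptions (unit names and control-parameter format are invented), as small processing callables selected by control parameters:

```python
def reduce_noise(samples):
    """Noise-reduction unit: 3-point trailing moving average."""
    out = []
    for i in range(len(samples)):
        window = samples[max(0, i - 2): i + 1]
        out.append(sum(window) / len(window))
    return out

def normalize(samples):
    """Normalization unit: scale samples into [0, 1]."""
    lo, hi = min(samples), max(samples)
    return [(s - lo) / (hi - lo) if hi > lo else 0.0 for s in samples]

# Registry of individually configurable sensor content processing units.
UNITS = {"noise_reduction": reduce_noise, "normalization": normalize}

def configure_pipeline(control_params):
    """Select processing units according to control signals/parameters
    (e.g., sensor type or sensor availability parameters)."""
    return [UNITS[name] for name in control_params if name in UNITS]

pipeline = configure_pipeline(["noise_reduction", "normalization"])
data = [0.0, 10.0, 0.0, 10.0]
for unit in pipeline:
    data = unit(data)
assert min(data) == 0.0 and max(data) == 1.0
```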
In particular implementations, the plurality of parameters representing the substantially current behavioral or biological state, or a combination thereof, of the at least one particular operator may include one or more parameters representing: focus, excitement, anger, fear, fatigue, dehydration, or concentration/distraction, or any combination of these. In particular implementations, the plurality of parameters representing the substantially current behavioral or biological state, or a combination thereof, of the at least one particular operator may include one or more parameters representing: imminent breakthrough, quiet fondness, regret/acknowledgment of error, hunger, carelessness/precision, empathy, or social engagement level, or any combination of these. In particular implementations, the behavioral content processing circuit may repeatedly generate behavioral summary content. In particular implementations, the one or more sensors may include at least one camera, at least one microphone, at least one sweat and/or temperature sensor, at least one pressure sensor, at least one heart rate monitor, at least one hydration sensor, or at least one respiration sensor, or any combination of these. In particular implementations, the behavioral content processing circuitry may process sensor content from at least one microphone to generate one or more parameters representative of: volume, tone, or mood, or any combination of these. In particular implementations, the behavioral content processing circuitry may process sensor content from at least one camera to generate one or more parameters representative of: pupil dilation, focus, blink duration, or blink rate, or any combination of these. In particular implementations, the machine learning circuitry may process content representing one or more characteristics of a particular operator or user-generic content, or a combination thereof, to train the set of machine learning parameters.
An embodiment may include a method comprising: obtaining one or more signals and/or states from one or more sensors representative of sensor content; and processing the sensor content to generate behavioral summary content for the at least one particular operator at least in part with a behavioral content processor comprising machine learning circuitry that performs one or more particular machine learning operations. In particular implementations, the behavioral summary content of at least one particular operator may include a plurality of parameters representing a substantially current behavioral state or biological state, or a combination thereof, of the particular operator. In particular implementations, the machine learning circuit may include a plurality of operation units to respectively perform particular machine learning operations. Particular implementations may also include selecting one or more machine learning operational units to process sensor content based at least in part on the identity of a particular user. In particular implementations, the plurality of parameters representing the substantially current behavioral or biological state, or a combination thereof, of the at least one particular operator may include one or more parameters representing: focus, excitement, anger, fear, fatigue, dehydration, concentration/distraction, imminent breakthrough, quiet fondness, regret/acknowledgment of error, hunger, carelessness/precision, empathy, or social engagement level, or any combination of these. In particular implementations, processing the sensor content to generate behavioral summary content for a particular operator may include iteratively processing the sensor content to update the behavioral summary content. In particular implementations, iteratively processing the sensor content includes processing the sensor content at specified intervals.
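The iterative processing at specified intervals might be sketched as follows (a hypothetical illustration; the interval, the processing callable, and the sample readings are placeholders):

```python
import time

def run_at_intervals(process, interval_s, iterations):
    """Iteratively process sensor content at a specified interval,
    producing an updated behavioral summary on each pass."""
    summaries = []
    for _ in range(iterations):
        summaries.append(process())   # update the behavioral summary content
        time.sleep(interval_s)        # wait for the specified interval
    return summaries

# Hypothetical stream of fatigue readings derived from sensor content.
readings = iter([0.2, 0.4, 0.6])
result = run_at_intervals(lambda: {"fatigue": next(readings)}, 0.01, 3)
assert [s["fatigue"] for s in result] == [0.2, 0.4, 0.6]
```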
In particular implementations, processing the sensor content includes processing content obtained from at least one microphone to generate one or more parameters representative of: volume, tone, or mood, or any combination of these. In particular implementations, processing sensor content includes processing content obtained from at least one camera to generate one or more parameters representative of: pupil dilation, focus, blink duration, or blink rate, or any combination of these.
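Purely as an illustrative sketch of deriving microphone parameters such as volume and pitch, one might compute a root-mean-square volume and a zero-crossing pitch estimate (a deliberately crude approach, assumed here only for illustration; real implementations would differ):

```python
import math

def mic_parameters(samples, sample_rate):
    """Derive a volume parameter and a coarse pitch estimate from raw samples."""
    # Root-mean-square amplitude as a simple volume measure.
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    # Zero-crossing rate: a clean periodic signal crosses zero twice per cycle,
    # giving a rough fundamental-frequency estimate.
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0)
    duration = len(samples) / sample_rate
    return {"volume_rms": rms, "pitch_hz_est": crossings / (2 * duration)}

rate = 8000
tone = [math.sin(2 * math.pi * 440 * n / rate) for n in range(rate)]  # 1 s, 440 Hz
params = mic_parameters(tone, rate)
assert abs(params["pitch_hz_est"] - 440) < 5
```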
An embodiment may include an apparatus comprising: means for obtaining one or more signals and/or states from one or more sensors representative of sensor content; and means for processing the sensor content, including means for generating behavioral summary content for the at least one particular operator, wherein the behavioral summary content for the at least one particular operator includes a plurality of parameters representing a substantially current behavioral state or biological state, or a combination thereof, of the at least one particular operator. In particular implementations, the plurality of parameters representing the substantially current behavioral or biological state, or a combination thereof, of the at least one particular operator may include one or more parameters representing: focus, excitement, anger, fear, fatigue, dehydration, concentration/distraction, imminent breakthrough, quiet fondness, regret/acknowledgment of error, hunger, carelessness/precision, empathy, or social engagement level, or any combination of these.
In the foregoing description, various aspects of the claimed subject matter have been described. For purposes of explanation, specific details (e.g., number, system, and/or configuration) are set forth as examples. In other instances, well-known features are omitted and/or simplified in order not to obscure the claimed subject matter. While certain features have been illustrated and/or described herein, many modifications, substitutions, changes, and/or equivalents will now occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and/or changes as fall within the claimed subject matter.

Claims (24)

1. An apparatus, comprising:
at least one processor for obtaining a signal and/or state representing behavioral summary content of a particular user, the behavioral summary content comprising a plurality of parameters representing a current behavioral state or a biological state, or a combination thereof, of the particular user;
at least one memory for storing signals and/or states representing the behavioral summary content;
wherein the at least one processor is configured to generate one or more recommendations for the particular user based at least in part on the behavioral summary content or based at least in part on one or more parameters representing external factors, or a combination thereof, the one or more recommendations being for an improvement in a future state of the particular user.
2. The apparatus of claim 1, wherein the at least one processor is configured to obtain behavioral summary content for the particular user at least in part from a behavioral content processor.
3. The apparatus of claim 1 or claim 2, wherein the at least one processor is to iteratively obtain updated behavioral summary content.
4. The apparatus of any of claims 1 to 3, wherein the behavioral summary content of the particular user comprises a plurality of parameters representing: focus, excitement, anger, fear, fatigue, dehydration, or concentration/distraction, or any combination of these.
5. The apparatus of any preceding claim, wherein the behavioral summary content of the particular user comprises one or more parameters representative of: imminent breakthrough, quiet fondness, regret/acknowledgment of error, hunger, carelessness/accuracy, empathy, confusion, or level of social engagement, or any combination of these.
6. The apparatus of any preceding claim, wherein the one or more parameters representative of external factors comprise one or more parameters representative of: location, time period, presence, identity and/or status of an external individual, or general mood, or a combination of these.
7. The apparatus of any preceding claim, wherein the at least one processor is to perform one or more machine learning operations to determine one or more relationships and/or associations between the external factors and the behavioral summary content.
8. The apparatus of claim 7, wherein the at least one processor is to generate the one or more recommendations for the particular user based at least in part on the determined one or more relationships and/or associations between the external factors and the behavioral summary content.
9. The apparatus of claim 8, wherein the one or more parameters representative of external factors comprise one or more parameters representative of content to be consumed by the particular user, and wherein the at least one processor is to perform the one or more machine learning operations to determine one or more relationships and/or associations between content to be consumed by the particular user and the behavioral summary content to identify a quiet fondness of the particular user.
10. The apparatus of claim 8 or claim 9, wherein the one or more parameters representative of external factors include one or more parameters representative of content to be consumed by the particular user, and wherein the at least one processor is to perform the one or more machine learning operations to determine one or more relationships and/or associations between content to be consumed by the particular user and the behavioral summary content to select subsequent content to be presented to the user.
11. The apparatus of any of claims 8 to 10, wherein the one or more parameters representative of external factors comprise one or more parameters representative of content to be consumed by the particular user, and wherein, to generate the one or more recommendations for the particular user, the at least one processor is to perform the one or more machine learning operations to determine one or more relationships and/or associations between content to be consumed by the particular user and the behavioral summary content, wherein the one or more recommendations for the particular user comprise one or more actions related to: dehydration, hunger, or fatigue, or a combination of these.
12. A method, comprising:
obtaining, via at least one processor of at least one computing device, one or more signals and/or states representing behavioral summary content of a particular user, wherein the behavioral summary content includes a plurality of parameters representing a current behavioral state or biological state, or a combination thereof, of the particular user; and
generating, via the at least one processor, one or more recommendations for the particular user based at least in part on the behavioral summary content or based at least in part on one or more parameters representing external factors, or a combination thereof, wherein the one or more recommendations are substantially directed to improvements in the particular user's future state.
13. The method of claim 12, wherein obtaining the one or more signals and/or states representing the behavioral summary content comprises iteratively obtaining updated behavioral summary content.
14. The method of claim 12 or claim 13, wherein the behavioral summary content of the particular user includes a plurality of parameters representing: focus, excitement, anger, fear, fatigue, dehydration, concentration/distraction, imminent breakthrough, quiet fondness, regret/acknowledgment of error, possible focus, hunger, carelessness/accuracy, empathy, confusion, or level of social engagement, or any combination of these.
15. The method of any of claims 12 to 14, wherein the one or more parameters representative of external factors comprise one or more parameters representative of: location, time period, presence, identity and/or status of an external individual, or general mood, or a combination of these.
16. The method of any of claims 12 to 15, wherein generating the one or more recommendations comprises performing one or more machine learning operations via the at least one processor to determine one or more relationships and/or associations between the external factors and the behavioral summary content.
17. The method of claim 16, wherein the one or more parameters representative of external factors include one or more parameters representative of content currently consumed by the particular user, and wherein performing the one or more machine learning operations determines one or more relationships and/or associations between content currently consumed by the particular user and the behavioral summary content to identify a quiet fondness of the particular user.
18. The method of claim 16 or claim 17, wherein the one or more parameters representative of external factors include one or more parameters representative of content currently consumed by the particular user, and wherein performing the one or more machine learning operations determines one or more relationships and/or associations between content currently consumed by the particular user and the behavioral summary content to select and/or modify content for subsequent presentation to the user.
19. The method of any of claims 16 to 18, wherein the one or more parameters representative of external factors include one or more parameters representative of content currently consumed by the particular user, and wherein generating the one or more recommendations for the particular user includes performing the one or more machine learning operations to determine one or more relationships and/or associations between content currently consumed by the particular user and the behavioral summary content, wherein the one or more recommendations for the particular user include one or more actions related to: dehydration, hunger, or fatigue, or a combination of these.
20. An apparatus, comprising:
at least one processor for tracking a behavioral state or a biological state, or a combination thereof, of a particular operator based at least in part on signals and/or states obtained from a behavior processing unit representing behavioral summary content of the particular operator;
at least one memory for storing signals and/or states representing the behavioral profile content;
wherein the at least one processor is configured to detect a particular change in a behavioral state or a biological state, or a combination thereof, of the particular operator; and
wherein, at least partially in response to a detected change in the behavioral state or biological state, or a combination thereof, of the particular operator, the at least one processor is to initiate control of one or more aspects of a particular machine for a technical assistance task performed in conjunction with the particular operator, to transfer control of the one or more aspects of the particular machine at least partially away from the particular operator.
21. The apparatus of claim 20, wherein the detected change in the behavioral or biological state, or a combination thereof, of the particular operator comprises a change in a level of: anger, annoyance, regret, attention, concentration, accuracy, possible focus, or fatigue, or any combination of these.
22. The apparatus of claim 21, wherein the particular machine comprises a law enforcement weapon and wherein the particular operator comprises law enforcement personnel, and wherein the at least one processor is to initiate control of one or more aspects of a weapon firing system and/or a weapon aiming system to at least partially divert control of a weapon safety system and/or a weapon firing system from the particular operator.
23. The apparatus of claim 21 or claim 22, wherein the particular machine comprises a robotic surgical device and the particular operator comprises a surgeon, and wherein the at least one processor is to initiate control of one or more aspects of a surgical procedure to transfer control of the surgical procedure from the surgeon at least in part to the robotic surgical device.
24. The apparatus of any of claims 20 to 23, wherein the behavior processing unit comprises:
at least one memory for storing sensor content including signals and/or states obtained from one or more sensors; and
behavioral content processing circuitry comprising machine learning circuitry to perform one or more particular machine learning operations to process the sensor content to generate behavioral summary content for at least one particular operator, wherein the behavioral content processing circuitry further comprises a plurality of configurable sensor content processing units to individually perform particular sensor content processing operations, and wherein the machine learning circuitry comprises a plurality of configurable machine learning units to individually perform particular machine learning techniques, and wherein individual sensor content processing units or individual machine learning units, or combinations thereof, are configurable at least in part according to one or more control signals generated by control circuitry at least in part in response to one or more sensor availability parameters, one or more sensor type parameters, one or more parameters describing a particular user, one or more environmental parameters, one or more behavioral summary content designation parameters, or one or more parameters obtained from the decision-making system, or any combination of these.
CN201980029470.1A 2018-03-15 2019-03-12 Systems, devices, and/or processes for behavioral and/or biological state processing Pending CN112055865A (en)

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
US15/922,687 US11259729B2 (en) 2018-03-15 2018-03-15 Systems, devices, and/or processes for behavioral and/or biological state processing
US15/922,644 2018-03-15
US15/922,687 2018-03-15
US15/922,671 US20190287028A1 (en) 2018-03-15 2018-03-15 Systems, devices, and/or processes for behavioral content processing
US15/922,671 2018-03-15
US15/922,644 US10373466B1 (en) 2018-03-15 2018-03-15 Systems, devices, and/or processes for tracking behavioral and/or biological state
PCT/GB2019/050691 WO2019175569A1 (en) 2018-03-15 2019-03-12 Systems, devices, and/or processes for behavioral and/or biological state processing

Publications (1)

Publication Number Publication Date
CN112055865A true CN112055865A (en) 2020-12-08

Family

ID=65904468

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980029470.1A Pending CN112055865A (en) 2018-03-15 2019-03-12 Systems, devices, and/or processes for behavioral and/or biological state processing

Country Status (2)

Country Link
CN (1) CN112055865A (en)
WO (1) WO2019175569A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11928163B2 (en) 2022-02-11 2024-03-12 Arm Limited Apparatus, method and computer program for creating digital memories for a particular person

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001039097A2 (en) * 1999-11-21 2001-05-31 Living Software Applications Ltd. An interactive system for providing psychological stimulation tuned to personal issues
CN101379493A (en) * 2006-02-06 2009-03-04 索尼株式会社 Information recommendation system based on biometric information
US9183761B1 (en) * 2012-05-02 2015-11-10 Samepage, Inc. Behavior management platform
RU2014129016A (en) * 2011-12-16 2016-02-10 Конинклейке Филипс Н.В. CHRONOLOGICAL JOURNAL OF USER ACTIONS AND ASSOCIATED EMOTIONAL STATES
US20160086500A1 (en) * 2012-10-09 2016-03-24 Kc Holdings I Personalized avatar responsive to user physical state and context
WO2017078288A1 (en) * 2015-11-02 2017-05-11 삼성전자 주식회사 Electronic device and method for generating user profile
US20170200449A1 (en) * 2011-04-22 2017-07-13 Angel A. Penilla Methods and vehicles for using determined mood of a human driver and moderating vehicle response
EP3293691A1 (en) * 2016-09-09 2018-03-14 Sony Corporation System and method for providing recommendation on an electronic device based on emotional state detection

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040133453A1 (en) * 2002-11-27 2004-07-08 Jean-Philippe Jomini Method and system for providing at home health care service
US9149236B2 (en) * 2013-02-04 2015-10-06 Intel Corporation Assessment and management of emotional state of a vehicle operator
CN203438860U (en) * 2013-07-25 2014-02-19 长安大学 Fatigue driving active monitoring system based on pressure sensor
GB201404234D0 (en) * 2014-03-11 2014-04-23 Realeyes O Method of generating web-based advertising inventory, and method of targeting web-based advertisements
JP6778872B2 (en) * 2016-06-28 2020-11-04 パナソニックIpマネジメント株式会社 Driving support device and driving support method

Also Published As

Publication number Publication date
WO2019175569A1 (en) 2019-09-19

Similar Documents

Publication Publication Date Title
US11259729B2 (en) Systems, devices, and/or processes for behavioral and/or biological state processing
Lim et al. Emotion recognition using eye-tracking: taxonomy, review and current challenges
US11303976B2 (en) Production and control of cinematic content responsive to user emotional state
US10901509B2 (en) Wearable computing apparatus and method
US20210098110A1 (en) Digital Health Wellbeing
US20190287028A1 (en) Systems, devices, and/or processes for behavioral content processing
US9955902B2 (en) Notifying a user about a cause of emotional imbalance
US9805381B2 (en) Crowd-based scores for food from measurements of affective response
CN108574701B (en) System and method for determining user status
KR20240031439A (en) Personalized digital therapy methods and devices
Chassiakos et al. Current trends in digital media: How and why teens use technology
US11857507B2 (en) Infant feeding system
US20150332603A1 (en) Understanding data content emotionally
Troutman Integrating behaviorism and attachment theory in parent coaching
Heraz et al. Recognition of emotions conveyed by touch through force-sensitive screens: Observational study of humans and machine learning techniques
Spoladore et al. A semantic-enabled smart home for AAL and continuity of care
Rincon et al. Detecting emotions through non-invasive wearables
Mouchabac et al. Prevention of suicidal relapses in adolescents with a smartphone application: Bayesian network analysis of a preclinical trial using in silico patient simulations
US20190288837A1 (en) Systems, devices, and/or processes for omic content processing and/or communication
CN112055865A (en) Systems, devices, and/or processes for behavioral and/or biological state processing
Fadhil Towards automatic & personalised mobile health interventions: An interactive machine learning perspective
US10373466B1 (en) Systems, devices, and/or processes for tracking behavioral and/or biological state
Pravettoni et al. Cognitive science in telemedicine: from psychology to artificial intelligence
US10841299B2 (en) Systems, devices, and/or processes for omic content processing and/or partitioning
Patel et al. Mind gymnastics for good intellectual health of elderly people-MindGym

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination