WO2016109246A1 - Analyzing emotional state and activity based on unsolicited media information - Google Patents

Analyzing emotional state and activity based on unsolicited media information

Info

Publication number
WO2016109246A1
Authority
WO
WIPO (PCT)
Prior art keywords
activity
individual
emotional state
physiological
additional
Prior art date
Application number
PCT/US2015/066573
Other languages
French (fr)
Inventor
Curtis Lee
Naomi Furgiuele
Richard FOUGERE
Sophie Edgar
Ryan Walsh
Original Assignee
Johnson & Johnson Consumer Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Johnson & Johnson Consumer Inc. filed Critical Johnson & Johnson Consumer Inc.
Publication of WO2016109246A1


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235 Details of waveform analysis
    • A61B5/7246 Details of waveform analysis using correlation, e.g. template matching or determination of similarity
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0002 Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B5/0015 Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system
    • A61B5/0022 Monitoring a patient using a global network, e.g. telephone networks, internet
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/0205 Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1118 Determining activity level
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/165 Evaluating the state of mind, e.g. depression, anxiety
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/30 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/70 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mental therapies, e.g. psychological therapy or autogenous training
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2560/00 Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
    • A61B2560/02 Operational features
    • A61B2560/0242 Operational features adapted to measure environmental factors, e.g. temperature, pollution
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1112 Global tracking of patients, e.g. by using GPS

Definitions

  • the present invention relates to health monitoring, and more specifically, but not exclusively, to analyzing emotional state and activity of an individual.
  • An activity tracker is usually a device that includes or is connected to one or more sensors that monitor various aspects of an activity.
  • the one or more sensors can monitor aspects such as distance walked or run, calorie consumption, and in some cases heartbeat and quality of sleep.
  • these devices sync to a computer or smartphone over a wired or wireless connection for long-term data tracking. Further, in some cases, these devices connect, wired or wirelessly, to a network so that various aspects of the activity can be shared with other individuals on social media websites.
  • FIG. 1 is a simplified schematic diagram of a system according to one embodiment that is configured to monitor activity and emotional state of an individual;
  • FIG. 2 is a simplified schematic diagram of a computing device according to one embodiment that may be used to implement any of the computing devices in the system of Fig. 1;
  • Fig. 3 shows Table I, which illustrates an exemplary database of physiological baseline profiles according to one embodiment that may be stored in the memory of the device of Fig. 2;
  • Fig. 4 shows Table II, which illustrates an exemplary database of physiological baseline profiles according to another embodiment that may be stored in the memory of the device of Fig. 2;
  • Fig. 5 shows Table III, which illustrates an exemplary database of time-indexed physiological profiles according to one embodiment that may be stored in the memory of the device of Fig. 2;
  • Fig. 6 is a simplified flow diagram of a method according to one embodiment that may be implemented by the device of Fig. 2 to analyze an activity engaged in by an individual and an emotional state of the individual during the activity;
  • Fig. 7 is a simplified flow diagram of a method according to one embodiment that may be implemented by the device of Fig. 2 to detect a change in a sensor signal;
  • Fig. 8 is a simplified flow diagram of a method according to one embodiment that may be implemented by the device of Fig. 2 to predict one or both of an activity engaged in by an individual and an emotional state of the individual during the activity;
  • FIG. 9 is a simplified flow diagram of a method according to another embodiment;
  • Fig. 10 is a simplified flow diagram of a method according to one embodiment that may be implemented by the device of Fig. 2 to observe one or both of an activity and an emotional state of an individual using unsolicited media information;
  • Fig. 11 is a simplified flow diagram of a method according to another embodiment that may be implemented by the device of Fig. 2 to analyze an activity engaged in by an individual and an emotional state of the individual during the activity.
  • Various aspects of the disclosure relate to systems, devices, and methods for analyzing not just activities engaged in by an individual, but also the emotional state of the individual while engaging in each activity (i.e., how the individual feels while engaging in the activity).
  • the activities analyzed can include broader categories of activities such as (without limitation) exercising, relaxing, socializing, working, and so on.
  • the activities can in addition or alternatively include more refined activities within each broad category. For example, exercising can include running, walking, biking, and so on; relaxing can include sleeping (including the various stages of rapid eye movement (REM) and non-REM sleep), reading, watching television, and so on; and socializing can include interacting with specific individuals, attending a party, and so on.
  • the emotional states analyzed can include broader categories of emotional states and/or more refined emotional states.
  • Broader categories of emotional states can include (without limitation) stressed, relaxed, happy, unhappy, and so on.
  • More refined emotional states can include, for example, ashamed, anxious, distressed, depressed, blissful, deeply relaxed, excited, sick, tired, and so on.
  • At least some emotional states can have physical manifestations that can be detected using physiological sensors.
  • suggestions can be made to enable the individual to alter his or her activities to improve the individual's health.
  • suggestions can be made to avoid certain activities that tend to cause the individual to be stressed or unhappy, or otherwise adversely affect the individual's health, while other suggestions can be made to engage more frequently in activities that cause the individual to be happy, or otherwise positively affect the individual's health.
  • stress has been shown to adversely affect health. Stress can exacerbate certain medical conditions and illnesses such as (without limitation) diabetes, and can even lead to, or accelerate, the development of a medical condition or illness.
  • stress has been shown to promote the increase of blood glucose levels of a person with diabetes and has been shown to expedite the onset of diabetes. Stress has also been shown to slow healing after a surgery or injury. However, often an individual is not able to determine if he or she is stressed.
  • a system 100 that monitors changes in the activity and emotional state of a first individual 104 over time.
  • the system 100 includes one or more computing devices 106 associated with the first individual 104 and one or more sensors 108 associated with the first individual 104 that are in communication with the one or more computing devices 106.
  • the system 100 can further include a communications network 102 such as the internet configured to communicate via a wired or wireless connection with at least one of the one or more computing devices 106 and the one or more sensors 108.
  • the system 100 can yet further include a server 116 configured to communicate via a wired or wireless connection with at least one of the one or more computing devices 106, the one or more sensors 108, and the communications network 102.
  • the system 100 can yet still further include one or more computing devices 112 and one or more sensors 114 that are (i) associated with a second individual 110 and (ii) configured to communicate via a wired or wireless connection with at least one of the one or more computing devices 106, the one or more sensors 108, the communications network 102, and the server 116.
  • the one or more computing devices 106 associated with the first individual 104 can include any suitable computing device, such as (without limitation) one or more of a personal computer 106a, a mobile phone 106b, a tablet 106c, and a camera 106d.
  • the one or more computing devices 106 can communicate with one another over a local area network or can communicate with one another over a network that includes one or more of the server 116 and the communications network 102.
  • the one or more sensors 108 associated with the first individual 104 include one or more physiological sensors 108a-b and can further include one or more environmental sensors 108c. Each of the one or more sensors 108 associated with the first individual 104 can be implemented as part of one or more computing devices 106 or can be implemented separate from the one or more computing devices 106. Each sensor 108 can be a wearable sensor that is worn by (e.g., physically attached to) the first individual 104, a sensor that is otherwise supported by the first individual 104 (e.g., supported in a pocket of clothing worn by the individual), or a sensor that is spaced apart from, but focused on, the first individual 104. In Fig. 1, three sensors, including two physiological sensors 108a and 108b and one environmental sensor 108c, are shown for illustrative purposes. However, the one or more sensors 108 may include as few as one sensor 108 or more than two sensors 108.
  • the one or more physiological sensors 108a-b can include any suitable physiological sensor that characterizes one or more physiological properties of the first individual 104.
  • the one or more physiological sensors 108a-b can include (without limitation) one or more of heart rate sensors, respiration sensors, galvanic skin response sensors, skin temperature sensors, electrocardiography (e.g., ECG or EKG) sensors, electromyography (EMG) sensors, motion sensors, oxygen sensors, blood pressure sensors, and global positioning sensors (GPS).
  • the one or more physiological properties characterized by the one or more physiological sensors can include (without limitation) one or more of heart rate, heart rate variability, respiration, pulse profile, perspiration, skin temperature, blood temperature, electrical activity of the heart, electrical activity of muscles, motion of the individual, acceleration of the individual, body position of the individual, oxygen saturation of the blood, blood pressure, and location.
  • the one or more physiological sensors 108a-b can characterize the one or more physiological properties as (without limitation) one or more of a waveform, a maximum value and minimum value, an average value, a median value, an instantaneous value, and so on.
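As an illustrative sketch only, not part of the claimed embodiments, the characterizations listed above might be computed from a buffered sensor signal as follows; the function name and dictionary keys are hypothetical:

```python
from statistics import mean, median

def characterize(samples):
    """Summarize a raw physiological waveform (e.g., a buffer of heart-rate
    readings) as the characterizations described above: maximum and minimum
    values, average, median, and the most recent (instantaneous) value."""
    return {
        "max": max(samples),
        "min": min(samples),
        "average": mean(samples),
        "median": median(samples),
        "instantaneous": samples[-1],  # most recent reading
    }

# Hypothetical heart-rate samples (beats per minute):
summary = characterize([62, 64, 70, 96, 88, 75])
```

A waveform characterization would instead retain the raw sample buffer; the dictionary form above covers the scalar interpretations named in the passage.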
  • physiological sensors include (without limitation) (i) ViSi® Mobile, which is capable of characterizing heart rate, respiration, pulse oximetry, blood pressure, and skin temperature, (ii) BodyMedia® Sensewear®, which is capable of characterizing body motion, temperature, and galvanic skin response, and (iii) Dexcom G4 Platinum®, which is capable of continuously characterizing blood glucose levels.
  • the one or more environmental sensors 108c associated with the first individual 104 can include any suitable environmental sensor that characterizes an environmental property in the immediate vicinity of the first individual 104.
  • the at least one environmental sensor 108c can characterize environmental properties such as (without limitation) one or more of temperature, pressure, speed, quality, and moisture level of the air that is in contact with the first individual 104.
  • the at least one environmental sensor 108c can further characterize one or more of the intensity of light rays that are in contact with the first individual 104 and the level of noise that reaches the first individual's ears.
  • the one or more computing devices 112 associated with the second individual 110 can include any suitable computing device, such as (without limitation) one or more of a personal computer 112a, a mobile phone 112b, a tablet 112c, and a camera 112d.
  • the one or more computing devices 112 can communicate with one another over a local area network or can communicate with one another over a network that includes one or more of the server 116 and the communications network 102.
  • the one or more sensors 114 associated with the second individual 110 include one or more physiological sensors 114a-b and can further include one or more environmental sensors 114c. Each of the one or more sensors 114 associated with the second individual 110 can be implemented as part of the one or more computing devices 112 or can be implemented separate from the one or more computing devices 112.
  • the one or more physiological sensors 114a-b can include any suitable physiological sensor that characterizes one or more
  • physiological properties of the second individual 110 including (without limitation) the physiological sensors and physiological properties discussed above in relation to the one or more physiological sensors 108a-b.
  • the one or more environmental sensors 114c can include any suitable environmental sensor that characterizes an environment in the immediate vicinity of the second individual 110.
  • the one or more environmental sensors 114c can characterize (without limitation) one or more of temperature, pressure, speed, and moisture level of the air that is in contact with the second individual 110.
  • the one or more environmental sensors 114c can further characterize one or more of the intensity of light rays that are in contact with the second individual 110 and the level of noise that reaches the second individual's ears.
  • the one or more sensors 114 may include as few as one sensor 114 or more than two sensors 114.
  • the server 116 can communicate with one or more of the computing devices 106 and 112 directly or can communicate with one or more of the computing devices 106 and 112 through the communications network 102.
  • the one or more sensors 108 can communicate directly or indirectly with at least one of the one or more computing devices 106, the communications network 102, and the server 116.
  • the one or more sensors 108 can communicate with the server 116 indirectly through the one or more computing devices 106.
  • the one or more sensors 114 communicate directly or indirectly with at least one of the one or more computing devices 112, the communications network 102, and the server 116.
  • a direct communication refers to a communication between a first communications device and a second communications device without communicating with an intervening communication device between the first and second communications devices.
  • an indirect communication refers to a communication between a first communications device and a second communications device in which the communication occurs through an intervening communication device between the first and second communications devices.
  • a communications device, as used herein, refers to a device that transmits and/or receives communications signals, including (without limitation) the computing devices 106 and 112, the sensors 108 and 114, the server 116, and the network 102.
  • a computing device 200 is shown according to one embodiment that may be used to implement any of (i) the one or more computing devices 106 and (ii) the server 116.
  • the computing device 200 includes a receiver 202, memory 206, and a processor 204 that is in communication with the receiver 202 and memory 206.
  • the computing device 200 can further include at least one notification device 214 in communication with the processor 204 and/or at least one user input device 216 in communication with the processor 204.
  • the computing device 200 can include one or more of the sensors 108 or some or all of the sensors 108 can be separate from the computing device 200.
  • the receiver 202 communicates wired or wirelessly with the one or more physiological sensors 108a-b to receive sensor signals 220.
  • the receiver 202 can further communicate wired or wirelessly with the communications network 102 to receive network- accessible media information 222 that relates to activity and emotional state of the first individual 104.
  • the receiver 202 can yet further communicate wired or wirelessly with the at least one environmental sensor 108c to receive environmental information 228 that characterizes an environmental property perceived by the first individual 104.
  • the network-accessible media information 222 includes unsolicited, first-party information meaning that the information (i) is generated by the first individual 104, (ii) concerns an activity and emotional state of the first individual 104, and (iii) is generated without the processor 204 prompting the first individual 104 to characterize either of the activity or emotional state.
  • the unsolicited, first-party information can include (without limitation) information that can be mined over the communications network 102 from one or more sources external to the computing device 200, including (without limitation) social media websites such as Facebook®, Twitter®, LinkedIn®, Pinterest®, Google+®, Reddit®, MySpace®, and Instagram®, internet email sites such as Gmail® and Hotmail®, exercise websites, food and diet sites such as Yelp and Foursquare®, online ordering, bank accounts, and media streaming sites such as Netflix® and Hulu®.
  • the network-accessible media information 222 can further include third-party information that (i) is generated by individuals other than the first individual 104, including the second individual 110, and (ii) relates to an activity and emotional state of the first individual 104.
  • the third-party information can include one or both of information that is solicited from the third party and unsolicited information that can be mined over the communications network 102 from one or more sources including (without limitation) the one or more network sources listed above. Further, the third-party information can be mined from the one or more computing devices 112 and from the one or more sensors 114 associated with the second individual 110.
  • the at least one notification device 214 provides notifications 224 to the first individual 104 concerning activities and emotional states of the first individual 104.
  • the notifications 224 may be provided as (without limitation) one or more of a visual notification, an audible notification, and a tactile notification.
  • the at least one notification device 214 can include (without limitation) one or more of a display screen that provides the notifications 224 by displaying a message, a light that provides notification by illuminating, a speaker that provides notification by playing an audible sound, a vibration alert motor that provides notification by vibrating the computing device 200, and any other suitable device for notifying the first individual 104.
  • the at least one notification device 214 can also prompt the first individual 104 for information concerning activities and emotional states of the first individual 104.
  • the computing device 200 can further include at least one user input device 216 that facilitates the provision of user input 226 from the first individual 104 to the processor 204.
  • the at least one user input device 216 can include (without limitation) one or more of a touchscreen display screen, a keyboard, a keypad, a mouse, a microphone, and any other suitable device that enables the first individual 104 to provide the user input 226.
  • the at least one user input device 216 can be used by the first individual 104 to provide the user input 226 to the processor 204 in response to prompts provided by the processor 204 or by another device such as computing device 106 or a sensor 108.
  • the first individual 104 can respond to prompts displayed by at least one notification device 214 that prompt the first individual 104 for information regarding one or both of the activity and emotional state of the first individual 104 at a particular time period.
  • Information that is input by the first individual 104 in response to prompting by the processor 204 or by another device is referred to herein as solicited, first-party information.
  • the memory 206 can be, for example, any suitable non-transitory and tangible storage medium.
  • the memory 206 stores a program 208 that is run by the processor 204 to analyze changes in activity and emotional state of the first individual 104.
  • the memory 206 also stores one or more databases for the first individual 104, which includes a database of one or more baseline physiological profiles 210 for the first individual 104 and can include a database of one or more time-indexed physiological profiles 212 for the first individual 104.
  • the memory 206 can further store media information 218 that can be used by the processor 204 to analyze the activity and emotional state of the first individual 104. This locally-stored media information 218 includes information that concerns an activity and emotional state of the individual 104.
  • the locally-stored media information 218 can be solicited information that was produced in response to a prompt generated by the processor 204 or unsolicited information that was not produced in response to a prompt generated by the processor 204. Further, the locally-stored media information 218 can include information that is generated by any one of the first individual 104 (i.e., first-party information), another individual (i.e., third-party information), and an application run by the device 200. Examples of the locally-stored media information 218 include (without limitation) internet browsing history, shopping lists, emails, calendar appointments, text messages, pictures, video recordings, and audio recordings.
  • Each baseline physiological profile 210 summarizes a typical physiological response for the first individual 104 during at least one activity and at least one emotional state.
  • Each baseline physiological profile 210 can be either complete or incomplete.
  • Each complete baseline physiological profile 210 includes (i) at least one characterization of a physiological property, (ii) at least one expected activity that results in the at least one characterization, and (iii) at least one expected emotional state that is expected during the at least one activity.
  • Each characterization of a physiological property is a mathematical interpretation or representation of the one or more sensor signals 220.
  • the at least one expected activity is an activity engaged in by the first individual 104 that normally results in the at least one characterization of a physiological property.
  • the at least one expected emotional state is an emotional state of the first individual 104 that normally results in the at least one characterization of a physiological property.
  • Each incomplete baseline physiological profile 210 excludes one or more, but not all, of (i) at least one characterization of a physiological property, (ii) at least one expected activity that results in the at least one characterization, and (iii) at least one expected emotional state that is expected during the at least one activity.
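The complete/incomplete distinction above can be illustrated with a minimal, hypothetical data structure; the class and field names are assumptions for illustration, not part of the disclosure:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BaselineProfile:
    """Hypothetical container for one baseline physiological profile 210:
    a characterization of a physiological property, the expected activity
    that produces it, and the expected emotional state during that activity."""
    characterization: Optional[dict] = None
    expected_activity: Optional[str] = None
    expected_emotional_state: Optional[str] = None

    def is_complete(self) -> bool:
        # A complete profile has all three elements; an incomplete
        # profile is missing one or more, but not all, of them.
        return None not in (self.characterization,
                            self.expected_activity,
                            self.expected_emotional_state)

complete = BaselineProfile({"hr_upper": 150, "hr_lower": 110},
                           "running", "excited")
incomplete = BaselineProfile(expected_activity="walking")
```

An incomplete profile of this kind could later be completed, for example, as characterizations, activities, or emotional states are learned over time.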
  • each row corresponds to one baseline physiological profile 210
  • the second column corresponds to an expected activity
  • the third column corresponds to an expected emotional state
  • Each complete baseline physiological profile 210 includes a single activity and a single emotional state.
  • the database 300 includes N baseline physiological profiles 210, where the number N is greater than one and can optionally increase over time as profiles are added to the database 300.
  • the database 300 includes one or more complete baseline physiological profiles 210, where each complete baseline physiological profile 210 includes an activity, an emotional state, and a threshold or thresholds for a physiological property.
  • the database 300 can further include one or more incomplete baseline physiological profiles 210, where each incomplete baseline physiological profile 210 excludes one or more, but not all, of (i) an activity, (ii) an emotional state, and (iii) a threshold for a physiological property.
  • each baseline physiological profile 210 can also store environmental properties typically observed during the corresponding activity.
  • Each characterization can include an upper threshold THn(x,U) and a lower threshold THn(x,L) for the physiological property x, and can include one or more of raw data, an instantaneous value, a median value, an average value, a percentage change, a standard deviation, a slope, or any other suitable interpretation of the physiological property.
  • the database 300 can store upper and lower thresholds THn(1,U) and THn(1,L) of the heart rate expected when the first user 104 is walking, running, and so forth. Heart rate tends to be higher for running than for walking; therefore, the upper and lower thresholds for running may be higher than those for walking.
  • each baseline physiological profile 210 stores a threshold or thresholds for a single physiological property x.
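The matching implied by Table I, comparing a measured value of a physiological property against each profile's upper and lower thresholds to infer the expected activity and emotional state, can be sketched as follows; all names and threshold values are hypothetical:

```python
def match_profiles(value, profiles):
    """Return the (activity, emotional_state) pairs from the baseline
    database whose lower and upper thresholds bracket the measured value
    of the physiological property (e.g., heart rate in bpm)."""
    return [(p["activity"], p["emotional_state"])
            for p in profiles
            if p["lower"] <= value <= p["upper"]]

# Hypothetical baseline database for a single property x (heart rate):
baseline_db = [
    {"activity": "walking", "emotional_state": "relaxed",
     "lower": 90, "upper": 110},
    {"activity": "running", "emotional_state": "excited",
     "lower": 120, "upper": 170},
]

match_profiles(135, baseline_db)  # → [("running", "excited")]
```

A measured value bracketed by no profile's thresholds would yield no match, which is one way an incomplete profile (thresholds without an activity or emotional state) could arise.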
  • each row corresponds to one baseline profile 210
  • the third column corresponds to one or more expected activities
  • the fourth column corresponds to one or more expected emotional states.
  • the database 400 includes N baseline physiological profiles 210, where the number N is greater than one and can optionally increase over time as profiles are added to the database 400.
  • the database 400 includes one or more complete baseline physiological profiles 210, where each complete baseline physiological profile 210 includes a characterization of a single physiological property, at least one expected activity, and at least one expected emotional state.
  • a baseline physiological profile 210 can include more than one expected activity and/or more than one expected emotional state.
  • the database 400 can further include one or more incomplete baseline physiological profiles 210, where each incomplete baseline physiological profile 210 excludes one or more, but not all, of (i) a characterization of a physiological property, (ii) at least one expected activity, and (iii) at least one expected emotional state.
  • Each physiological property can be characterized in any of the manners discussed above in relation to Fig. 3.
  • each baseline physiological profile 210 can also store environmental properties characterized by one or more environmental sensors 108c.
  • Each time-indexed physiological profile 212 summarizes a physiological response for the first individual 104 during a particular time period. Similar to baseline physiological profiles 210, each time-indexed physiological profile 212 can be either complete or incomplete.
  • Each complete time-indexed physiological profile 212 includes (i) a time period, (ii) at least one characterization of a physiological property recorded during the time period, (iii) an expected activity during the time period, and (iv) an expected emotional state during the time period.
  • each complete time-indexed physiological profile can include one or more of an observed activity and an observed emotional state that are observed from information retrieved from sources other than the one or more sensor signals 220 (e.g., from network- accessible media information 222, locally-stored media information 218 and/or user input 226) as discussed in further detail below.
  • each time-indexed physiological profile 212 can also store environmental properties characterized by one or more environmental sensors 108c during the time period.
  • Each incomplete time-indexed physiological profile 212 can exclude one or more, but not all, of (i) a time period, (ii) at least one characterization of a physiological property recorded during the time period, (iii) an expected activity during the time period, and (iv) an expected emotional state during the time period.
  • Each characterization can include a maximum MAXt(x,U) and a minimum MINt(x,L) of the physiological property x observed for the activity during a time period t.
  • each characterization can include at least one of raw data, an instantaneous value, a median value, an average value, or any other suitable interpretation of the physiological property x during the time period t.
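The distinction between complete and incomplete profiles described above can be sketched as a simple record type. The field names and layout here are illustrative assumptions; the patent does not prescribe a data structure.

```python
# Hypothetical sketch of a baseline physiological profile 210 and its
# complete/incomplete distinction: complete means a characterization plus
# at least one expected activity and at least one expected emotional state.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class BaselineProfile:
    characterization: Optional[dict] = None   # e.g. {"min": 60, "max": 80}
    expected_activities: list = field(default_factory=list)
    expected_emotional_states: list = field(default_factory=list)

    def is_complete(self):
        return (self.characterization is not None
                and bool(self.expected_activities)
                and bool(self.expected_emotional_states))
```

An incomplete profile (one lacking, say, an expected emotional state) can later be supplemented from media information or user input, as described below.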
  • the processor 204 receives (i) sensor signals 220 from the one or more physiological sensors 108a-b and (ii) media information from one or more of the communications network 102, the memory 206, and the at least one user input device 216.
  • the processor 204 predicts one or both of an activity and emotional state of the first individual 104 at a time period based on a comparison of the one or more sensor signals 220 and one or more baseline physiological profiles 210 stored in the memory 206.
  • the processor 204 compares one or both of the expected activity and expected emotional state determined from the baseline physiological profiles 210 to an observed activity and observed emotional state, respectively, determined from one or more of the media information 218, 222, and 224. In yet another aspect, the processor 204 provides suggestions to the first individual 104 based on at least one of the expected activity, the expected emotional state, the observed activity, and the observed emotional state.
  • the processor 204 predicts (i) one of an activity and emotional state of the first individual 104 based on a comparison of the one or more sensor signals 220 and one or more baseline physiological profiles 210 stored in the memory 206 and (ii) the other of the activity and emotional state of the first individual 104 based on an observed activity and observed emotional state, respectively, determined from one or more of the media information 218, 222, and 224.
  • the processor 204 creates new baseline physiological profiles 210.
  • the processor 204 supplements incomplete baseline physiological profiles 210 based on the one or more sensor signals 220 and/or one or more of the media information 218, 222, and 224.
  • a computing device similar to computing device 200 can be used to implement any one of the computing devices 112 and the server 116 to analyze changes in activities and emotional states of the second individual 110 over time.
  • the foregoing description applies similarly to the computing device 112 or server 116, although references to the sensors 108a-c would be replaced with sensors 114a-c and references to the first individual 104 would be replaced with the second individual 110.
  • steps 602 to 610 are performed to predict a change in one or both of an activity and emotional state of the first individual 104 based on (i) one or more sensor signals 220 and (ii) one or more baseline physiological profiles 210. If the processor 204 predicts both the activity and emotional state from the one or more baseline profiles 210, then in steps 612 to 616, the processor 204 can further analyze the activity and emotional state to recommend a course of action to the first individual 104.
  • the processor 204 can attempt in steps 620-638 to predict one or both of the activity and emotional state using other methods. If one or both of the activity and emotional state are predicted using other methods, then the processor 204 can update the baseline physiological profiles 210. The updated baseline physiological profiles 210 can then be used during another iteration of the method 600, and at a subsequent time period when the first individual 104 is engaged in the same activity, to predict the activity and emotional state.
  • In step 602, the receiver 202 of the computing device 200 receives one or more sensor signals 220 from the one or more physiological sensors 108a-b.
  • each sensor signal 220 characterizes a physiological property of the first individual 104 during a time period.
  • the first characterization can be raw sensor data or may be interpreted sensor data such as (without limitation) maximum and minimum values, an average value, a median value, or even an instantaneous value as described above.
  • the processor 204 monitors the one or more sensor signals 220 to detect a change in the one or more sensor signals 220 that is indicative of a change in one or both of the activity in which the first individual 104 is engaged and the emotional state of the first individual 104. If it is determined in step 606 that the processor 204 did not detect such a change, then the computing device 200 can repeat steps 602 to 606 until a change is detected.
  • In step 702, the processor 204 can interpret the one or more sensor signals 220 received from the one or more physiological sensors 108a-b, assuming that the one or more sensor signals 220 have not already been interpreted. For example, the processor 204 can determine maximum and minimum values, an average value, a median value, or even an instantaneous value from raw sensor data in the event that one or more sensor signals 220 contain raw sensor data.
  • In step 704, the processor 204 retrieves a physiological profile from the memory 206, and in step 706, the processor 204 compares the retrieved physiological profile to the one or more sensor signals 220.
  • The physiological profile can be, for example, a baseline profile 210 for the first individual 104 in a relaxed state. Accordingly, the processor 204 can compare the received one or more sensor signals 220 to the baseline profile 210 to determine when the first individual 104 deviates from a relaxed state.
  • the physiological profile can also be a time-indexed profile 212 for the most recently detected activity and/or emotional state. Accordingly, the processor 204 can compare the received one or more sensor signals 220 to the time-indexed profile 212 to determine when the first individual 104 deviates from the most recently detected activity and/or emotional state.
  • If the processor 204 determines in step 708 that the sensor signal 220 matches the retrieved physiological profile, then the processor 204 concludes in step 712 that the first individual's activity and/or emotional state has not changed. If, on the other hand, the processor 204 determines that the sensor signal 220 does not match the retrieved physiological profile, then the processor 204 concludes in step 710 that the activity and/or emotional state has changed.
  • the term "match” does not require an exact match but can be a substantial match, i.e., a match of greater than 50%.
  • When a change in activity and/or emotional state is detected in step 606, the processor 204 proceeds to step 608 to attempt to determine an expected activity and an expected emotional state for the time period when the change occurred.
  • the processor 204 performs a comparison based on the one or more sensor signals 220 and one or more baseline physiological profiles 210 stored in the memory 206. For each baseline physiological profile 210 considered, the processor 204 performs a comparison based on a characterization of at least one physiological property stored in the baseline physiological profile 210 and at least one of the sensor signals 220.
  • the comparison may be performed by comparing the at least one sensor signal 220 directly to the characterization of the at least one physiological property or by comparing an interpretation of the at least one sensor signal 220 such as (without limitation) an instantaneous value, a median value, an average value, or any other suitable interpretation, to the characterization of the at least one physiological property.
  • one or more of an activity and an emotional state can be detected by considering a single physiological property. In other cases, one or more of an activity and an emotional state can be detected by considering multiple physiological properties. For example, if the one or more sensors 108 includes a heart rate sensor and an accelerometer, then physical activity such as running can be distinguished from stress. Thus, if heart rate increases but the accelerometer indicates that the first individual 104 is not moving, then the processor 204 can conclude that the heart rate change is due to stress not exercise. If, on the other hand, heart rate increases and the accelerometer indicates that the first individual 104 is moving, then the processor 204 can conclude that the heart rate change is due to exercise not stress.
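The heart-rate/accelerometer disambiguation described above reduces to a simple rule, sketched below. The function and label names are illustrative assumptions.

```python
# Hypothetical sketch of the two-sensor rule above: an elevated heart rate
# is attributed to exercise only when the accelerometer also indicates
# movement; otherwise it is attributed to stress.
def classify_heart_rate_rise(heart_rate_elevated, is_moving):
    if not heart_rate_elevated:
        return "no change"
    return "exercise" if is_moving else "stress"
```

Considering the two properties jointly thus distinguishes cases that a heart-rate sensor alone could not.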
  • a simplified flow diagram is shown of a method 800 according to one embodiment that may be used to implement step 608.
  • the processor 204 may perform the method 800 using baseline physiological profiles 210 such as those shown in Fig. 3.
  • the processor 204 selects and retrieves a baseline physiological profile 210 from the memory 206.
  • the processor 204 performs a comparison based on a characterization of at least one physiological property stored in the selected baseline physiological profile 210 and at least one of the sensor signals 220.
  • the comparison may be performed by comparing the at least one sensor signal 220 directly to the characterization of the at least one physiological property or by comparing an interpretation of the at least one sensor signal 220 such as (without limitation) an instantaneous value, a median value, an average value, or any other suitable interpretation, to the characterization of the at least one physiological property.
  • If the processor 204 determines in step 806 that the at least one sensor signal 220 does not match the characterization of the at least one physiological property stored in the selected baseline profile 210, then the processor 204 repeats steps 802 to 806 until either the processor 204 finds a match in step 806 or the processor 204 has considered all of the baseline physiological profiles 210 in the memory 206 in step 812. If all baseline profiles 210 are considered without finding a match, then the processor 204 concludes in step 814 that it did not predict the activity or emotional state. In this case, there are no baseline physiological profiles 210 stored in the memory 206 for the activity and emotional state. If, and when, the processor 204 determines that the at least one sensor signal 220 matches the characterization of the at least one physiological property stored in the selected baseline physiological profile 210, the processor 204 proceeds to step 808.
  • In step 808, the processor 204 determines if one or both of an expected activity and an expected emotional state are stored in the selected baseline physiological profile 210. If the processor 204 determines that both of an expected activity and an expected emotional state are stored in the selected baseline physiological profile 210, then, in step 810, the processor 204 predicts that the activity engaged in by the first individual 104 is the stored expected activity and the emotional state of the first individual 104 is the stored expected emotional state. In this case, the processor 204 has arrived at a complete baseline physiological profile 210.
  • If, in step 808, the processor 204 determines that only one of an expected activity and an expected emotional state is stored in the selected baseline physiological profile 210, then, in step 810, the processor 204 concludes that the activity engaged in by the first individual 104 is the stored expected activity or the emotional state of the first individual 104 is the stored expected emotional state. In this case, the processor 204 has arrived at an incomplete baseline physiological profile 210.
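The profile-scanning loop of method 800 can be sketched as follows. The tuple layout, function name, and example profile values are illustrative assumptions; `None` fields stand in for incomplete profiles.

```python
# Hypothetical sketch of method 800: scan stored baseline profiles until
# one matches the sensor reading, then report whatever expectations that
# profile stores (either field may be None for an incomplete profile).
def predict_from_baselines(reading, profiles):
    """profiles: list of (lower, upper, expected_activity, expected_state)."""
    for lower, upper, activity, state in profiles:
        if lower <= reading <= upper:          # step 806: match found
            return activity, state             # step 810: predict
    return None, None                          # step 814: no prediction

profiles = [
    (55, 75, "resting", "relaxed"),
    (140, 180, "running", None),   # incomplete: no expected emotional state
]
```

For example, a reading of 150 against these assumed profiles would yield the expected activity "running" but no expected emotional state, i.e. an incomplete prediction.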
  • a simplified flow diagram is shown of a method 900 according to another embodiment that may be used to implement step 608.
  • the processor 204 may perform the method 900 based on baseline physiological profiles 210 such as those shown in Fig. 4. In general, for each physiological property considered, the method 900 performs a comparison based on the baseline physiological profiles 210 for the physiological property and one or more of the sensor signals 220 for the physiological property. In so doing, the processor 204 may arrive at multiple possible expected activities and multiple possible expected emotional states for each physiological property. The processor 204 may then predict one or both of the activity and emotional state by finding a possible expected activity and/or possible expected emotional state that is common to the physiological properties considered.
  • In step 902, the processor 204 selects a physiological property to analyze that corresponds to one or more of the physiological sensors 108a-b.
  • In step 904, the processor 204 retrieves a baseline physiological profile 210 from the memory 206 for the selected physiological property.
  • In step 906, the processor 204 performs a comparison based on a characterization of the selected physiological property stored in the selected baseline physiological profile 210 and at least one of the sensor signals 220.
  • the comparison may be performed by comparing the at least one sensor signal 220 directly to the characterization of the at least one physiological property or by comparing an interpretation of the at least one sensor signal 220 such as (without limitation) an instantaneous value, a median value, an average value, or any other suitable interpretation, to the characterization of the at least one physiological property.
  • If the processor 204 determines in step 908 that the at least one sensor signal 220 does not match the selected physiological property stored in the selected baseline profile 210, then the processor 204 repeats steps 904 to 908 until either the processor 204 finds a match in step 908 or the processor 204 determines in step 914 that all of the baseline physiological profiles 210 stored in memory 206 for the selected physiological property have been considered. If all profiles for the selected physiological property are considered without finding a match, then the processor 204 concludes in step 916 that it did not predict the activity or emotional state for the physiological property.
  • In step 910, the processor 204 determines if one or more expected activities and/or one or more expected emotional states are stored in the selected baseline physiological profile 210. If one or more expected activities and/or one or more expected emotional states are stored in the selected baseline profile 210, then the processor 204 concludes in step 912 that the one or more expected activities and/or one or more expected emotional states are possible matches. If, on the other hand, it is determined in step 910 that neither an expected activity nor an expected emotional state is stored, then the processor 204 concludes in step 916 that it did not predict the activity or emotional state for the physiological property.
  • a next physiological property can be selected in step 920, although in some embodiments, the processor 204 might consider only one physiological property. In the event that the processor considers more than one physiological property, steps 902-918 are repeated for the next physiological property.
  • the processor 204 analyzes in step 922 the possible matches determined in step 912 for each physiological property to determine if a single expected activity and/or a single expected emotional state can be found that is common to all of the physiological properties. If the processor 204 determines that there is a common expected activity, then the processor 204 predicts in step 924 that the activity engaged in by the first individual 104 is the common expected activity. Similarly, if the processor 204 determines that there is a common expected emotional state, then the processor 204 predicts in step 924 that the emotional state of the first individual 104 is the common expected emotional state. If there is not a common expected activity or expected emotional state, then the processor 204 concludes in step 926 that it has not predicted the activity or emotional state of the first individual 104.
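The per-property intersection of method 900 can be sketched with sets. This is an illustrative sketch; the function name and the set-based representation are assumptions.

```python
# Hypothetical sketch of steps 912 and 922-926: each physiological property
# contributes a set of possible expected activities (or emotional states);
# a prediction is made only when exactly one candidate is common to every
# property considered.
def common_prediction(candidates_per_property):
    """candidates_per_property: list of sets, one per physiological property."""
    if not candidates_per_property:
        return None
    common = set.intersection(*candidates_per_property)
    return common.pop() if len(common) == 1 else None
```

For example, if heart rate suggests {"running", "walking"} and the accelerometer suggests {"running"}, the single common candidate "running" becomes the predicted activity; if the intersection is empty or ambiguous, no prediction is made.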
  • If it is determined in step 610 that the processor 204 successfully predicted both an expected activity and an expected emotional state, then the processor 204 can present feedback in the form of a notification 224 to the first individual 104 in step 614. In so doing, the processor 204 can simply disclose the expected activity and expected emotional state to the first individual 104. Alternatively, the processor 204 can further analyze the predicted activity and emotional state in step 612 and provide more detailed feedback to the first individual 104 in step 614. For example, in step 612, the processor 204 can attempt to determine if one or more of the activity and emotional state expected in step 608 matches one or more of an observed activity and observed emotional state observed by another method.
  • One such method includes, in step 612, detecting at least one of an observed activity and an observed emotional state through unsolicited media information, and comparing the at least one of the observed activity and observed emotional state to the at least one expected activity and expected emotional state predicted in step 608.
  • a flow diagram is shown of a method 1000 according to one embodiment that can be used to implement step 612 in Fig. 6.
  • the processor 204 retrieves one or more of unsolicited network- accessible media information 222 from the communications network 102 and unsolicited locally- stored media information 218 from the memory 206.
  • To retrieve unsolicited network-accessible media information 222, such as information stored on a password-protected website, the processor 204 can be configured to log onto the website.
  • the processor 204 analyzes the unsolicited information using, for example, semantic analysis of text or audio (including natural language processing and machine learning), image analysis for pictures and video, and geolocation.
  • the processor 204 analyzes the unsolicited media information to detect one or both of an activity and emotional state, and a time period for the activity and/or emotional state.
  • the processor 204 can further analyze the unsolicited information to detect who generated the information and whether or not the information is relevant to the first individual 104, in the event that the processor 204 considers unsolicited information generated by a third-party (e.g., second individual 110).
  • Retrieval and analysis of the unsolicited media information can occur upon detecting a change in activity and/or emotional state based on the one or more sensor signals 220.
  • When analysis of the unsolicited media information reveals a future activity (e.g., a calendar appointment or a posting about an upcoming event), the information can be stored until a later time period when one or more sensor signals 220 or additional media are received for the time period.
  • natural language processing and machine learning can identify keywords, triggers, or word strings from text or audio of the unsolicited media.
  • the keywords, triggers, or word strings can be identified from one or more of (i) a subject, action, or object of the string to show the relationship to the user, (ii) adjectives, verbs, adverbs, and nouns, (iii) punctuation of the string, including emoticons, and (iv) connected strings tied to an event.
  • Each piece of data from the unsolicited media can be used to provide ranking, weighting, tone, and positive or negative sentiment to the analyzed information.
  • weighting can be adjusted for specific words of a sentence; for example, "worst" would have a higher weighting or impact on an activity and/or emotional state than "bad".
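The keyword-weighting idea above can be sketched as a weighted lexicon score. The word list, weights, and function name are illustrative assumptions, not part of the disclosure; a real implementation would draw on a full natural-language-processing pipeline.

```python
# Hypothetical sketch of keyword weighting: stronger words contribute
# larger positive or negative weights to a string's sentiment score,
# e.g. "worst" outweighs "bad" as described above.
WORD_WEIGHTS = {
    "great": 2.0, "good": 1.0,
    "bad": -1.0, "worst": -2.0,
}

def sentiment_score(text):
    return sum(WORD_WEIGHTS.get(w.strip(".,!?"), 0.0)
               for w in text.lower().split())
```

Under these assumed weights, a posting containing "worst" scores more negatively than one containing "bad", so it has a larger impact on the inferred emotional state.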
  • If it is determined in step 1006 that the processor 204 does not detect an activity or emotional state from the unsolicited media information, or if it is determined in step 1008 that the time period detected for the unsolicited media information does not match the time period for the one or more sensor signals 220, then the processor 204 concludes in step 1010 that it did not detect an observed activity or observed emotional state for the time period. If, on the other hand, the processor 204 determines that the time period matches, and the processor detects one or more of an observed emotional state and observed activity, then the processor 204 proceeds to step 1012.
  • the processor 204 determines if the one or more of the observed emotional state and observed activity concerns the first individual 104. In other words, the processor 204 determines if the one or more of the observed emotional state and observed activity characterizes one or more of an emotional state and activity of the first individual 104 as opposed to someone else. For example, the processor 204 can determine if a posting by the first individual 104 or a third party (in the event that third-party information is considered) describes an activity and/or emotional state of the first individual 104 rather than a third party. If it is determined in step 1012 that the unsolicited media information concerns the first individual 104, then the processor 204 concludes in step 1014 that it has determined an observed activity and/or observed emotional state. If, on the other hand, it is determined in step 1012 that the unsolicited media information does not concern the first individual 104, then the processor 204 concludes that it has not determined an observed activity and/or emotional state.
  • the processor 204 can detect at least one of a similarity and a difference between (i) the at least one of the expected emotional state and the expected activity and (ii) the at least one of the observed emotional state and the observed activity. The processor 204 can then generate a notification to present to the first individual in step 616 based on the at least one of the similarity and difference.
  • the processor 204 can simply notify the first individual 104 that the observed activity and/or emotional state matches the expected activity and/or emotional state.
  • the processor 204 can generate more specific feedback.
  • the processor 204 can provide coaching in the form of incremental steps to be performed by the first individual 104 over time to accomplish a task or reach a goal. For instance, when the first individual 104 is happy, the processor 204 can provide positive feedback to reinforce the happy or excited emotions. If, on the other hand, the user is angry or agitated, the processor 204 can provide feedback on how to modify the behavior (e.g., provide breathing techniques) or even notify the user that they are exhibiting characteristics of the emotional state.
  • the processor 204 may provide feedback in the form of recommendations to continue a behavior or discontinue a behavior altogether.
  • the processor 204 may provide an alert or warning in the event that a positive or negative threshold is reached.
  • the processor 204 can notify the first individual 104 that a target heart rate has been exceeded.
  • the processor 204 may provide feedback in the form of goals or targets achieved. For instance, the processor 204 may provide feedback when the first individual 104 reaches a goal of running four times in a week.
  • the processor 204 can attempt to determine one or more reasons for the difference. For example, the processor 204 can identify differences between an activity and an expected activity. For instance, suppose that the first individual 104 had a bad run. The processor 204 can determine by analyzing the baseline profiles 210 and time-indexed profiles 212 stored in the memory 206 that the run was longer than a typical run, that the first individual 104 had a meal closer to the run than typical, or that the first individual 104 had a poor night's sleep the night before. These possibilities can then be presented to the first individual 104 in step 616. As another example, the processor 204 can request feedback from the first individual 104 as to the reasons for the difference between the activity and the expected activity. As yet another example, the processor 204 can wait until further information is available (e.g., via one of media information 218 or 222) to determine the reason for the difference between the activity and the expected activity.
  • a new time-indexed profile 212 can be stored in the memory 206.
  • the new time-indexed profile 212 can include the one or more sensor signals 220, the time period for the one or more sensor signals 220, the expected activity, the expected emotional state, and optionally one or more of an observed activity and observed emotional state if determined in step 612.
  • This information can be used to develop a trend over time as to the first individual's feelings toward the activity.
  • This trend can then be used at a later time to update a baseline physiological profile 210 in the memory 206 to reflect the first individual's feelings toward the activity.
  • the trend can be used to reinforce the baseline profile 210 when the observed activity and emotional state repeatedly match the expected activity and emotional state.
  • the trend can be used to change the baseline profile 210 to reflect a new emotional state for an activity when the observed emotional state repeatedly differs from the expected emotional state.
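The trend rule described in the last two bullets can be sketched as follows. The function name, the three-repeat criterion, and the state labels are illustrative assumptions; the disclosure only requires that repeated matches reinforce, and repeated mismatches change, the baseline.

```python
# Hypothetical sketch of the trend-based baseline update: after enough
# consecutive observations of the same emotional state that differs from
# the expected one, the baseline's expected state is changed to it;
# otherwise the existing expectation is kept (reinforced).
def update_expected_state(expected, observations, required_repeats=3):
    recent = observations[-required_repeats:]
    if (len(recent) == required_repeats
            and all(o == recent[0] != expected for o in recent)):
        return recent[0]          # change the baseline profile
    return expected               # reinforce / keep the baseline profile
```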
  • the notification 224 generated by the processor 204 can be presented to the first individual 104.
  • the notification 224 can be presented to the first individual 104 through notification device 214 of the same device 200 that analyzes the activity and emotional state.
  • the notification 224 can be presented to the first individual 104 using a separate device. For example, suppose that the device 200 implements the server 116 of Fig. 1. In this case, the server 116 can provide the notification 224 to one or more of the computing devices 106a to 106d, and the one or more computing devices 106a to 106d can present the notification to the first individual 104.
  • the notification 224 can simply inform the first individual 104 of one or both of the activity and emotional state detected in step 608. Alternatively, or in addition, the notification 224 can provide more detailed feedback to the first individual 104 as described above. For example, the notification 224 can inform the first individual 104 that a detected interaction with a specific individual (e.g., the second individual 110) caused the first individual 104 to be stressed. As another example, the processor 204 can determine, by reviewing the database of time-indexed physiological profiles, that the first individual 104 was stressed during several past interactions with the specific individual.
  • the processor 204 can inform the first individual 104 of the past interactions, and can recommend that the first individual 104 alter his or her interactions with the specific individual or avoid interactions with the individual altogether. Additionally, feedback can be provided to other individuals (e.g., second individual 110) about the state of the first individual 104 and whether to approach the first individual 104 based upon the emotional state of the first individual. Feedback can also be provided to other individuals regarding how to change the first individual's emotional state based upon past activities that have successfully changed the first individual's emotional state.
  • In step 620, the time period, the one or more received sensor signals 220 for the change, and any activity or emotional state detected in step 608 are stored in the memory 206. These items can be stored by, for example, creating a new time-indexed physiological profile 212.
  • In step 622, the processor 204 attempts to determine one or more of an observed activity and an observed emotional state through unsolicited media information.
  • Step 622 can be implemented in a manner similar to that described above in relation to Fig. 10.
  • the time period stored in step 620 can be compared in step 1008 of Fig. 10 to the time period determined from the unsolicited media information in step 1004 of Fig. 10. If it is determined in step 624 that the processor 204 detected one or both of an observed activity and an observed emotional state, then the observed activity and/or observed emotional state detected can be stored in the memory 206 in step 626.
  • the baseline physiological profile 210 can be updated to store the observed activity as the expected activity.
  • the baseline physiological profile 210 can be updated to store the observed emotional state as the expected emotional state.
  • a new time-indexed physiological profile 212 can be created if not already created in step 620 to store the observed activity and/or observed emotional state.
  • the new time-indexed physiological profile 212 can be updated to store the observed activity and/or observed emotional state.
  • Otherwise, following step 624, the processor 204 can attempt other methods to detect one or both of an observed activity and an observed emotional state.
  • the processor 204 can generate a notification 224 to solicit input from the first individual 104.
  • the notification 224 can be presented to the first individual 104 through at least one notification device 214 of the same device 200 that analyzes the activity and emotional state.
  • the notification 224 can be presented to the first individual 104 using a notification device that is separate from the device 200.
  • the server 116 can provide the notification 224 to one or more of the computing devices 106a to 106d, and the one or more computing devices 106a to 106d can present the notification 224 to the first individual 104.
  • the first individual 104 can respond to the notification 224 by inputting one or both of an activity and an emotional state into at least one input device.
  • Each input device can be implemented in the same computing device 200 that analyzes the activity and emotional state or can be implemented separate from the computing device 200 that analyzes the activity and emotional state.
  • If the device 200 implements the server 116 of Fig. 1, then the first individual 104 can provide input to one or more of the computing devices 106a to 106d, and the one or more computing devices 106a to 106d can in turn provide the input to the server 116.
  • If it is determined in step 632 that another method has detected one or both of an observed activity and an observed emotional state, then the observed activity and/or observed emotional state can be stored in step 634 as discussed above in relation to step 626. If, on the other hand, it is determined in step 632 that the processor 204 has not detected one or both of an observed activity and an observed emotional state by another method, then a baseline physiological profile 210 can be stored in memory 206 in step 636 with the sensor signal 220 but without one or more of the expected activity and expected emotional state.
  • the processor 204 can optionally provide feedback in the form of a notification 224 to the first individual 104. In so doing, the processor 204 can simply disclose the expected activity and emotional state, if determined, to the first individual 104. Notification can be made using at least one notification device as discussed above in relation to step 616. Feedback can also be requested from other individuals (e.g., the second individual 110) to confirm the first individual's emotional state and activities. For example, another individual (e.g., the second individual 110) can be prompted to input whether or not the first individual 104 is also participating in an activity.
  • In step 618, the processor 204 can continue monitoring the one or more sensor signals 220, in which case processing returns to step 602, and method 600 is repeated.
  • the processor 204 can stop monitoring the one or more sensor signals 220 (e.g., if the device 200 is turned off). If the processor 204 supplements or completes a baseline physiological profile 210 during an iteration of method 600, then the supplemented or completed baseline physiological profile 210 can be used in a subsequent iteration of the method 600.
  • Fig. 6 shows a method wherein the processor 204 detects changes in one or more of an activity and emotional state by analyzing one or more sensor signals 220.
  • changes in one or more of an activity and emotional state can be detected using other methods. For example, a change in one or more of an activity and an emotional state can be detected based on unsolicited media information. The change in activity and/or emotional state can then be correlated with one or more sensor signals 220 received before, concurrently with, or after the generation of the unsolicited media information. The correlation can be performed by detecting a time period for the activity and/or emotional state and matching the time period to a time period of the sensor signals 220. As an example, consider Fig. 11.
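The correlation step just described (matching the time period of an activity or emotional state to the time period of the sensor signals 220) can be sketched as an interval-overlap test. This is a minimal illustration, not the claimed implementation; the `TimePeriod` type, the half-open-interval convention, and the 50% overlap threshold are assumptions introduced here.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class TimePeriod:
    """A half-open interval [start, end) in seconds; an assumed convention."""
    start: float
    end: float

    def overlap(self, other: "TimePeriod") -> float:
        """Length of the overlap between two periods (0.0 if disjoint)."""
        return max(0.0, min(self.end, other.end) - max(self.start, other.start))


def matches(media_period: TimePeriod, sensor_period: TimePeriod,
            min_fraction: float = 0.5) -> bool:
    """Treat the periods as matching when their overlap covers at least
    min_fraction of the shorter period (an illustrative threshold)."""
    shorter = min(media_period.end - media_period.start,
                  sensor_period.end - sensor_period.start)
    if shorter <= 0.0:
        return False
    return media_period.overlap(sensor_period) / shorter >= min_fraction


# A post about a 7:00-7:45 run matched against a sensor window of 7:10-8:00.
run_post = TimePeriod(7 * 3600, 7 * 3600 + 45 * 60)
sensor_window = TimePeriod(7 * 3600 + 10 * 60, 8 * 3600)
```

Whether the overlap should be measured against the shorter period, the media period, or a fixed window is a design choice the disclosure leaves open.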
  • Fig. 11 shows a simplified flow diagram of another method 1100 that can be implemented by the device 200 of Fig. 2 to analyze changes in the activity and emotional state of the first individual 104 over time.
  • the processor 204 retrieves one or more of unsolicited network-accessible media information 222 and unsolicited locally- stored media information 218.
  • For unsolicited network-accessible media information 222, such as information stored on a password-protected website, the processor 204 can be configured to log onto the website if needed.
  • In step 1104, the processor 204 analyzes the unsolicited information in a manner similar to that discussed above in relation to step 1004 of Fig. 10 (e.g., semantic analysis of text or audio, image analysis for pictures and video, and geolocation). If it is determined in step 1106 that the processor 204 did not detect an activity or emotional state from the unsolicited media information, or if it is determined in step 1108 that the processor 204 did not detect a time period for the unsolicited media information, then the processor 204 concludes in step 1112 that it did not detect an observed activity or observed emotional state. In this case, the processor 204 can continue or discontinue monitoring the unsolicited media information in step 1128. If the processor 204 continues monitoring, then processing returns to step 1102. If, on the other hand, the processor 204 detects one or more of an observed emotional state and observed activity, and the processor 204 determines in step 1108 that the time period matches, then the processor 204 proceeds to step 1110.
  • the processor 204 determines if the unsolicited media information concerns the first individual 104. For example, the processor 204 can determine if a posting by the first individual 104 or a third party (in the event that third-party information is considered) describes an activity and/or emotional state of the first individual 104 or of a third party. If the observed activity and/or observed emotional state does not concern the first individual 104, then the processor 204 concludes in step 1112 that it has not determined an observed activity and/or emotional state. If, on the other hand, the unsolicited media information concerns the first individual 104, then, in step 1116, the processor 204 attempts to find a time-indexed profile 212 in the memory 206. In particular, the processor 204 searches for a time-indexed profile 212 that stores a time period that matches the time period determined by the processor 204 in step 1104.
  • If it is determined in step 1120 that the processor 204 did not identify a matching time-indexed profile 212 in the memory 206, then the processor 204 can create a new time-indexed profile 212 in step 1118 and store the newly-created profile 212 in the memory 206.
  • the newly-created profile 212 is incomplete and does not include one or more sensor signals 220, an expected activity, or an expected emotional state.
  • the newly-created profile 212 can be completed at a subsequent time period by performing, for example, the method 600 in Fig. 6.
  • If it is determined in step 1120 that the processor 204 did identify a matching time-indexed profile 212 in the memory 206, then the processor 204 can update the time-indexed profile 212 to store the observed activity and/or observed emotional state in step 1122. The time-indexed profile 212 can then be used in step 1124 to create a new baseline profile 210 if one does not exist for the activity and/or emotional state.
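One way to realize the find-or-create-and-update flow of steps 1116 through 1124 is sketched below. The field names and the exact-match dictionary lookup are assumptions for illustration; the disclosure does not fix a schema for the time-indexed profile 212.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class TimeIndexedProfile:
    """An illustrative time-indexed profile 212; field names are assumed."""
    period: tuple                          # (start, end) time period covered
    sensor_signal: Optional[list] = None   # filled in later, e.g. by method 600
    observed_activity: Optional[str] = None
    observed_emotion: Optional[str] = None


def find_or_create(profiles: dict, period: tuple) -> TimeIndexedProfile:
    """Steps 1116/1118: find a profile whose stored time period matches;
    create and store a new, incomplete profile if none is found."""
    if period not in profiles:
        profiles[period] = TimeIndexedProfile(period=period)
    return profiles[period]


def record_observation(profiles: dict, period: tuple,
                       activity: Optional[str],
                       emotion: Optional[str]) -> TimeIndexedProfile:
    """Step 1122: update the matching profile with the observed activity
    and/or observed emotional state."""
    profile = find_or_create(profiles, period)
    if activity is not None:
        profile.observed_activity = activity
    if emotion is not None:
        profile.observed_emotion = emotion
    return profile
```

A newly created profile here is incomplete, mirroring the text: it can be completed later when the sensor signal and expected values become available.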
  • the processor 204 can present feedback in the form of a notification 224 to the first individual 104.
  • the notification can be presented by at least one notification device as described above in relation to step 616 of Fig. 6.
  • the notification 224 can simply inform the first individual 104 of any of an expected activity, an expected emotional state, an observed activity, and an observed emotional state stored in the newly created or updated time-indexed profile 212. Alternatively, or in addition, the notification 224 can provide more detailed feedback to the first individual 104.
  • if the observed activity and/or observed emotional state differs from the expected activity and/or expected emotional state, the notification 224 can inform the first individual 104 of the discrepancy.
  • the notification 224 can also suggest reasons why the observed activity and/or emotional state differed from respective expected activity and emotional state. For example, suppose that the first individual 104 went for a run, and that typically when the first individual goes for a run, the first individual 104 feels good, and consequently, the expected emotional state is happy. Now suppose that the first individual 104 posts on social media that he or she "feels terrible after my run.” In this case, the observed emotional state for the run is unhappy.
  • the notification 224 can present differences between the run and a typical run. For example, the notification 224 can inform the first individual 104 that the run was longer than on a typical run, the temperature was colder than on a typical run, the first individual 104 had low energy before the run, the first individual 104 did not sleep well before the run, the first individual 104 ate right before the run, and so on.
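The kind of feedback described in this example (comparing a run against a typical run) can be sketched as a metric-by-metric comparison. The metric names, the 10% reporting threshold, and the message wording are assumptions, not part of the disclosure.

```python
def run_feedback(current: dict, typical: dict, tolerance: float = 0.10) -> list:
    """Compare this run's metrics with typical values and report
    human-readable differences exceeding the fractional tolerance."""
    messages = []
    for metric, typical_value in typical.items():
        value = current.get(metric)
        if value is None or typical_value == 0:
            continue  # nothing to compare against
        change = (value - typical_value) / abs(typical_value)
        if abs(change) >= tolerance:
            direction = "higher" if change > 0 else "lower"
            messages.append(
                f"{metric} was {abs(change):.0%} {direction} than on a typical run")
    return messages


# Hypothetical metrics for the unhappy run described above.
typical_run = {"distance_km": 5.0, "temperature_c": 15.0, "sleep_hours": 7.5}
todays_run = {"distance_km": 7.0, "temperature_c": 4.0, "sleep_hours": 5.0}
```

In practice the typical values would come from the baseline physiological profile 210 and the current values from the time-indexed profile 212 for the run.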
  • the invention can be embodied in the form of methods and apparatuses for practicing those methods.
  • the invention can also be embodied in the form of program code embodied in tangible media, such as magnetic recording media, optical recording media, solid state memory, floppy diskettes, CD-ROMs, hard drives, or any other non-transitory machine-readable storage medium, wherein, when the program code is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the invention.
  • the invention can also be embodied in the form of program code, for example, stored in a non-transitory machine-readable storage medium including being loaded into and/or executed by a machine, wherein, when the program code is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the invention.
  • When implemented on a general-purpose processor, the program code segments combine with the processor to provide a unique device that operates analogously to specific logic circuits.


Abstract

According to one aspect of the disclosure, a processor receives a sensor signal from a physiological sensor that characterizes a physiological property of an individual during an activity. The processor retrieves an incomplete physiological profile from memory that includes a characterization of the physiological property of the individual and one of an emotional state and an activity. The processor identifies a match between the sensor signal and the physiological profile by performing a comparison based on the sensor signal and the characterization of the physiological property. Further, the processor analyzes unsolicited media information about the individual retrieved from memory to determine a different one of the emotional state and the activity. The processor then stores in memory a complete physiological profile comprising both the emotional state and the activity.

Description

ANALYZING EMOTIONAL STATE AND ACTIVITY BASED ON UNSOLICITED MEDIA
INFORMATION
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit of U.S. Patent Application Serial No.
62/098,440, filed December 31, 2014, the disclosure of which is hereby incorporated by reference as if set forth in its entirety herein.
TECHNICAL FIELD
[0002] The present invention relates to health monitoring, and more specifically, but not exclusively, to analyzing emotional state and activity of an individual.
BACKGROUND
[0003] Presently, there are a number of different activity trackers on the market that monitor and track fitness-related activities of a user. An activity tracker is usually a device that includes or is connected to one or more sensors that monitor various aspects of an activity. The one or more sensors can monitor aspects such as distance walked or run, calorie consumption, and in some cases heartbeat and quality of sleep. In some cases, these devices are synced either wired or wirelessly to a computer or smartphone for long-term data tracking. Further, in some cases, these devices are connected wired or wirelessly to a network such that various aspects of the activity can be shared with other individuals on social media websites.
BRIEF DESCRIPTION OF THE DRAWINGS
[0004] The foregoing summary, as well as the following detailed description of embodiments of the application, will be better understood when read in conjunction with the appended drawings. For the purposes of illustrating the methods and apparatuses of the present application, there is shown in the drawings representative embodiments. It should be
understood, however, that the application is not limited to the precise methods and apparatuses shown. In the drawings:
[0005] Fig. 1 is a simplified schematic diagram of a system according to one embodiment that is configured to monitor activity and emotional state of an individual;
[0006] Fig. 2 is a simplified schematic diagram of a computing device according to one embodiment that may be used to implement any of the computing devices in the system of Fig. 1;
[0007] Fig. 3 shows Table I, which illustrates an exemplary database of physiological baseline profiles according to one embodiment that may be stored in the memory of the device of Fig. 2;
[0008] Fig. 4 shows Table II, which illustrates an exemplary database of physiological baseline profiles according to another embodiment that may be stored in the memory of the device of Fig. 2;
[0009] Fig. 5 shows Table III, which illustrates an exemplary database of time-indexed physiological profiles according to one embodiment that may be stored in the memory of the device of Fig. 2;
[0010] Fig. 6 is a simplified flow diagram of a method according to one embodiment that may be implemented by the device of Fig. 2 to analyze an activity engaged in by an individual and an emotional state of the individual during the activity;
[0011] Fig. 7 is a simplified flow diagram of a method according to one embodiment that may be implemented by the device of Fig. 2 to detect a change in a sensor signal;
[0012] Fig. 8 is a simplified flow diagram of a method according to one embodiment that may be implemented by the device of Fig. 2 to predict one or both of an activity engaged in by an individual and an emotional state of the individual during the activity;
[0013] Fig. 9 is a simplified flow diagram of a method according to another
embodiment that may be implemented by the device of Fig. 2 to predict one or both of an activity engaged in by an individual and an emotional state of the individual during the activity;
[0014] Fig. 10 is a simplified flow diagram of a method according to one embodiment that may be implemented by the device of Fig. 2 to observe one or both of an activity and an emotional state of an individual using unsolicited media information; and
[0015] Fig. 11 is a simplified flow diagram of a method according to another embodiment that may be implemented by the device of Fig. 2 to analyze an activity engaged in by an individual and an emotional state of the individual during the activity.
DETAILED DESCRIPTION
[0016] The following detailed description should be read with reference to the drawings, in which similar or identical elements in different drawings are identically numbered. The drawings, which are not necessarily to scale, depict selected embodiments and are not intended to limit the scope of the invention. The detailed description illustrates by way of example, not by way of limitation, the principles of the invention. This description will clearly enable one skilled in the art to make and use the invention, and describes several embodiments, adaptations, variations, alternatives and uses of the invention, including what is presently believed to be the best mode of carrying out the invention. Reference herein to "one
embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the disclosure. The appearances of the phrase "in one embodiment" in various places in the specification do not necessarily all refer to the same embodiment, nor are separate or alternative embodiments necessarily mutually exclusive of other embodiments. The same applies to the term "implementation."
[0017] Various aspects of the disclosure relate to systems, devices, and methods for analyzing not just activities engaged in by an individual, but also the emotional state of the individual while engaging in each activity (i.e., how the individual feels while engaging in the activity). The activities analyzed can include broader categories of activities such as (without limitation) exercising, relaxing, socializing, working, and so on. The activities can in addition or alternatively include more refined activities within each broad category. For example, exercising can include running, walking, biking, and so on; relaxing can include sleeping (including the various stages of rapid eye movement (REM) and non-REM sleep), reading, watching television, and so on; and socializing can include interacting with specific individuals, attending a party, and so on. Similarly, the emotional states analyzed can include broader categories of emotional states and/or more refined emotional states. Broader categories of emotional states can include (without limitation) stressed, relaxed, happy, unhappy, and so on. More refined emotional states can include, for example, terrified, anxious, distressed, depressed, blissful, deeply relaxed, excited, sick, tired, and so on. At least some emotional states can have physical manifestations that can be detected using physiological sensors.
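The broad and refined categories described in paragraph [0017] can be represented as a simple taxonomy. The mapping below mirrors the examples in the text; the data structure and the "unknown" fallback are illustrative assumptions, not part of the disclosure.

```python
# Broad activity categories mapped to refined activities, following the
# examples given above; the structure itself is an illustrative choice.
ACTIVITY_TAXONOMY = {
    "exercising": {"running", "walking", "biking"},
    "relaxing": {"sleeping", "reading", "watching television"},
    "socializing": {"interacting with specific individuals", "attending a party"},
}


def broad_category(refined_activity: str) -> str:
    """Map a refined activity back to its broad category."""
    for category, members in ACTIVITY_TAXONOMY.items():
        if refined_activity in members:
            return category
    return "unknown"
```

The same shape could hold the emotional-state taxonomy (e.g., "relaxed" covering "blissful" and "deeply relaxed").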
[0018] Upon analyzing the activity and emotional state of an individual, suggestions can be made to enable the individual to alter his or her activities to improve the individual's health. Thus, suggestions can be made to avoid certain activities that tend to cause the individual to be stressed or unhappy, or otherwise adversely affect the individual's health, while other suggestions can be made to engage more frequently in activities that cause the individual to be happy, or otherwise positively affect the individual's health. For instance, stress has been shown to adversely affect health. Stress can exacerbate certain medical conditions and illnesses such as (without limitation) diabetes, and can even lead to, or accelerate, the development of a medical condition or illness. With regard to diabetes, stress has been shown to promote the increase of blood glucose levels of a person with diabetes and has been shown to expedite the onset of diabetes. Stress has also been shown to slow healing after a surgery or injury. However, often an individual is not able to determine if he or she is stressed.
[0019] Referring to Fig. 1, a system 100 is shown that monitors changes in the activity and emotional state of a first individual 104 over time. The system 100 includes one or more computing devices 106 associated with the first individual 104 and one or more sensors 108 associated with the first individual 104 that are in communication with the one or more computing devices 106. The system 100 can further include a communications network 102 such as the internet configured to communicate via a wired or wireless connection with at least one of the one or more computing devices 106 and the one or more sensors 108. The system 100 can yet further include a server 116 configured to communicate via a wired or wireless connection with at least one of the one or more computing devices 106, the one or more sensors 108, and the communications network 102. The system 100 can yet still further include one or more computing devices 112 and one or more sensors 114 that are (i) associated with a second individual 110 and (ii) configured to communicate via a wired or wireless connection with at least one of the one or more computing devices 106, the one or more sensors 108, the
communications network 102, and the server 116.
[0020] The one or more computing devices 106 associated with the first individual 104 can include any suitable computing device, such as (without limitation) one or more of a personal computer 106a, a mobile phone 106b, a tablet 106c, and a camera 106d. The one or more computing devices 106 can communicate with one another over a local area network or can communicate with one another over a network that includes one or more of the server 116 and the communications network 102.
[0021] The one or more sensors 108 associated with the first individual 104 include one or more physiological sensors 108a-b and can further include one or more environmental sensors 108c. Each of the one or more sensors 108 associated with the first individual 104 can be implemented as part of one or more computing devices 106 or can be implemented separate from the one or more computing devices 106. Each sensor 108 can be a wearable sensor that is worn by (e.g., physically attached to) the first individual 104, a sensor that is otherwise supported by the first individual 104 (e.g., supported in a pocket of clothing worn by the individual), or a sensor that is spaced apart from, but focused on, the first individual 104. In Fig. 1, three sensors, including two physiological sensors 108a and 108b and one environmental sensor 108c, are shown for illustrative purposes. However, the one or more sensors 108 may include as few as one sensor 108 or more than two sensors 108.
[0022] The one or more physiological sensors 108a-b can include any suitable physiological sensor that characterizes one or more physiological properties of the first individual 104. For example, the one or more physiological sensors 108a-b can include (without limitation) one or more of heart rate sensors, respiration sensors, galvanic skin response sensors, skin temperature sensors, electrocardiography (e.g., ECG or EKG) sensors, electromyography (EMG) sensors, motion sensors, oxygen sensors, blood pressure sensors, and global positioning sensors (GPS). The one or more physiological properties characterized by the one or more physiological sensors can include (without limitation) one or more of heart rate, heart rate variability, respiration, pulse profile, perspiration, skin temperature, blood temperature, electrical activity of the heart, electrical activity of muscles, motion of the individual, acceleration of the individual, body position of the individual, oxygen saturation of the blood, blood pressure, and location. The one or more physiological sensors 108a-b can characterize the one or more physiological properties as (without limitation) one or more of a waveform, a maximum value and minimum value, an average value, a median value, an instantaneous value, and so on.
Examples of physiological sensors include (without limitation) (i) ViSi® Mobile, which is capable of characterizing heart rate, respiration, pulse oximetry, blood pressure, and skin temperature, (ii) BodyMedia® Sensewear®, which is capable of characterizing body motion, temperature, and galvanic skin response, and (iii) Dexcom G4 Platinum®, which is capable of continuously characterizing blood glucose levels.
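The characterization forms listed in paragraph [0022] (maximum, minimum, average, median, instantaneous value) can be sketched as a summary computed over a window of samples. The dictionary keys and the sample data below are illustrative assumptions.

```python
import statistics


def characterize(samples: list) -> dict:
    """Summarize a physiological sensor signal using several of the
    characterizations named in the text: maximum, minimum, average,
    median, and the most recent (instantaneous) value."""
    return {
        "max": max(samples),
        "min": min(samples),
        "average": statistics.fmean(samples),
        "median": statistics.median(samples),
        "instantaneous": samples[-1],
    }


# A hypothetical window of heart-rate samples (beats per minute).
heart_rate = [62, 64, 70, 95, 110, 104, 88]
```

A waveform characterization would instead retain the sample sequence itself; the summary form is merely the simplest of the options the text enumerates.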
[0023] The one or more environmental sensors 108c associated with the first individual 104 can include any suitable environmental sensor that characterizes an environmental property in the immediate vicinity of the first individual 104. Thus, the at least one environmental sensor 108c can characterize environmental properties such as (without limitation) one or more of temperature, pressure, speed, quality, and moisture level of the air that is in contact with the first individual 104. The at least one environmental sensor 108c can further characterize one or more of the intensity of light rays that are in contact with the first individual 104 and the level of noise that reaches the first individual's ears.
[0024] The one or more computing devices 112 associated with the second individual 110 can include any suitable computing device, such as (without limitation) one or more of a personal computer 112a, a mobile phone 112b, a tablet 112c, and a camera 112d. The one or more computing devices 112 can communicate with one another over a local area network or can communicate with one another over a network that includes one or more of the server 116 and the communications network 102.
[0025] The one or more sensors 114 associated with the second individual 110 include one or more physiological sensors 114a-b and can further include one or more environmental sensors 114c. Each of the one or more sensors 114 associated with the second individual 110 can be implemented as part of the one or more computing devices 112 or can be implemented separate from the one or more computing devices 112. The one or more physiological sensors 114a-b can include any suitable physiological sensor that characterizes one or more
physiological properties of the second individual 110, including (without limitation) the physiological sensors and physiological properties discussed above in relation to the one or more physiological sensors 108a-b.
[0026] The one or more environmental sensors 114c can include any suitable environmental sensor that characterizes an environment in the immediate vicinity of the second individual 110. Thus, the one or more environmental sensors 114c can characterize (without limitation) one or more of temperature, pressure, speed, and moisture level of the air that is in contact with the second individual 110. The one or more environmental sensors 114c can further characterize one or more of the intensity of light rays that are in contact with the second individual 110 and the level of noise that reaches the second individual's ears. For illustrative purposes, three sensors, including two physiological sensors 114a and 114b and one
environmental sensor 114c, are shown. However, the one or more sensors 114 may include as few as one sensor 114 or more than two sensors 114.
[0027] The server 116 can communicate with one or more of the computing devices 106 and 112 directly or can communicate with one or more of the computing devices 106 and 112 through the communications network 102. The one or more sensors 108 can communicate directly or indirectly with at least one of the one or more computing devices 106, the communications network 102, and the server 116. For example, the one or more sensors 108 can communicate with the server 116 indirectly through the one or more computing devices 106. Similarly, the one or more sensors 114 can communicate directly or indirectly with at least one of the one or more computing devices 112, the communications network 102, and the server 116. It will be understood that a direct communication refers to a communication between a first communications device and a second communications device without communicating with an intervening communication device between the first and second communications devices.
Further, it will be understood that an indirect communication refers to a communication between a first communications device and a second communications device in which the communication occurs through an intervening communication device between the first and second
communications devices. As used herein, the term "communications device" refers to a device that transmits and/or receives communications signals, including (without limitation) the computing devices 106 and 112, the sensors 108 and 114, the server 116, and the network 102.
[0028] Referring now to Fig. 2, a computing device 200 is shown according to one embodiment that may be used to implement any of (i) the one or more computing devices 106 and (ii) the server 116. The computing device 200 includes a receiver 202, memory 206, and a processor 204 that is in communication with the receiver 202 and memory 206. The computing device 200 can further include at least one notification device 214 in communication with the processor 204 and/or at least one user input device 216 in communication with the processor 204. The computing device 200 can include one or more of the sensors 108 or some or all of the sensors 108 can be separate from the computing device 200.
[0029] The receiver 202 communicates wired or wirelessly with the one or more physiological sensors 108a-b to receive sensor signals 220. The receiver 202 can further communicate wired or wirelessly with the communications network 102 to receive network- accessible media information 222 that relates to activity and emotional state of the first individual 104. The receiver 202 can yet further communicate wired or wirelessly with the at least one environmental sensor 108c to receive environmental information 228 that characterizes an environmental property perceived by the first individual 104.
[0030] The network-accessible media information 222 includes unsolicited, first-party information meaning that the information (i) is generated by the first individual 104, (ii) concerns an activity and emotional state of the first individual 104, and (iii) is generated without the processor 204 prompting the first individual 104 to characterize either of the activity or emotional state. For example, the unsolicited, first-party information can include (without limitation) information that can be mined over the communications network 102 from one or more of sources external to the computing device 200, including (without limitation) social media websites such as Facebook®, Twitter®, Linkedln®, Pinterest®, Google+®, Reddit®, MySpace®, and Instagram ®, internet email sites such as Gmail® and Hotmail®, exercise websites, food and diet sites such as Yelp and Foursquare®, online ordering, bank accounts, and media streaming sites such as Netflix® and Hulu®.
[0031] The network-accessible media information 222 can further include third-party information that (i) is generated by individuals other than the first individual 104, including the second individual 110, and (ii) relates to an activity and emotional state of the first individual 104. The third-party information can include one or both of information that is solicited from the third party and unsolicited information that can be mined over the communications network 102 from one or more sources including (without limitation) the one or more network sources listed above. Further, the third-party information can be mined from the one or more computing devices 112 and from the one or more sensors 114 associated with the second individual 110.
[0032] The at least one notification device 214 provides notifications 224 to the first individual 104 concerning activities and emotional states of the first individual 104. The notifications 224 may be displayed by (without limitation) one or more of a visual notification, an audible notification, and a tactile notification. For example, the at least one notification device 214 can include (without limitation) one or more of a display screen that provides the notifications 224 by displaying a message, a light that provides notification by illuminating, a speaker that provides notification by playing an audible sound, a vibration alert motor that provides notification by vibrating the computing device 200, and any other suitable device for notifying the first individual 104. The at least one notification device 214 can also prompt the first individual 104 for information concerning activities and emotional states of the first individual 104.
[0033] The computing device 200 can further include at least one user input device 216 that facilitates the provision of user input 226 from the first individual 104 to the processor 204. The at least one user input device 216 can include (without limitation) one or more of a touchscreen display screen, a keyboard, a keypad, a mouse, a microphone, and any other suitable device that enables the first individual 104 to provide the user input 226. The at least one user input device 216 can be used by the first individual 104 to provide the user input 226 to the processor 204 in response to prompts provided by the processor 204 or by another device such as computing device 106 or a sensor 108. For example, the first individual 104 can respond to prompts displayed by at least one notification device 214 that prompt the first individual 104 for information regarding one or both of the activity and emotional state of the first individual 104 at a particular time period. Information that is input by the first individual 104 in response to prompting by the processor 204 or by another device is referred to herein as solicited, first-party information.
[0034] The memory 206 can be, for example, any suitable non-transitory and tangible storage medium. The memory 206 stores a program 208 that is run by the processor 204 to analyze changes in activity and emotional state of the first individual 104. The memory 206 also stores one or more databases for the first individual 104, which include a database of one or more baseline physiological profiles 210 for the first individual 104 and can include a database of one or more time-indexed physiological profiles 212 for the first individual 104. The memory 206 can further store media information 218 that can be used by the processor 204 to analyze the activity and emotional state of the first individual 104. This locally-stored media information 218 includes information that concerns an activity and emotional state of the individual 104. The locally-stored media information 218 can be solicited information that was produced in response to a prompt generated by the processor 204 or unsolicited information that was not produced in response to a prompt generated by the processor 204. Further, the locally-stored media information 218 can include information that is generated by any one of the first individual 104 (i.e., first-party information), another individual (i.e., third-party information), and an application run by the device 200. Examples of the locally-stored media information 218 include (without limitation) internet browsing history, shopping lists, emails, calendar appointments, text messages, pictures, video recordings, and audio recordings.
[0035] Each baseline physiological profile 210 summarizes a typical physiological response for the first individual 104 during at least one activity and at least one emotional state. Each baseline physiological profile 210 can be either complete or incomplete. Each complete baseline physiological profile 210 includes (i) at least one characterization of a physiological property, (ii) at least one expected activity that results in the at least one characterization, and (iii) at least one expected emotional state that is expected during the at least one activity. Each characterization of a physiological property is a mathematical interpretation or representation of the one or more sensor signals 220. The at least one expected activity is an activity engaged in by the first individual 104 that normally results in the at least one characterization of a physiological property. Similarly, the at least one expected emotional state is an emotional state of the first individual 104 that normally results in the at least one characterization of a physiological property. Each incomplete baseline physiological profile 210 excludes one or more, but not all, of (i) at least one characterization of a physiological property, (ii) at least one expected activity that results in the at least one characterization, and (iii) at least one expected emotional state that is expected during the at least one activity.
[0036] Referring to Fig. 3, an example of a database 300 of baseline profiles 210 is shown where each row corresponds to one baseline physiological profile 210, the first column corresponds to a baseline index n, where n=1, 2, ..., N, the second column corresponds to an expected activity, the third column corresponds to an expected emotional state, and the fourth through (X+3)th columns correspond to different physiological properties x, where x=1, 2, ..., X. Each complete baseline physiological profile 210 includes a single activity and a single emotional state. The database 300 includes N baseline physiological profiles 210, where the number N is greater than one and can optionally increase over time as profiles are added to the database 300. The database 300 includes one or more complete baseline physiological profiles 210, where each complete baseline physiological profile 210 includes an activity, an emotional state, and a threshold or thresholds for a physiological property. In this example, profiles n=1, 2, 3, 6, and 7 are illustrated as being complete. The database 300 can further include one or more incomplete baseline physiological profiles 210, where each incomplete baseline physiological profile 210 excludes one or more, but not all, of (i) an activity, (ii) an emotional state, and (iii) a threshold for a physiological property. In this example, profiles n=4, 5, 8, 9, and N are shown as being incomplete. It will be understood that the profiles n shown in Fig. 3 are merely exemplary, and that the specific expected activities, emotional states, and degree of completion of each profile n may vary in practice. Although not shown, each baseline physiological profile 210 can also store environmental properties typically observed during the corresponding activity.
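As a purely illustrative sketch (not part of the claimed subject matter), the rows of the database 300 of Fig. 3 could be represented as records holding an expected activity, an expected emotional state, and per-property thresholds. All field names, activities, and numeric values below are hypothetical:

```python
# Hypothetical representation of the Fig. 3 baseline-profile database.
# Each row is one profile n; property x=1 could be, e.g., heart rate.
BASELINE_PROFILES = [
    # Complete profiles: activity, emotional state, and thresholds all present.
    {"n": 1, "activity": "walking", "emotion": "calm",
     "thresholds": {1: (60, 100)}},
    {"n": 2, "activity": "running", "emotion": "energized",
     "thresholds": {1: (120, 170)}},
    # Incomplete profile: thresholds known, activity/emotion not yet learned.
    {"n": 3, "activity": None, "emotion": None,
     "thresholds": {1: (95, 115)}},
]

def is_complete(profile: dict) -> bool:
    """A profile is complete when none of its three parts is missing."""
    return (profile["activity"] is not None
            and profile["emotion"] is not None
            and bool(profile["thresholds"]))
```

A profile that lacks any of its three parts would be treated as incomplete and could later be supplemented, as described in paragraph [0043].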
[0037] Each characterization of a physiological property can include an upper threshold THn(x,U) and a lower threshold THn(x,L) of the physiological property expected for the activity, where x=1, 2, ..., X is the index of the physiological property, U indicates the upper threshold, and L indicates the lower threshold. Alternatively, or in addition, each characterization can include one or more of raw data, an instantaneous value, a median value, an average value, a percentage change, a standard deviation, a slope, or any other suitable interpretation of the physiological property. As an example, suppose that physiological property x=1 is heart rate. The database 300 can store upper and lower thresholds THn(1,U) and THn(1,L) of the heart rate expected when the first user 104 is walking, running, and so forth. Heart rate tends to be higher for running than walking, and therefore, the upper and lower thresholds for running may be higher than those for walking.
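The threshold comparison described above reduces to an interval test. The following sketch is illustrative only; the heart-rate bands are invented for the example and are not specified in this disclosure:

```python
def within_thresholds(value: float, lower: float, upper: float) -> bool:
    """Return True when the interpreted sensor value falls inside the
    expected band [TH(x,L), TH(x,U)] for a profile's physiological property."""
    return lower <= value <= upper

# Illustrative heart-rate bands in beats per minute (hypothetical values).
WALKING_HR = (60, 100)
RUNNING_HR = (120, 170)
```

For instance, an interpreted heart rate of 80 beats per minute would fall within the walking band but not the running band.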
[0038] Referring now to Fig. 4, another example of a database 400 of baseline physiological profiles 210 is shown wherein each baseline physiological profile 210 stores a threshold or thresholds for a single physiological property x. In database 400, each row corresponds to one baseline profile 210, the first column corresponds to a baseline index n, where n=1, 2, ..., N, the second column corresponds to the physiological property x, where x=1, 2, ..., X, the third column corresponds to one or more expected activities, and the fourth column corresponds to one or more expected emotional states. The database 400 includes N baseline physiological profiles 210, where the number N is greater than one and can optionally increase over time as profiles are added to the database 400. The database 400 includes one or more complete baseline physiological profiles 210, where each complete baseline physiological profile 210 includes a characterization of a single physiological property, at least one expected activity, and at least one expected emotional state. In this example, profiles n=1, 2, and 3 are illustrated as being complete. In some cases, a baseline physiological profile 210 can include more than one expected activity and/or more than one expected emotional state.
[0039] The database 400 can further include one or more incomplete baseline physiological profiles 210, where each incomplete baseline physiological profile 210 excludes one or more, but not all, of (i) a characterization of a physiological property, (ii) at least one expected activity, and (iii) at least one expected emotional state. Each physiological property can be characterized in any of the manners discussed above in relation to Fig. 3. Although not shown, each baseline physiological profile 210 can also store environmental properties characterized by one or more environmental sensors 108c. In this example, profiles n=4 and N are shown as being incomplete. It will be understood that the profiles n shown in Fig. 4 are merely exemplary, and that the specific expected activities, emotional states, and degree of completion of each profile n may vary in practice. Further, although not shown, similar databases may be stored in the memory 206 for other physiological properties.
[0040] Each time-indexed physiological profile 212 summarizes a physiological response for the first individual 104 during a particular time period. Similar to baseline physiological profiles 210, each time-indexed physiological profile 212 can be either complete or incomplete. Referring now to Fig. 5, an embodiment of a database 500 of time-indexed physiological profiles 212 is shown where each row corresponds to one time-indexed profile 212, the first column corresponds to a time period or profile index t, where t=1, 2, ..., T, the second column corresponds to a time period, the third column corresponds to an expected activity, the fourth column corresponds to an expected emotional state, the fifth column corresponds to an observed activity, the sixth column corresponds to an observed emotional state, and the seventh through (X+6)th columns correspond to different physiological properties x, where x=1, 2, ..., X.

[0041] Each complete time-indexed physiological profile 212 includes (i) a time period, (ii) at least one characterization of a physiological property recorded during the time period, (iii) an expected activity during the time period, and (iv) an expected emotional state during the time period. In addition, each complete time-indexed physiological profile can include one or more of an observed activity and an observed emotional state that are observed from information retrieved from sources other than the one or more sensor signals 220 (e.g., from network-accessible media information 222, locally-stored media information 218, and/or user input 226) as discussed in further detail below. Although not shown, each time-indexed physiological profile 212 can also store environmental properties characterized by one or more environmental sensors 108c during the time period.
Each incomplete time-indexed physiological profile 212 can exclude one or more, but not all, of (i) a time period, (ii) at least one characterization of a physiological property recorded during the time period, (iii) an expected activity during the time period, and (iv) an expected emotional state during the time period.
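One illustrative way to hold a Fig. 5 row in memory is a record whose expected fields come from baseline matching and whose observed fields come from media information. The constructor name, field names, and values below are all hypothetical:

```python
def make_time_indexed_profile(t, period, measurements,
                              expected_activity=None, expected_emotion=None,
                              observed_activity=None, observed_emotion=None):
    """Build one time-indexed profile record; any missing expected/observed
    field leaves the profile incomplete until it can be supplemented."""
    return {
        "t": t,
        "period": period,
        "measurements": measurements,   # {property x: characterization}
        "expected_activity": expected_activity,
        "expected_emotion": expected_emotion,
        "observed_activity": observed_activity,
        "observed_emotion": observed_emotion,
    }

# Example record: expectations are filled in, observations are still pending.
profile = make_time_indexed_profile(
    t=1, period="09:00-09:30",
    measurements={1: {"max": 142, "min": 118}},
    expected_activity="running", expected_emotion="energized")
```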
[0042] Each characterization of a physiological property is a mathematical
interpretation or representation of the one or more sensor signals 220 at the time period t. Each characterization can include a maximum MAXt(x,U) and a minimum MINt(x,L) of the physiological property x observed for the activity during a time period t. Alternatively, or in addition, each characterization can include at least one of raw data, an instantaneous value, a median value, an average value, or any other suitable interpretation of the physiological property x during the time period t.
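The characterizations named above (maximum, minimum, average, median) can all be computed from the raw samples of a time period. A minimal sketch, with invented sample values:

```python
def characterize(samples):
    """Summarize raw sensor samples for a time period t into the
    characterizations named in the text: max, min, average, and median."""
    ordered = sorted(samples)
    n = len(ordered)
    median = (ordered[n // 2] if n % 2
              else (ordered[n // 2 - 1] + ordered[n // 2]) / 2)
    return {"max": max(samples), "min": min(samples),
            "average": sum(samples) / n, "median": median}

# Hypothetical heart-rate samples recorded during one time period.
hr = characterize([70, 72, 75, 71, 74])
```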
[0043] Referring back to Fig. 2, in general, the processor 204 receives (i) sensor signals 220 from the one or more physiological sensors 108a-b and (ii) media information from one or more of the communications network 102, the memory 206, and the at least one user input device 216. In one aspect of the disclosure, the processor 204 predicts one or both of an activity and emotional state of the first individual 104 at a time period based on a comparison of the one or more sensor signals 220 and one or more baseline physiological profiles 210 stored in the memory 206. In another aspect, the processor 204 compares one or both of the expected activity and expected emotional state determined from the baseline physiological profiles 210 to an observed activity and observed emotional state, respectively, determined from one or more of the media information 218, 222, and 224. In yet another aspect, the processor 204 provides suggestions to the first individual 104 based on at least one of the expected activity, the expected emotional state, the observed activity, and the observed emotional state. In yet still another aspect, the processor 204 predicts (i) one of an activity and emotional state of the first individual 104 based on a comparison of the one or more sensor signals 220 and one or more baseline physiological profiles 210 stored in the memory 206 and (ii) the other of the activity and emotional state of the first individual 104 based on an observed activity and observed emotional state, respectively, determined from one or more of the media information 218, 222, and 224. In even yet still another aspect of the disclosure, the processor 204 creates new baseline
physiological profiles 210 for newly detected activities and/or emotional states based on the one or more sensor signals 220 and/or one or more of the media information 218, 222, and 224. In even yet still another aspect of the disclosure, the processor 204 supplements incomplete baseline physiological profiles 210 based on the one or more sensor signals 220 and/or one or more of the media information 218, 222, and 224.
[0044] It will be understood that the various aspects of the disclosure can be implemented individually, or one or more of the aspects can be implemented in combination. Thus, embodiments of the disclosure are not limited to implementing all of the aspects of the disclosure, nor are embodiments of the disclosure prohibited from implementing all aspects of the disclosure. Further, it will be understood that the various aspects described above are not all- inclusive, and that other aspects apparent to those skilled in the art are within the scope of this disclosure.
[0045] Although not shown, a computing device similar to computing device 200 can be used to implement any one of the computing devices 112 and the server 116 to analyze changes in activities and emotional states of the second individual 110 over time. In such cases, the foregoing description applies similarly to the computing device 112 or server 116, although references to the sensors 108a-c would be replaced with sensors 114a-c and references to the first individual 104 would be replaced with the second individual 110.
[0046] Referring now to Fig. 6, one method 600 is shown that can be implemented by the device 200 of Fig. 2 to analyze changes in the activity and emotional state of the first individual 104 over time. In general, steps 602 to 610 are performed to predict a change in one or both of an activity and emotional state of the first individual 104 based on (i) one or more sensor signals 220 and (ii) one or more baseline physiological profiles 210. If the processor 204 predicts both the activity and emotional state from the one or more baseline profiles 210, then in steps 612 to 616, the processor 204 can further analyze the activity and emotional state to recommend a course of action to the first individual 104. If, on the other hand, the processor 204 does not predict one or both of the activity and emotional state from the one or more baseline profiles 210, then the processor 204 can attempt in steps 620-638 to predict one or both of the activity and emotional state using other methods. If one or both of the activity and emotional state are predicted using other methods, then the processor 204 can update the baseline physiological profiles 210. The updated baseline physiological profiles 210 can then be used during another iteration of the method 600, and at a subsequent time period when the first individual 104 is engaged in the same activity, to predict the activity and emotional state.
[0047] Referring now to individual steps of Fig. 6, in step 602, the receiver 202 of the computing device 200 receives one or more sensor signals 220 from the one or more
physiological sensors 108a-b. Each sensor signal 220 characterizes a physiological property of the first individual 104 during a time period. Each characterization can be raw sensor data or may be interpreted sensor data such as (without limitation) maximum and minimum values, an average value, a median value, or even an instantaneous value as described above. In step 604, the processor 204 monitors the one or more sensor signals 220 to detect a change in the one or more sensor signals 220 that is indicative of a change in one or both of the activity in which the first individual 104 is engaged and the emotional state of the first individual 104. If it is determined in step 606 that the processor 204 did not detect such a change, then the computing device 200 can repeat steps 602 to 606 until a change is detected.
[0048] Referring now to Fig. 7, a simplified flow diagram is shown of a method 700 according to one embodiment that may be used to implement step 604. In step 702, the processor 204 can interpret the one or more sensor signals 220 received from the one or more physiological sensors 108a-b, assuming that the one or more sensor signals 220 have not already been interpreted. For example, the processor 204 can determine maximum and minimum values, an average value, a median value, or even an instantaneous value from raw sensor data in the event that the one or more sensor signals 220 contain raw sensor data. In step 704, the processor 204 retrieves a physiological profile from the memory 206, and in step 706, the processor 204 compares the retrieved physiological profile to the one or more sensor signals 220. The physiological profile can be, for example, a baseline profile 210 for the first individual 104 in a relaxed state. Accordingly, the processor 204 can compare the received one or more sensor signals 220 to the baseline profile 210 to determine when the first individual 104 deviates from a relaxed state. The physiological profile can also be a time-indexed profile 212 for the most recently detected activity and/or emotional state. Accordingly, the processor 204 can compare the received one or more sensor signals 220 to the time-indexed profile 212 to determine when the first individual 104 deviates from the most recently detected activity and/or emotional state.
[0049] If the processor 204 determines in step 708 that the sensor signal 220 matches the retrieved physiological profile, then the processor 204 concludes in step 712 that the first individual's activity and/or emotional state has not changed. If on the other hand, the processor 204 determines that the sensor signal 220 does not match the retrieved physiological profile, then the processor 204 concludes in step 710 that the activity and/or emotional state has changed. As used herein, the term "match" (and variants on this term) does not require an exact match but can be a substantial match, i.e., a match of greater than 50%.
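One hedged way to realize the "substantial match" test of steps 708-712 is to count the fraction of the period's samples that fall within the profile's band; the 50% cut-off follows the text, while the sampling scheme is an assumption of this sketch:

```python
def substantially_matches(samples, band, threshold=0.5):
    """A 'match' per paragraph [0049] need not be exact: here, a profile
    matches when more than `threshold` of the samples fall within its band."""
    lower, upper = band
    in_band = sum(1 for s in samples if lower <= s <= upper)
    return in_band / len(samples) > threshold
```

With a hypothetical relaxed-state band of (55, 75), three of four samples in band would count as a match, while one of four would not.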
[0050] Referring back to Fig. 6, when a change in activity and/or emotional state is detected in step 606, the processor 204 proceeds to step 608 to attempt to determine an expected activity and an expected emotional state for the time period when the change occurred. In general, to determine an expected activity and expected emotional state, the processor 204 performs a comparison based on the one or more sensor signals 220 and one or more baseline physiological profiles 210 stored in the memory 206. For each baseline physiological profile 210 considered, the processor 204 performs a comparison based on a characterization of at least one physiological property stored in the baseline physiological profile 210 and at least one of the sensor signals 220. The comparison may be performed by comparing the at least one sensor signal 220 directly to the characterization of the at least one physiological property or by comparing an interpretation of the at least one sensor signal 220 such as (without limitation) an instantaneous value, a median value, an average value, or any other suitable interpretation, to the characterization of the at least one physiological property.
[0051] In some cases, one or more of an activity and an emotional state can be detected by considering a single physiological property. In other cases, one or more of an activity and an emotional state can be detected by considering multiple physiological properties. For example, if the one or more sensors 108 include a heart rate sensor and an accelerometer, then physical activity such as running can be distinguished from stress. Thus, if heart rate increases but the accelerometer indicates that the first individual 104 is not moving, then the processor 204 can conclude that the heart rate change is due to stress, not exercise. If, on the other hand, heart rate increases and the accelerometer indicates that the first individual 104 is moving, then the processor 204 can conclude that the heart rate change is due to exercise, not stress.
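The heart-rate/accelerometer disambiguation of paragraph [0051] can be sketched as a simple decision rule. The elevated-heart-rate threshold below is an invented value for illustration:

```python
def classify_hr_increase(heart_rate, is_moving, elevated_hr=100):
    """Attribute an elevated heart rate to exercise when the accelerometer
    indicates motion, and to stress when it does not (hypothetical threshold)."""
    if heart_rate < elevated_hr:
        return "no change"
    return "exercise" if is_moving else "stress"
```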
[0052] Referring now to Fig. 8, a simplified flow diagram is shown of a method 800 according to one embodiment that may be used to implement step 608. The processor 204 may perform the method 800 using baseline physiological profiles 210 such as those shown in Fig. 3. In step 802, the processor 204 selects and retrieves a baseline physiological profile 210 from the memory 206. In step 804, the processor 204 performs a comparison based on a characterization of at least one physiological property stored in the selected baseline physiological profile 210 and at least one of the sensor signals 220. The comparison may be performed by comparing the at least one sensor signal 220 directly to the characterization of the at least one physiological property or by comparing an interpretation of the at least one sensor signal 220 such as (without limitation) an instantaneous value, a median value, an average value, or any other suitable interpretation, to the characterization of the at least one physiological property.
[0053] If the processor 204 determines in step 806 that the at least one sensor signal 220 does not match the characterization of the at least one physiological property stored in the selected baseline profile 210, then the processor 204 repeats steps 802 to 806 until either the processor 204 finds a match in step 806 or the processor 204 has considered all of the baseline physiological profiles 210 in the memory 206 in step 812. If all baseline profiles 210 are considered without finding a match, then the processor 204 concludes in step 814 that it did not predict the activity or emotional state. In this case, there are no baseline physiological profiles 210 stored in the memory 206 for the activity and emotional state. If, and when, the processor 204 determines that the at least one sensor signal 220 matches the characterization of the at least one physiological property stored in the selected baseline physiological profile 210, the processor 204 proceeds to step 808.
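The scan of steps 802-814 can be sketched as a loop over the stored baseline profiles that returns the first matching profile's expectations, or no prediction when every profile has been considered. Profile contents and bands below are hypothetical, and the averaged-sample interpretation is one of the options the text allows:

```python
def predict_from_baselines(samples, profiles):
    """Method 800 sketch (steps 802-814): scan baseline profiles until one
    whose band matches the interpreted signal is found; return its stored
    expected activity and emotional state, or (None, None) at step 814."""
    value = sum(samples) / len(samples)        # interpreted sensor signal
    for p in profiles:
        lower, upper = p["band"]
        if lower <= value <= upper:
            return p.get("activity"), p.get("emotion")
    return None, None

# Hypothetical baseline profiles (heart-rate bands in beats per minute).
PROFILES = [
    {"band": (55, 75), "activity": "resting", "emotion": "calm"},
    {"band": (120, 170), "activity": "running", "emotion": "energized"},
]
```

An incomplete profile would simply return `None` for whichever of the two expectations it lacks, matching the branches of step 808.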
[0054] In step 808, the processor 204 determines if one or both of an expected activity and an expected emotional state are stored in the selected baseline physiological profile 210. If the processor 204 determines that both of an expected activity and an expected emotional state are stored in the selected baseline physiological profile 210, then, in step 810, the processor 204 predicts that the activity engaged in by the first individual 104 is the stored expected activity and the emotional state of the first individual 104 is the stored expected emotional state. In this case, the processor 204 has arrived at a complete baseline physiological profile 210. Further, if in step 808, the processor 204 determines that only one of an expected activity and an expected emotional state are stored in the selected baseline physiological profile 210, then, in step 810, the processor 204 concludes that the activity engaged in by the first individual 104 is the stored expected activity or the emotional state of the first individual 104 is the stored expected emotional state. In this case, the processor 204 has arrived at an incomplete baseline
physiological profile 210 that stores a characterization of at least one physiological property and one, but not both, of an expected activity and an expected emotional state. If, in step 808, the processor 204 determines that neither of an expected activity and an expected emotional state is stored in the selected baseline physiological profile 210, then, in step 814, the processor 204 concludes that the processor 204 did not predict the activity or emotional state. In this case, the processor 204 has arrived at an incomplete baseline physiological profile 210 that stores a characterization of at least one physiological property but not an expected activity or an expected emotional state.

[0055] Referring now to Fig. 9, a simplified flow diagram is shown of a method 900 according to another embodiment that may be used to implement step 608. The processor 204 may perform the method 900 based on baseline physiological profiles 210 such as those shown in Fig. 4. In general, for each physiological property considered, the method 900 performs a comparison based on the baseline physiological profiles 210 for the physiological property and one or more of the sensor signals 220 for the physiological property. In so doing, the processor 204 may arrive at multiple possible expected activities and multiple possible expected emotional states for each physiological property. The processor 204 may then predict one or both of the activity and emotional state by finding a possible expected activity and/or possible expected emotional state that is common to the physiological properties considered.
[0056] In step 902, the processor 204 selects a physiological property to analyze that corresponds to one or more of the physiological sensors 108a-b. In step 904, the processor 204 retrieves a baseline physiological profile 210 from the memory 206 for the selected physiological property. In step 906, the processor 204 performs a comparison based on a characterization of the selected physiological property stored in the selected baseline physiological profile 210 to at least one of the sensor signals 220. The comparison may be performed by comparing the at least one sensor signal 220 directly to the characterization of the at least one physiological property or by comparing an interpretation of the at least one sensor signal 220 such as (without limitation) an instantaneous value, a median value, an average value, or any other suitable interpretation, to the characterization of the at least one physiological property.
[0057] If the processor 204 determines in step 908 that the at least one sensor signal 220 does not match the selected physiological property stored in the selected baseline profile 210, then the processor 204 repeats steps 904 to 908 until either the processor 204 finds a match in step 908 or the processor 204 determines in step 914 that all of the baseline physiological profiles 210 stored in memory 206 for the selected physiological property have been considered. If all profiles for the selected physiological property are considered without finding a match, then the processor 204 concludes in step 916 that it did not predict the activity or emotional state for the physiological property.
[0058] If, and when, the processor 204 determines in step 908 that the at least one sensor signal 220 matches the physiological property stored in a selected baseline physiological profile 210, the processor 204 proceeds to step 910. In step 910, the processor 204 determines if one or more expected activities and/or one or more expected emotional states are stored in the selected baseline physiological profile 210. If one or more expected activities and/or one or more expected emotional states are stored in the selected baseline profile 210, then the processor 204 concludes in step 912 that the one or more expected activities and/or one or more expected emotional states are possible matches. If, on the other hand, it is determined in step 910 that neither an expected activity nor an expected emotional state is stored, then the processor 204 concludes in step 916 that it did not predict the activity or emotional state for the physiological property.
[0059] Once all baseline physiological profiles 210 have been considered or once a match has been found, a next physiological property can be selected in step 920, although in some embodiments, the processor 204 might consider only one physiological property. In the event that the processor considers more than one physiological property, steps 902-918 are repeated for the next physiological property.
[0060] Once all physiological properties have been considered, the processor 204 analyzes in step 922 the possible matches determined in step 912 for each physiological property to determine if a single expected activity and/or a single expected emotional state can be found that is common to all of the physiological properties. If the processor 204 determines that there is a common expected activity, then the processor 204 predicts in step 924 that the activity engaged in by the first individual 104 is the common expected activity. Similarly, if the processor 204 determines that there is a common expected emotional state, then the processor 204 predicts in step 924 that the emotional state of the first individual 104 is the common expected emotional state. If there is not a common expected activity or expected emotional state, then the processor 204 concludes in step 926 that it has not predicted the activity or emotional state of the first individual 104.
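Step 922's search for an expectation common to all physiological properties reduces to a set intersection over the per-property candidate lists of step 912. A minimal sketch, with hypothetical candidates:

```python
def common_prediction(matches_per_property):
    """Step 922 sketch: intersect the possible expected activities (or emotional
    states) found for each physiological property; predict only when exactly
    one candidate is common to all properties considered (step 924 vs. 926)."""
    common = set(matches_per_property[0])
    for candidates in matches_per_property[1:]:
        common &= set(candidates)
    return common.pop() if len(common) == 1 else None
```

For example, if heart rate suggests running or cycling while the accelerometer suggests only running, the common prediction would be running; disjoint candidate sets yield no prediction.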
[0061] Referring back to Fig. 6, if it is determined in step 610 that the processor 204 successfully predicted both an expected activity and an expected emotional state, then the processor 204 can present feedback in the form of a notification 224 to the first individual 104 in step 614. In so doing, the processor 204 can simply disclose the expected activity and expected emotional state to the first individual 104. Alternatively, the processor 204 can further analyze the predicted activity and emotional state in step 612 and provide more detailed feedback to the first individual 104 in step 614. For example, in step 612, the processor 204 can attempt to determine if one or more of the activity and emotional state expected in step 608 matches one or more of an observed activity and observed emotional state observed by another method.
[0062] One such method includes, in step 612, detecting at least one of an observed activity and an observed emotional state through unsolicited media information, and comparing the at least one of the observed activity and observed emotional state to the at least one expected activity and expected emotional state predicted in step 608. Referring now to Fig. 10, a flow diagram is shown of a method 1000 according to one embodiment that can be used to implement step 612 in Fig. 6. In step 1002, the processor 204 retrieves one or more of unsolicited network-accessible media information 222 from the communications network 102 and unsolicited locally-stored media information 218 from the memory 206. To retrieve unsolicited network-accessible media information 222 such as information stored on a password-protected website, the processor 204 can be configured to log onto the website.
[0063] In step 1004, the processor 204 analyzes the unsolicited information using, for example, semantic analysis of text or audio (including natural language processing and machine learning), image analysis for pictures and video, and geolocation. The processor 204 analyzes the unsolicited media information to detect one or both of an activity and emotional state, and a time period for the activity and/or emotional state. The processor 204 can further analyze the unsolicited information to detect who generated the information and whether or not the information is relevant to the first individual 104, in the event that the processor 204 considers unsolicited information generated by a third-party (e.g., second individual 110).
Retrieval and analysis of the unsolicited media information can occur upon detecting a change in activity and/or emotional state based on the one or more sensor signals 220. Alternatively, or in addition, in some cases analysis of the unsolicited media information (e.g., a calendar appointment or a posting about an upcoming event) can occur before a change has been detected, and then the information can be stored until a later time period when one or more sensor signals 220 or additional media are received for the time period.
[0064] As is known to those skilled in the art, natural language processing and machine learning can identify keywords, triggers, or word strings from text or audio of the unsolicited media. The identified keywords, triggers, or word strings can be identified from one or more of (i) a subject, action, or object of the string to show the relationship to the user, (ii) adjectives, verbs, adverbs, and nouns, (iii) punctuation of the string including emoticons, and (iv) connected strings tied to an event. Each piece of data from the unsolicited media can be used to provide ranking, weighting, tone, and positive or negative sentiment to the analyzed information. For example, suppose that the first individual 104 posts on a microblog (e.g., twitter®) that "I had the worst day at work!" Multiple natural language processing methods can be used to analyze the post, such as sentiment analysis, topic analysis, named entity recognition, optical character recognition, and relationship extraction. Weighting can be modified for specific areas of the sentence, such as "worst," which would have a higher weighting or impact on an activity and/or emotional state compared to "bad".

[0065] If it is determined in step 1006 that the processor 204 does not detect an activity or emotional state from the unsolicited media information, or if it is determined in step 1008 that the time period detected for the unsolicited media information does not match the time period for the one or more sensor signals 220, then the processor 204 concludes in step 1010 that it did not detect an observed activity or observed emotional state for the time period. If, on the other hand, the processor 204 determines that the time period matches, and the processor detects one or more of an observed emotional state and observed activity, then the processor 204 proceeds to step 1012.
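The keyword weighting described in paragraph [0064] can be sketched in a few lines. The lexicon, the numeric weights, the punctuation multiplier, and the coarse state labels below are illustrative assumptions for this sketch, not part of the disclosure:

```python
# Minimal sketch of keyword-weighted sentiment scoring for a short post.
import re

# Hypothetical sentiment lexicon: word -> (polarity, weight).
# "worst" carries more weight than "bad", per the example above.
LEXICON = {
    "worst": (-1.0, 3.0),
    "bad":   (-1.0, 1.0),
    "great": (+1.0, 2.0),
    "happy": (+1.0, 2.0),
}

def score_post(text):
    """Return an aggregate sentiment score and a coarse emotional state."""
    score = 0.0
    for word in re.findall(r"[a-z']+", text.lower()):
        polarity, weight = LEXICON.get(word, (0.0, 0.0))
        score += polarity * weight
    # Exclamation marks amplify the detected tone (cf. the punctuation cue
    # in item (iii) above); the 0.5 factor is an arbitrary choice.
    score *= 1.0 + 0.5 * text.count("!")
    state = "negative" if score < 0 else "positive" if score > 0 else "neutral"
    return score, state

print(score_post("I had the worst day at work!"))  # strongly negative
print(score_post("I had a bad day at work."))      # mildly negative
```

A production system would use trained sentiment models rather than a fixed lexicon, but the ranking effect of "worst" versus "bad" is the same idea.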
[0066] In step 1012, the processor 204 determines if the one or more of the observed emotional state and observed activity concerns the first individual 104. In other words, the processor 204 determines if the one or more of the observed emotional state and observed activity characterizes one or more of an emotional state and activity of the first individual 104 as opposed to someone else. For example, the processor 204 can determine if a posting by the first individual 104 or a third party (in the event that third-party information is considered) describes an activity and/or emotional state of the first individual 104 rather than a third party. If it is determined in step 1012 that the unsolicited media information concerns the first individual 104, then the processor 204 concludes in step 1014 that it has determined an observed activity and/or observed emotional state. If, on the other hand, it is determined in step 1012 that the unsolicited media information does not concern the first individual 104, then the processor 204 concludes that it has not determined an observed activity and/or observed emotional state.
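The relevance check of step 1012 might be sketched as follows. The token-based first-person detection, the user identifiers, and the alias list are hypothetical simplifications; a real system would use named entity recognition and relationship extraction as described above:

```python
# Sketch of step 1012: decide whether a post describes the monitored
# individual rather than a third party.

FIRST_PERSON = {"i", "i'm", "me", "my", "we"}

def concerns_individual(post_text, author_id, individual_id, aliases=()):
    """True if the post plausibly characterizes the monitored individual."""
    tokens = {t.strip(".,!?").lower() for t in post_text.split()}
    if author_id == individual_id:
        # The individual's own first-person posts describe the individual.
        return bool(tokens & FIRST_PERSON)
    # A third-party post is relevant only if it names the individual.
    return any(alias.lower() in tokens for alias in aliases)

# Own first-person post -> relevant.
assert concerns_individual("I had the worst day at work!", "u104", "u104")
# Third party talking about someone unnamed -> not relevant.
assert not concerns_individual("He seems fine.", "u110", "u104", aliases=("Alex",))
```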
[0067] Referring back to Fig. 6, if at least one of an observed emotional state and an observed activity is determined, then, in step 612, the processor 204 can detect at least one of a similarity and a difference between (i) the at least one of the expected emotional state and the expected activity and (ii) the at least one of the observed emotional state and the observed activity. The processor 204 can then generate a notification to present to the first individual 104 in step 616 based on the at least one of the similarity and difference.
[0068] If a similarity is detected, then the processor 204 can simply notify the first individual 104 that the observed activity and/or emotional state matches the expected activity and/or emotional state. Alternatively, the processor 204 can generate more specific feedback. For example, the processor 204 can provide coaching in the form of incremental steps to be performed by the first individual 104 over time to accomplish a task or reach a goal. For instance, when the first individual 104 is happy, the processor 204 can provide positive feedback to reinforce the happy or excited emotions. If, on the other hand, the user is angry or agitated, the processor 204 can provide feedback on how to modify the behavior (e.g., provide breathing techniques) or even notify the user that they are exhibiting characteristics of the emotional state.
[0069] As another example, the processor 204 may provide feedback in the form of recommendations to continue a behavior or discontinue a behavior altogether. As yet another example, the processor 204 may provide an alert or warning in the event that a positive or negative threshold is reached. For instance, the processor 204 can notify the first individual 104 that a target heart rate has been exceeded. As yet still another example, the processor 204 may provide feedback in the form of goals or targets achieved. For instance, the processor 204 may provide feedback when the first individual 104 reaches a goal of running four times in a week.
[0070] If a difference is detected, then the processor 204 can attempt to determine one or more reasons for the difference. For example, the processor 204 can identify differences between an activity and an expected activity. For instance, suppose that the first individual 104 had a bad run. The processor 204 can determine by analyzing the baseline profiles 210 and time- indexed profiles 212 stored in the memory 206 that the run was longer than a typical run, that the first individual 104 had a meal closer to the run than typical, or that the first individual 104 had a poor night sleep the night before. These possibilities can then be presented to the first individual 104 in step 616. As another example, the processor 204 can request feedback from the first individual 104 as to the reasons for the difference between the activity and the expected activity. As yet another example, the processor 204 can wait until further information is available (e.g., via one of media information 218 or 222) to determine the reason for the difference between the activity and the expected activity.
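The reason-finding comparison in paragraph [0070] could look roughly like the following. The field names, the 20% distance threshold, and the one-hour sleep margin are assumptions for illustration, not values from the disclosure:

```python
# Sketch of reason finding: compare a session's context against the stored
# baseline to propose why an observed state differed from the expected one.

def candidate_reasons(session, baseline):
    """Return human-readable explanations for an unexpected outcome."""
    reasons = []
    if session["distance_km"] > 1.2 * baseline["distance_km"]:
        reasons.append("The run was longer than a typical run.")
    if session["minutes_since_meal"] < baseline["minutes_since_meal"]:
        reasons.append("You ate closer to the run than usual.")
    if session["sleep_hours"] < baseline["sleep_hours"] - 1:
        reasons.append("You slept poorly the night before.")
    return reasons

# Hypothetical baseline profile and a "bad run" session.
baseline = {"distance_km": 5.0, "minutes_since_meal": 120, "sleep_hours": 7.5}
session = {"distance_km": 8.0, "minutes_since_meal": 45, "sleep_hours": 5.0}
print(candidate_reasons(session, baseline))
```

Each returned string corresponds to one of the possibilities the processor 204 would present in step 616.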
[0071] In step 614, a new time-indexed profile 212 can be stored in the memory 206. The new time-indexed profile 212 can include the one or more sensor signals 220, the time period for the one or more sensor signals 220, the expected activity, the expected emotional state, and optionally one or more of an observed activity and observed emotional state if determined in step 612. This information can be used to develop a trend over time as to the first individual's feelings toward the activity. This trend can then be used at a later time to update a baseline physiological profile 210 in the memory 206 to reflect the first individual's feelings toward the activity. For example, the trend can be used to reinforce the baseline profile 210 when the observed activity and emotional state repeatedly match the expected activity and emotional state. Alternatively, the trend can be used to change the baseline profile 210 to reflect a new emotional state for an activity when the observed emotional state repeatedly differs from the expected emotional state.

[0072] In step 616, the notification 224 generated by the processor 204 can be presented to the first individual 104. The notification 224 can be presented to the first individual 104 through notification device 214 of the same device 200 that analyzes the activity and emotional state. Alternatively, the notification 224 can be presented to the first individual 104 using a separate device. For example, suppose that the device 200 implements the server 116 of Fig. 1. In this case, the server 116 can provide the notification 224 to one or more of the computing devices 106a to 106d, and the one or more computing devices 106a to 106d can present the notification to the first individual 104.
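One possible reading of the trend logic of step 614, where the baseline's expected emotional state is replaced after repeated mismatches, is sketched below. The dictionary layout and the repetition threshold of three observations are hypothetical:

```python
# Sketch of trend-based baseline updating: reinforce the baseline when
# observations match, or adopt a new expected state after repeated mismatches.
from collections import Counter

def update_baseline(baseline, time_indexed_profiles, threshold=3):
    """Revise the expected emotional state from recent observations."""
    observed = Counter(
        p["observed_state"] for p in time_indexed_profiles
        if p.get("observed_state") and p["activity"] == baseline["activity"]
    )
    if not observed:
        return baseline
    state, count = observed.most_common(1)[0]
    if count >= threshold and state != baseline["expected_state"]:
        baseline = dict(baseline, expected_state=state)
    return baseline

# Three consecutive "unhappy" observations override the stored "happy".
profiles = [{"activity": "run", "observed_state": "unhappy"}] * 3
b = update_baseline({"activity": "run", "expected_state": "happy"}, profiles)
print(b["expected_state"])
```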
[0073] The notification 224 can simply inform the first individual 104 of one or both of the activity and emotional state detected in step 608. Alternatively, or in addition, the notification 224 can provide more detailed feedback to the first individual 104 as described above. For example, the notification 224 can inform the first individual 104 that a detected interaction with a specific individual (e.g., the second individual 110) caused the first individual 104 to be stressed. As another example, the processor 204 can determine, by reviewing the database of time-indexed physiological profiles, that the first individual 104 was stressed during several past interactions with the specific individual.
Therefore, the processor 204 can inform the first individual 104 of the past interactions, and can recommend that the first individual 104 alter his or her interactions with the specific individual or avoid interactions with the individual altogether. Additionally, feedback can be provided to other individuals (e.g., the second individual 110) about the state of the first individual 104 and whether to approach the first individual 104 based upon the emotional state of the first individual 104. Feedback can also be provided to other individuals regarding how to change the first individual's emotional state based upon past activities that have successfully changed the first individual's emotional state.
[0074] Referring back to step 610 of Fig. 6, if the processor 204 does not predict both the activity and emotional state, then the processor 204 proceeds to the right side of the flow diagram to attempt in steps 620-638 to add to, update, or supplement the database of baseline physiological profiles 210 for use during a subsequent time period. In step 620, the time period, the one or more received sensor signals 220 for the change, and any activity or emotional state detected in step 608 are stored in the memory 206. These items can be stored by, for example, creating a new time-indexed physiological profile 212.
[0075] In step 622, the processor 204 attempts to determine one or more of an observed activity and an observed emotional state through unsolicited media information. Step 622 can be implemented in a manner similar to that described above in relation to Fig. 10. The time period stored in step 620 can be compared in step 1008 of Fig. 10 to the time period determined from the unsolicited media information in step 1004 of Fig. 10. If it is determined in step 624 that the processor 204 detected one or both of an observed activity and an observed emotional state, then the observed activity and/or observed emotional state detected can be stored in the memory 206 in step 626. Thus, if an observed activity is detected in step 622 and if an expected activity was not detected in step 608, then the baseline physiological profile 210 can be updated to store the observed activity as the expected activity. Similarly, if an observed emotional state is detected in step 622 and if an expected emotional state was not detected in step 608, then the baseline physiological profile 210 can be updated to store the observed emotional state as the expected emotional state. In addition, a new time-indexed physiological profile 212 can be created if not already created in step 620 to store the observed activity and/or observed emotional state.
Alternatively, if a new time-indexed physiological profile 212 was created in step 620, the new time-indexed physiological profile 212 can be updated to store the observed activity and/or observed emotional state.
[0076] If in step 624 it is determined that the processor 204 did not detect either an observed activity or an observed emotional state, or if in step 630 an activity or emotional state has still not been detected, then in step 628, the processor 204 can attempt other methods to detect one or both of an observed activity and an observed emotional state. For example, the processor 204 can generate a notification 224 to solicit input from the first individual 104. The notification 224 can be presented to the first individual 104 through at least one notification device 214 of the same device 200 that analyzes the activity and emotional state. Alternatively, the notification 224 can be presented to the first individual 104 using a notification device that is separate from the device 200. For example, suppose that the device 200 implements the server 116 of Fig. 1. In this case, the server 116 can provide the notification 224 to one or more of the computing devices 106a to 106d, and the one or more computing devices 106a to 106d can present the notification 224 to the first individual 104.
[0077] The first individual 104 can respond to the notification 224 by inputting one or both of an activity and an emotional state into at least one input device. Each input device can be implemented in the same computing device 200 that analyzes the activity and emotional state or can be implemented separate from the computing device 200 that analyzes the activity and emotional state. For example, if the device 200 implements the server 116 of Fig. 1, then the first individual 104 can provide input to one or more of the computing devices 106a to 106d, and the one or more computing devices 106a to 106d can in turn provide the input to the server 116.

[0078] If it is determined in step 632 that another method has detected one or both of an observed activity and an observed emotional state, then the observed activity and/or observed emotional state can be stored in step 634 as discussed above in relation to step 626. If, on the other hand, it is determined in step 632 that the processor 204 has not detected one or both of an observed activity and an observed emotional state by another method, then a baseline
physiological profile 210 can be stored in memory 206 in step 636 with the sensor signal 220 but without one or more of the expected activity and expected emotional state. In step 638, the processor 204 can optionally provide feedback in the form of a notification 224 to the first individual 104. In so doing, the processor 204 can simply disclose the expected activity and emotional state, if determined, to the first individual 104. Notification can be made using at least one notification device as discussed above in relation to step 616. Feedback can also be requested from other individuals (e.g., the second individual 110) to confirm the first individual's emotional state and activities. For example, another individual (e.g., the second individual 110) can be prompted to input whether or not the first individual 104 is also participating in an activity.
[0079] In step 618, the processor 204 can continue monitoring the one or more sensor signals 220, in which case the processing returns to step 602, and method 600 is repeated.
Alternatively, in step 618, the processor 204 can stop monitoring the one or more sensor signals 220 (e.g., if the device 200 is turned off). If the processor 204 supplements or completes a baseline physiological profile 210 during an iteration of method 600, then the supplemented or completed baseline physiological profile 210 can be used in a subsequent iteration of the method 600.
[0080] Fig. 6 shows a method wherein the processor 204 detects changes in one or more of an activity and emotional state by analyzing one or more sensor signals 220. However, according to various aspects of the disclosure, changes in one or more of an activity and emotional state can be detected using other methods. For example, a change in one or more of an activity and an emotional state can be detected based on unsolicited media information. The change in activity and/or emotional state can then be correlated with one or more sensor signals 220 received before, concurrently with, or after the generation of the unsolicited media information. The correlation can be performed by detecting a time period for the activity and/or emotional state and matching the time period to a time period of the sensor signals 220. As an example, consider Fig. 11.
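The time-period correlation described in paragraph [0080] might be sketched as a windowed match between a media-derived event and timestamped sensor samples. The 30-minute tolerance and the record layout are assumptions, chosen to reflect that signals may arrive before or after the media is generated:

```python
# Sketch of correlating a media-derived event with sensor readings whose
# timestamps fall within the event's time period, plus some slack.
from datetime import datetime, timedelta

def matching_signals(event_start, event_end, signals, tolerance_min=30):
    """Return sensor samples whose timestamps overlap the event window."""
    slack = timedelta(minutes=tolerance_min)
    lo, hi = event_start - slack, event_end + slack
    return [s for s in signals if lo <= s["timestamp"] <= hi]

signals = [
    {"timestamp": datetime(2015, 12, 18, 7, 10), "heart_rate": 148},
    {"timestamp": datetime(2015, 12, 18, 12, 0), "heart_rate": 72},
]
# A post about a 7:00-7:45 run matches only the 7:10 sample.
hits = matching_signals(datetime(2015, 12, 18, 7, 0),
                        datetime(2015, 12, 18, 7, 45), signals)
print(len(hits))
```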
[0081] Fig. 11 shows a simplified flow diagram of another method 1100 that can be implemented by the device 200 of Fig. 2 to analyze changes in the activity and emotional state of the first individual 104 over time. In step 1102, the processor 204 retrieves one or more of unsolicited network-accessible media information 222 and unsolicited locally-stored media information 218. To retrieve unsolicited network-accessible media information 222 such as information stored on a password-protected website, the processor 204 can be configured to log onto the website if needed.
[0082] In step 1104, the processor 204 analyzes the unsolicited information in a manner similar to that discussed above in relation to step 1004 of Fig. 10 (e.g., semantic analysis of text or audio, image analysis of pictures and video, and geolocation). If it is determined in step 1106 that the processor 204 did not detect an activity or emotional state from the unsolicited media information, or if it is determined in step 1108 that the processor 204 did not detect a time period for the unsolicited media information, then the processor 204 concludes in step 1112 that it did not detect an observed activity or observed emotional state. In this case, the processor 204 can continue or discontinue monitoring the unsolicited media information in step 1128. If the processor continues monitoring, then processing returns to step 1102. If, on the other hand, the processor 204 detects one or more of an observed emotional state and observed activity, and the processor 204 determines that the time period matches in step 1108, then the processor 204 proceeds to step 1110.
[0083] In step 1110, the processor 204 determines if the unsolicited media information concerns the first individual 104. For example, the processor 204 can determine if a posting by the first individual 104 or a third party (in the event that third-party information is considered) describes an activity and/or emotional state of the first individual 104 or a third party. If the observed activity and/or observed emotional state does not concern the first individual 104, then the processor 204 concludes in step 1112 that it has not determined an observed activity and/or emotional state. If, on the other hand, the unsolicited media information concerns the first individual 104, then, in step 1116, the processor 204 attempts to find a time-indexed profile 212 in the memory 206. In particular, the processor 204 searches for a time-indexed profile 212 that stores a time period that matches the time period determined by the processor 204 in step 1104.
[0084] If it is determined in step 1120 that the processor 204 did not identify a matching time-indexed profile 212 in the memory 206, then the processor 204 can create a new time-indexed profile 212 in step 1118 and store the newly-created profile 212 in the memory 206. In this case, the newly-created profile 212 is incomplete and does not include one or more sensor signals 220, an expected activity, or an expected emotional state. The newly-created profile 212 can be completed at a subsequent time period by performing, for example, the method 600 in Fig. 6.

[0085] If it is determined in step 1120 that the processor 204 did identify a matching time-indexed profile 212 in the memory 206, then the processor 204 can update the time-indexed profile 212 to store the observed activity and/or observed emotional state in step 1122. The time-indexed profile 212 can then be used in step 1124 to create a new baseline profile 210 if one does not exist for the activity and/or emotional state.
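The find-update-or-create flow of steps 1116 to 1122 can be sketched as follows. The dictionary layout is a hypothetical simplification of profiles 210 and 212:

```python
# Sketch of steps 1116-1122: look up a time-indexed profile whose stored
# time period matches the one detected from the media; update it if found,
# otherwise create an incomplete placeholder profile (step 1118).

def find_or_create_profile(profiles, period, observed_state):
    """Update a matching profile, or append an incomplete one."""
    for profile in profiles:
        if profile["period"] == period:
            profile["observed_state"] = observed_state   # step 1122
            return profile
    new_profile = {                                      # step 1118
        "period": period,
        "observed_state": observed_state,
        "sensor_signals": None,      # to be filled by a later pass of Fig. 6
        "expected_state": None,
    }
    profiles.append(new_profile)
    return new_profile

store = [{"period": ("07:00", "07:45"), "observed_state": None}]
p = find_or_create_profile(store, ("07:00", "07:45"), "unhappy")
print(p["observed_state"], len(store))
```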
[0086] In step 1126, the processor 204 can present feedback in the form of a notification 224 to the first individual 104. The notification can be presented by at least one notification device as described above in relation to step 616 of Fig. 6. The notification 224 can simply inform the first individual 104 of any of an expected activity, an expected emotional state, an observed activity, and an observed emotional state stored in the newly created or updated time-indexed profile 212. Alternatively, or in addition, the notification 224 can provide more detailed feedback to the first individual 104.
[0087] For example, if one or more of the observed activity and observed emotional state do not match a respective one of the expected activity and expected emotional state (if already stored), then the notification 224 can inform the first individual 104 of the discrepancy. In this case, the notification 224 can also suggest reasons why the observed activity and/or emotional state differed from the respective expected activity and emotional state. For example, suppose that the first individual 104 went for a run, and that typically when the first individual 104 goes for a run, the first individual 104 feels good, and consequently, the expected emotional state is happy. Now suppose that the first individual 104 posts on social media that he or she "feels terrible after my run." In this case, the observed emotional state for the run is unhappy. The notification 224 can present differences between the run and a typical run. For example, the notification 224 can inform the first individual 104 that the run was longer than a typical run, that the temperature was colder than on a typical run, that the first individual 104 had low energy before the run, that the first individual 104 did not sleep well before the run, that the first individual 104 ate right before the run, and so on.
[0088] It should be understood that steps of the exemplary methods set forth herein are not necessarily required to be performed in the order described, and the order of the steps of such method should be understood to be merely exemplary. Likewise, additional steps may be included in such methods, and certain steps may be omitted or combined, in methods consistent with various aspects of the disclosure.
[0089] It will be appreciated by those skilled in the art that changes could be made to the embodiments described above without departing from the broad inventive concept thereof. Furthermore, it should be appreciated that the structure, features, and methods as described above with respect to any of the embodiments described herein can be incorporated into any of the other embodiments described herein unless otherwise indicated. It is understood, therefore, that this invention is not limited to the particular embodiments disclosed, but it is intended to cover modifications within the spirit and scope of the present disclosure.
[0090] The invention can be embodied in the form of methods and apparatuses for practicing those methods. The invention can also be embodied in the form of program code embodied in tangible media, such as magnetic recording media, optical recording media, solid state memory, floppy diskettes, CD-ROMs, hard drives, or any other non-transitory machine-readable storage medium, wherein, when the program code is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the invention. When implemented on a general-purpose processor, the program code segments combine with the processor to provide a unique device that operates analogously to specific logic circuits.
[0091] It will be further understood that various changes in the details, materials, and arrangements of the parts which have been described and illustrated in order to explain the nature of this invention may be made by those skilled in the art without departing from the scope of the invention as expressed in the following claims.
[0092] Although the elements in the following method claims, if any, are recited in a particular sequence with corresponding labeling, unless the claim recitations otherwise imply a particular sequence for implementing some or all of those elements, those elements are not necessarily intended to be limited to being implemented in that particular sequence.

Claims

CLAIMS

What is claimed is:
1. A method for analyzing an activity engaged in by an individual and an emotional state of the individual during the activity, the method comprising:
(a) a receiver receiving a sensor signal from a physiological sensor, the sensor signal characterizing a physiological property of the individual during the activity;
(b) retrieving an incomplete physiological profile from memory, the incomplete physiological profile including a characterization of the physiological property of the individual and one of an expected emotional state and an expected activity;
(c) at least one processor identifying a match between the sensor signal and the physiological profile by performing a comparison based on the sensor signal and the
characterization of the physiological property, and predicting one of the emotional state and the activity of the individual from the one of the expected emotional state and the expected activity;
(d) the at least one processor retrieving unsolicited media information about the individual from memory, wherein the unsolicited media information is generated without the at least one processor prompting the individual to characterize either of the emotional state and the activity;
(e) the at least one processor analyzing the unsolicited media information to determine a different one of the emotional state and the activity of the individual from one of an observed emotional state and an observed activity detected from the unsolicited media information; and
(f) storing in memory a complete physiological profile comprising (i) the one of the emotional state and activity predicted in step (c) and (ii) the different one of the emotional state and the activity determined in step (e).
2. The method of claim 1, wherein:
step (a) further comprises the receiver receiving an additional sensor signal from an additional physiological sensor, the additional sensor signal characterizing an additional physiological property of the individual, different from the physiological property;
in step (b), the incomplete physiological profile further includes an additional characterization of the additional physiological property of the individual; and
step (c) further comprises the processor performing an additional comparison based on the additional sensor signal and the additional characterization of the additional physiological property, and predicting the one of the emotional state and the activity of the individual based further on the additional comparison.
3. The method of claim 1, further comprising:
(g) the receiver receiving a subsequent sensor signal from the physiological sensor, the subsequent sensor signal characterizing the physiological property of the individual at a subsequent time period;
(h) retrieving the complete physiological profile from memory, the complete physiological profile including the characterization of the physiological property of the individual and both the stored emotional state and the stored activity; and
(i) the at least one processor identifying a match between the subsequent sensor signal and the complete physiological profile by performing a comparison based on the subsequent sensor signal and the characterization of the physiological property, and predicting the emotional state and the activity of the individual from the complete physiological profile.
4. The method of claim 1, wherein:
step (c) comprises comparing one of raw sensor data and an interpretation of the raw sensor data for the sensor signal to the characterization of the physiological property.
5. The method of claim 1, wherein steps (d) and (e) are performed before steps (a) to (c).
6. The method of claim 1, wherein step (e) further comprises:
(el) the at least one processor analyzing the unsolicited media information to determine a time period for the at least one of the observed emotional state and the observed activity; and
(e2) the at least one processor identifying a match between (i) the time period for the at least one of the observed emotional state and the observed activity and (ii) a time period for the sensor signal.
7. A method for analyzing an activity engaged in by an individual and an emotional state of the individual during the activity, the method comprising:
(a) at least one processor retrieving unsolicited media information about the individual, wherein the unsolicited media information is generated without the at least one processor prompting the individual to characterize either of the emotional state and the activity;
(b) the at least one processor analyzing the unsolicited media information to detect at least one of an observed emotional state and an observed activity, and determine at least one of the emotional state and the activity of the individual from the at least one of the observed emotional state and the observed activity;

(c) a receiver receiving a sensor signal from a physiological sensor, the sensor signal characterizing a physiological property of the individual; and
(d) storing in memory a physiological profile comprising (i) a characterization of the physiological property and (ii) the at least one of the activity and emotional state determined in step (b).
8. The method of claim 7, wherein:
step (b) further comprises the at least one processor analyzing the unsolicited media information to determine a time period for the at least one of the observed emotional state and the observed activity; and
the method further comprises the at least one processor identifying a match between the time period for the at least one of the observed emotional state and the observed activity and a time period for the sensor signal.
9. The method of claim 8, wherein step (d) further comprises storing the time period for the sensor signal in the physiological profile in the memory.
10. The method of claim 7, further comprising:
(e) the receiver receiving a subsequent sensor signal from the physiological sensor, the subsequent sensor signal characterizing the physiological property of the individual at a subsequent time period;
(f) retrieving the physiological profile from memory, the physiological profile including the characterization of the physiological property of the individual and the at least one of the activity and emotional state; and
(g) the at least one processor identifying a match between the subsequent sensor signal and the physiological profile by performing a comparison based on the subsequent sensor signal and the characterization of the physiological property, and predicting the at least one of the activity and emotional state of the individual from the physiological profile.
11. The method of claim 10, further comprising:
(h) providing feedback to the individual based on the at least one of the activity and emotional state predicted in step (g).
12. A method for analyzing an activity engaged in by an individual and an emotional state of the individual during the activity, the method comprising:

(a) a receiver receiving a sensor signal from a physiological sensor, the sensor signal characterizing a physiological property of the individual;
(b) retrieving a physiological profile from memory, the physiological profile including a characterization of the physiological property of the individual and at least one of an expected emotional state and an expected activity;
(c) at least one processor identifying a match between the sensor signal and the physiological profile by performing a comparison based on the sensor signal and the characterization of the physiological property, and predicting at least one of the emotional state and the activity from the at least one of the expected emotional state and the expected activity;
(d) the at least one processor retrieving media information about the individual from memory;
(e) the at least one processor analyzing the media information to detect at least one of an observed emotional state and an observed activity of the individual; and
(f) the at least one processor detecting at least one of a similarity and a difference between (i) the at least one of the expected emotional state and the expected activity and (ii) the at least one of the observed emotional state and the observed activity.
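Claim 12's steps (c)–(f) pair a sensor-derived expectation with a media-derived observation and flag where they agree or diverge. A minimal sketch of step (f), assuming both sides have already been reduced to labeled states (the function and labels are illustrative, not from the application):

```python
def detect_discrepancy(expected: dict, observed: dict) -> dict:
    """Step (f): for each shared key ('activity', 'emotional_state'),
    report whether the sensor-predicted value matches the value
    observed in the individual's media information."""
    report = {}
    for key in expected.keys() & observed.keys():
        report[key] = "similarity" if expected[key] == observed[key] else "difference"
    return report

expected = {"activity": "running", "emotional_state": "stressed"}   # from step (c)
observed = {"activity": "running", "emotional_state": "relaxed"}    # from step (e)
detect_discrepancy(expected, observed)
# -> {'activity': 'similarity', 'emotional_state': 'difference'}
```

A detected difference (here, the sensed stress versus the relaxed appearance in media) is what claim 13's notification step would act on.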
13. The method of claim 12, further comprising:
(g) the at least one processor generating a notification to present to the individual, the notification recommending a course of action to the individual based on the at least one of the similarity and difference detected in step (f).
14. The method of claim 12, wherein:
step (a) further comprises the receiver receiving an additional sensor signal from an additional physiological sensor, the additional sensor signal characterizing an additional physiological property of the individual, different from the physiological property;
in step (b), the physiological profile further includes an additional characterization of the additional physiological property of the individual; and
step (c) further comprises the processor performing an additional comparison based on the additional sensor signal and the additional characterization of the additional physiological property, and predicting the one of the emotional state and the activity of the individual based further on the additional comparison.
15. The method of claim 12, wherein step (e) further comprises:
the at least one processor analyzing the media information to determine a time period for the at least one of the observed emotional state and the observed activity; and
the at least one processor identifying a match between (i) the time period for the at least one of the observed emotional state and the observed activity and (ii) a time period for the sensor signal.
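Claim 15 requires matching the time period of the media-derived observation against the time period of the sensor signal. The claim does not define "match"; one natural reading, sketched below with illustrative names, treats two periods as matching when their intervals overlap:

```python
def periods_match(sensor_start: float, sensor_end: float,
                  media_start: float, media_end: float) -> bool:
    """Assumed interpretation of the claim-15 match: the sensor signal's
    time period and the media observation's time period overlap."""
    return sensor_start <= media_end and media_start <= sensor_end

# Sensor window [600, 630] vs. a media item timestamped at 615:
periods_match(600, 630, 615, 615)  # -> True
periods_match(600, 630, 700, 705)  # -> False (media item falls outside)
```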
16. The method of claim 12, wherein:
step (c) comprises comparing one of raw sensor data and an interpretation of the raw sensor data to the characterization of the physiological property.
17. The method of claim 12, wherein:
step (a) further comprises the receiver receiving an additional sensor signal from an additional physiological sensor, the additional sensor signal characterizing an additional physiological property of the individual, different from the physiological property;
step (b) further comprises retrieving an additional physiological profile from memory, the additional physiological profile including an additional characterization of the additional physiological property of the individual and at least one of an additional expected emotional state and an additional expected activity; and
step (c) further comprises:
(c1) the at least one processor identifying a match between the additional sensor signal and the additional physiological profile by performing a comparison based on the additional sensor signal and the additional characterization of the additional physiological property; and
(c2) the at least one processor predicting the at least one of the emotional state and the activity by finding that the at least one of the expected emotional state and the expected activity matches the at least one of the additional expected emotional state and the additional expected activity.
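Claim 17's steps (c1)–(c2) describe corroboration across two sensors: each sensor is matched to its own profile, and a prediction is accepted only when the two profiles' expected states agree. A sketch of that final agreement check, assuming each per-sensor match has already produced a state label (names are illustrative):

```python
from typing import Optional

def corroborated_prediction(prediction_a: Optional[str],
                            prediction_b: Optional[str]) -> Optional[str]:
    """Step (c2): emit a prediction only when the state predicted from the
    first sensor/profile pair matches the state predicted from the
    additional sensor/profile pair."""
    if prediction_a is not None and prediction_a == prediction_b:
        return prediction_a
    return None

corroborated_prediction("anxious", "anxious")  # -> 'anxious'
corroborated_prediction("anxious", "calm")     # -> None (profiles disagree)
```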
18. A method for analyzing an activity engaged in by an individual and an emotional state of the individual during the activity, the method comprising:
(a) a receiver receiving a sensor signal from a physiological sensor, the sensor signal characterizing a physiological property of the individual at a first time period;
(b) retrieving a physiological profile from memory, the physiological profile including a characterization of the physiological property of the individual and both an expected emotional state and an expected activity; and
(c) at least one processor performing a comparison based on the sensor signal and the characterization of the physiological property, and predicting, based on the comparison, the emotional state and the activity of the individual from the expected emotional state and the expected activity.
19. The method of claim 18, further comprising:
(d) the at least one processor retrieving media information about the individual from memory;
(e) the at least one processor analyzing the media information to detect at least one of an observed emotional state and an observed activity of the individual;
(f) the at least one processor detecting at least one of a similarity and a difference between (i) the expected emotional state and the expected activity and (ii) a corresponding one of the at least one of the observed emotional state and the observed activity; and
(g) the at least one processor generating a notification to present to the individual, the notification recommending a course of action to the individual based on the at least one of the similarity and difference detected in step (f).
20. The method of claim 18, wherein:
step (a) further comprises the receiver receiving an additional sensor signal from an additional physiological sensor, the additional sensor signal characterizing an additional physiological property of the individual, different from the physiological property;
step (b) further comprises retrieving an additional physiological profile from memory, the additional physiological profile including an additional characterization of the additional physiological property of the individual and at least one of an additional expected emotional state and an additional expected activity; and
step (c) further comprises:
(c1) the at least one processor identifying a match between the additional sensor signal and the additional physiological profile by performing a comparison based on the additional sensor signal and the additional characterization of the additional physiological property; and
(c2) the at least one processor predicting the at least one of the emotional state and the activity by finding that the at least one of the additional expected emotional state and the additional expected activity matches a corresponding one of the expected emotional state and the expected activity.
21. The method of claim 18, wherein step (c) comprises comparing one of raw sensor data and an interpretation of the raw sensor data for the sensor signal to the characterization of the physiological property.
PCT/US2015/066573 2014-12-31 2015-12-18 Analyzing emotional state and activity based on unsolicited media information WO2016109246A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201462098440P 2014-12-31 2014-12-31
US62/098,440 2014-12-31

Publications (1)

Publication Number Publication Date
WO2016109246A1 (en) 2016-07-07

Family

ID=55083524

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2015/066573 WO2016109246A1 (en) 2014-12-31 2015-12-18 Analyzing emotional state and activity based on unsolicited media information

Country Status (1)

Country Link
WO (1) WO2016109246A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120130196A1 (en) * 2010-11-24 2012-05-24 Fujitsu Limited Mood Sensor
WO2013088307A1 (en) * 2011-12-16 2013-06-20 Koninklijke Philips Electronics N.V. History log of user's activities and associated emotional states
US20140149177A1 (en) * 2012-11-23 2014-05-29 Ari M. Frank Responding to uncertainty of a user regarding an experience by presenting a prior experience


Similar Documents

Publication Publication Date Title
US20220110563A1 (en) Dynamic interaction system and method
US9901780B2 (en) Adjusting exercise machine settings based on current work conditions
US20180096738A1 (en) Method for providing health therapeutic interventions to a user
US20210098110A1 (en) Digital Health Wellbeing
JP6671679B2 (en) Health management device and method for performing user health management
US20180144101A1 (en) Identifying diagnosis-relevant health information
US20180056130A1 (en) Providing insights based on health-related information
US9269119B2 (en) Devices and methods for health tracking and providing information for improving health
US20150032670A1 (en) Avatar Having Optimizing Artificial Intelligence for Identifying and Providing Relationship and Wellbeing Recommendations
US20140363797A1 (en) Method for providing wellness-related directives to a user
US20210015415A1 (en) Methods and systems for monitoring user well-being
US20160063874A1 (en) Emotionally intelligent systems
US10559387B2 (en) Sleep monitoring from implicitly collected computer interactions
US20210391083A1 (en) Method for providing health therapeutic interventions to a user
US20140142967A1 (en) Method and system for assessing user engagement during wellness program interaction
WO2016142360A1 (en) Wearable device obtaining audio data for diagnosis
US20150215412A1 (en) Social network service queuing using salience
US9521973B1 (en) Method of monitoring patients with mental and/or behavioral disorders using their personal mobile devices
WO2015091893A1 (en) System and method for topic-related detection of the emotional state of a person
US10732722B1 (en) Detecting emotions from micro-expressive free-form movements
WO2019132772A1 (en) Method and system for monitoring emotions
KR20190103104A (en) User terminal device and system for performing user customized health management, and methods thereof
Roberts et al. Help! Someone is beeping...
US20220036481A1 (en) System and method to integrate emotion data into social network platform and share the emotion data over social network platform
WO2016109246A1 (en) Analyzing emotional state and activity based on unsolicited media information

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15823284

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15823284

Country of ref document: EP

Kind code of ref document: A1