WO2016151494A1 - Environment-based pain prediction wearable - Google Patents

Environment-based pain prediction wearable

Info

Publication number
WO2016151494A1
Authority
WO
Grant status
Application
Prior art keywords
activity
pain
user
indication
instructions
Application number
PCT/IB2016/051624
Other languages
French (fr)
Inventor
John E. Cronin
Michael G. D'ANDREA
Jonathan T. GOGUEN
Original Assignee
Koninklijke Philips N.V.

Classifications

    • G06F19/3481: Computer-assisted prescription or delivery of treatment by physical action, e.g. surgery or physical exercise
    • A61B5/1118: Determining activity level
    • A61B5/4824: Touch or pain perception evaluation
    • A61B5/7275: Determining trends in physiological measurement data; predicting development of a medical condition based on physiological measurements, e.g. determining a risk factor
    • G06F19/00: Digital computing or data processing equipment or methods, specially adapted for specific applications
    • G06F19/3418: Telemedicine, e.g. remote diagnosis, remote control of instruments or remote monitoring of patient carried devices
    • G06Q50/22: Social work
    • G16H50/20: ICT specially adapted for medical diagnosis, medical simulation or medical data mining, for computer-aided diagnosis, e.g. based on medical expert systems
    • G16H50/50: ICT specially adapted for medical diagnosis, medical simulation or medical data mining, for simulation or modelling of medical disorders
    • A61B2560/0242: Operational features adapted to measure environmental factors, e.g. temperature, pollution
    • A61B2562/0204: Acoustic sensors
    • A61B2562/0219: Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • A61B2562/0271: Thermal or temperature sensors
    • A61B5/02: Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; combined pulse/heart-rate/blood pressure determination; evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; heart catheters for measuring blood pressure
    • A61B5/08: Detecting, measuring or recording devices for evaluating the respiratory organs
    • A61B5/1123: Discriminating type of movement, e.g. walking or running
    • A61B5/486: Bio-feedback
    • A61B5/7267: Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device

Abstract

Various embodiments described herein relate to a wearable device and associated system of devices for predicting future pain in a wearer based on information such as environmental data and activities performed by the wearer. According to some embodiments, a predictive model is trained based on records that correlate performance of an activity under certain environmental (e.g., weather) conditions with pain observed at a later time. Thereafter, environment forecasts may be used with the model to predict future pain levels and provide recommendations of alternative activities to reduce the pain level experienced at a later time.

Description

ENVIRONMENT-BASED PAIN PREDICTION WEARABLE

CROSS-REFERENCE TO RELATED APPLICATIONS

[0001] The present application claims the priority benefit of U.S. provisional application number 62/137,186 filed March 23, 2015 and entitled "Environment-Based Pain Prediction Wearable," the entire disclosure of which is hereby incorporated herein by reference for all purposes.

TECHNICAL FIELD

[0002] Various embodiments described herein relate to wearable devices and more particularly, but not exclusively, to wearables for predicting future pain experience of a wearer.

BACKGROUND

[0003] Wearable technology may include any type of mobile electronic device that can be worn on the body, or attached to or embedded in clothes and accessories of an individual, and currently exists in the consumer and medical marketplaces. Processors and sensors associated with the wearable technology can display, process, gather, or elicit information. Such wearable technology may be used in a variety of areas, including monitoring health data of the user as well as other types of data and statistics. These types of devices may be readily available to the public and may be easily purchased by consumers. Examples of some wearable technology in the health arena include FIT BIT, NIKE+ FUEL BAND, and APPLE WATCH devices.

SUMMARY

[0004] Many people suffer from chronic pain (e.g., joint pain) that can be exacerbated by the weather, physical activity, and various other factors. It is sometimes difficult, however, to take all such factors into account when planning the day's activities, resulting in a level of pain that could have been avoided if different choices had been made (e.g., staying indoors, walking instead of running, etc.). Accordingly, it would be beneficial to provide a system that could provide predictions of future pain based on, among other features, the current or forecasted weather. Such a prediction may aid the user in choosing activities or making other choices that may result in lower pain in the future. It would also be desirable if such a system could provide recommendations of choices that are likely to result in relatively reduced pain.

[0005] Various embodiments described herein relate to a method performed by a processor for predicting pain in a user of a wearable device, the method including:

receiving a first environmental parameter descriptive of an environment of the user, wherein the first environmental parameter is based on data gathered by a sensor of the wearable device; receiving a first activity indication that identifies an activity performed by the user; receiving an indication of a pain level of the user; correlating the first environmental parameter, the first activity indication, and the indication of the pain level as a training record; training a pain prediction model using the training record; receiving a second environmental parameter; receiving a second activity indication; applying the pain prediction model to the second environmental parameter and the second activity indication to generate a predicted pain level; and presenting the predicted pain level to the user.
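By way of a non-limiting illustration only, the following Python sketch walks through the method summarized above: a set of training records correlating an environmental parameter, an activity indication, and a pain indication is used to fit a simple least-squares pain prediction model, which is then applied to a second environmental parameter and activity indication. The function names, feature encoding, and numeric values are assumptions introduced for the sketch and are not part of the application.

    import numpy as np

    def make_record(env_param, activity_code, pain_level):
        # Correlate one environmental parameter, one activity indication, and a
        # later-reported pain level into a single training record.
        return {"environment": env_param, "activity": activity_code, "pain": pain_level}

    def train_pain_model(records):
        # Fit pain ~ b0 + b1*environment + b2*activity by ordinary least squares.
        X = np.array([[1.0, r["environment"], r["activity"]] for r in records])
        y = np.array([r["pain"] for r in records])
        coefficients, *_ = np.linalg.lstsq(X, y, rcond=None)
        return coefficients

    def predict_pain(coefficients, env_param, activity_code):
        return float(np.dot(coefficients, [1.0, env_param, activity_code]))

    # Training phase: records pair barometric pressure (hPa) and a numeric
    # activity code with the pain level reported afterwards.
    records = [make_record(1012.0, 2, 3.0), make_record(998.0, 2, 7.0),
               make_record(1005.0, 1, 4.0), make_record(990.0, 1, 6.0)]
    model = train_pain_model(records)

    # Prediction phase: a forecast pressure and a planned activity yield a
    # predicted pain level that can be presented to the user.
    print(round(predict_pain(model, 995.0, 2), 1))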

[0006] Various embodiments described herein relate to a non-transitory machine-readable medium encoded with instructions for execution by a processor, the medium including: instructions for receiving a first environmental parameter descriptive of an environment of the user, wherein the first environmental parameter is based on data gathered by a sensor of the wearable device; instructions for receiving a first activity indication that identifies an activity performed by the user; instructions for receiving an indication of a pain level of the user; instructions for correlating the first environmental parameter, the first activity indication, and the indication of the pain level as a training record; instructions for training a pain prediction model using the training record; instructions for receiving a second environmental parameter; instructions for receiving a second activity indication; instructions for applying the pain prediction model to the second environmental parameter and the second activity indication to generate a predicted pain level; and instructions for presenting the predicted pain level to the user.

[0007] Various embodiments described herein relate to a wearable device for predicting pain in a user of the wearable device, the wearable device including: an environmental sensor configured to sense environment data; an accelerometer configured to sense motion data; a memory; and a processor configured to: determine a first environmental parameter descriptive of an environment of the user based on the environment data; determine a first activity indication that identifies an activity performed by the user based on the motion data; receive an indication of a pain level of the user; correlate the first environmental parameter, the first activity indication, and the indication of the pain level as a training record; train a pain prediction model using the training record; receive a second environmental parameter, including at least one of: determining the second environmental parameter based on additional environment data from the environmental sensor, and receiving the second environmental parameter as a forecasted parameter from a remote server; determine a second activity indication; apply the pain prediction model to the second environmental parameter and the second activity indication to generate a predicted pain level; and present the predicted pain level to the user.

[0008] The method, device, and non-transitory machine-readable medium described above provide better prediction of the pain level. Because the prediction includes the environment as one of the factors that affect the user's health directly, many false positives are avoided. For instance, for the same heart rate and activity, such as running, the user may experience a different level of pain at home, where the temperature is controlled, than outside in freezing temperatures. Thus, consideration of environmental factors, such as temperature, in conjunction with the activity performed further improves the prediction and helps the user better prepare for the day.

[0009] Various embodiments are described wherein the second environmental parameter is a predicted environmental parameter received from a forecasting server.

[0010] Various embodiments are described wherein: the first environmental parameter is descriptive of the environment of the user at a first time, and the indication of the pain level is descriptive of the pain level of the user at a second time that is later than the first time, whereby the correlating step selects the indication for correlation with the first environmental parameter based on a time difference between the first time and the second time.

[0011] Various embodiments additionally include identifying a recommended activity based on application of the pain prediction model to the second environmental parameter and the recommended activity, the application indicating an alternative pain level that is lower than the predicted pain level; and presenting the recommended activity to the user.

[0012] Various embodiments are described wherein the second activity indication is a predicted activity generated by application of an activity prediction model.

[0013] Various embodiments additionally include receiving accelerometer data from an accelerometer of the wearable device; and applying an activity model to the accelerometer data to generate the first activity indication.

[0014] Various embodiments are described wherein receiving an indication of a pain level of the user includes: receiving a physiological parameter descriptive of the user, wherein the physiological parameter is based on additional data gathered by an additional sensor of the wearable device; and applying a pain estimation model to the physiological parameter to generate the indication of the pain level.

BRIEF DESCRIPTION OF THE DRAWINGS

[0015] In order to better understand various example embodiments, reference is made to the accompanying drawings, wherein:

[0016] FIG. 1 illustrates an example of a system for providing environment-based pain prediction;

[0017] FIG. 2 illustrates an example of a method for training a pain prediction model;

[0018] FIG. 3 illustrates an example of a method for predicting future pain of a user;

[0019] FIG. 4 illustrates an example of an environment for performing pain prediction;

[0020] FIG. 5 illustrates an example of a hardware device for performing pain prediction;

[0021] FIG. 6 illustrates an example of a training set for training a pain prediction model;

[0022] FIG. 7 illustrates an example method for creating an unlabeled training record for training a pain prediction model;

[0023] FIG. 8 illustrates an example of a method for labeling a training record for training a pain prediction model;

[0024] FIG. 9 illustrates an example of a training set for training an activity prediction model;

[0025] FIG. 10 illustrates an example method for creating an unlabeled training record for training an activity prediction model;

[0026] FIG. 11 illustrates an example of a method for labeling a training record for training an activity prediction model; and

[0027] FIG. 12 illustrates an example of a method for training a model.

DETAILED DESCRIPTION

[0028] The description and drawings presented herein illustrate various principles. It will be appreciated that those skilled in the art will be able to devise various arrangements that, although not explicitly described or shown herein, embody these principles and are included within the scope of this disclosure. As used herein, the term "or" refers to a non-exclusive or (i.e., and/or), unless otherwise indicated (e.g., "or else" or "or in the alternative"). Additionally, the various embodiments described herein are not necessarily mutually exclusive and may be combined to produce additional embodiments that incorporate the principles described herein.

[0029] FIG. 1 illustrates an example of a system 100 for providing environment-based pain prediction. The system 100 shown illustrates various functional components and some interactions therebetween. It will be appreciated that such functional components will be implemented using physical hardware and, in some embodiments, software executed by the hardware. Thus, each functional device or engine may be embodied in a dedicated hardware device. Additionally, in some embodiments, two or more of the functional devices of the system 100 may be embodied in a single hardware device. For example, in some embodiments, the sensor devices, environment input device, and activity input device may be embodied in a single wearable device.

[0030] While the system 100 shows some functional devices as being single devices and others as including multiple similar devices, it will be understood that alternative arrangements are possible. For example, an alternative system may use only a single sensor device 105 but may include multiple redundant pain predictors 150 (e.g., with a load balancer, not shown, to distribute requests, user data, or other actionable information evenly therebetween to enable the pain predictor 150 to serve a large number of client devices).

[0031] The sensor devices 105 may be devices including virtually any sensor capable of sensing data about a user, the user's environment, the user's context, the state of various electronics associated with the user, etc. In some embodiments, the sensor devices 105 may sense physiological parameters about the user. For example, the sensor devices 105 may include accelerometers, conductance sensors, optical sensors, temperature sensors, microphones, cameras, etc. These or other sensors may be useful for sensing, computing, estimating, or otherwise acquiring physiological parameters descriptive of the wearer such as, for example, steps taken, walking/running distance, standing hours, heart rate, respiratory rate, blood pressure, stress level, body temperature, calories burned, resting energy expenditure, active energy expenditure, height, weight, sleep metrics, etc. Various environmental sensors may include, for example, temperature, humidity, or barometric pressure sensors.

[0032] While many useful parameters may be obtained directly from the sensor devices 105, other useful parameters are obtained by "extracting" them from other available data (including the sensor data). For example, raw accelerometer data may be processed to extract a number of steps taken, an estimate of calories burned, or an estimate of an activity being performed by the user (e.g., running, playing tennis, biking, etc.). Accordingly, various devices in the system may implement parameter extraction algorithms for processing available parameters (e.g., sensed parameters and other extracted parameters) to generate new parameters. In some embodiments, the sensor devices 105 may implement such algorithms for parameter extraction in the same device as at least some of the sensors that obtain parameters upon which the algorithms depend, or for parameter extraction that is local to the user though not necessarily in the same device as the predicate sensors. For example, in some embodiments, a wearable device may report accelerometer data to a user's mobile device, which then applies an algorithm to estimate a number of steps taken using the accelerometer data. Additionally or alternatively, virtually any device may apply such parameter extraction algorithms.

[0033] A pain input device 110 may be any device capable of creating an indication of pain experienced by a user such as, for example, a wearable device, mobile phone, tablet, computer, server, or virtual machine deployed in a cloud computing environment. For example, in some embodiments, the pain input device 110 may include an interface on a device that enables the user to manually input an indication of their pain level (e.g., on a scale of 1 to 10). In some embodiments, the pain input device 110 may apply a pain estimation model 115 to various parameters of the user (e.g., physiological parameters such as heart rate, blood pressure, and respiratory rate provided by one or more sensor devices 105) to estimate a pain level currently being experienced. The pain estimation model 115 may be created in a manner similar to those approaches described herein with respect to the pain prediction model such as, for example, application of a machine learning approach (e.g., linear regression, neural networks, etc.). In some embodiments, the pain estimation model 115 may be trained based on data records of physiological parameters labeled according to manually input pain levels.
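As a concrete illustration of the parameter extraction described in paragraph [0032], the following Python sketch derives a step count from raw three-axis accelerometer samples by counting upward crossings of an acceleration-magnitude threshold. The threshold value and data layout are assumptions made for the sketch rather than values taken from the application.

    import numpy as np

    def count_steps(samples, threshold=1.2):
        # samples: sequence of (x, y, z) accelerations in g; a candidate step is
        # registered each time the magnitude rises above the threshold.
        magnitudes = np.linalg.norm(np.asarray(samples, dtype=float), axis=1)
        above = magnitudes > threshold
        return int(np.sum(above[1:] & ~above[:-1]))

    samples = [(0.0, 0.0, 1.0), (0.1, 0.2, 1.4), (0.0, 0.1, 1.0), (0.2, 0.1, 1.5)]
    print(count_steps(samples))  # 2 threshold crossings, i.e., 2 candidate steps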

[0034] An activity input device 120 may be any device capable of creating an activity indication identifying an activity being performed or likely to be performed by a user. Such a device may include, for example, a wearable device, mobile phone, tablet, computer, server, or virtual machine. For example, in some embodiments, the activity input device 120 may include an interface on a device that enables the user to manually input an indication of an activity being performed presently or to be performed in the future. In some embodiments, the activity input device 120 may access other data owned by the user such as, for example, calendar data or email data, to infer a current or future activity.

[0035] In some embodiments, the activity input device 120 includes a current activity model 123 to be applied to parameters of the user (e.g., accelerometer data received from the sensor devices 105) to identify a current activity (e.g., walking, running, cycling, playing basketball, swimming, etc.). The current activity model 123 may be created in a manner similar to those approaches described herein with respect to the pain prediction model such as, for example, application of a machine learning approach (e.g., linear regression, neural networks, etc.). In some embodiments, the current activity model 123 may be trained based on data records of physiological parameters labeled according to manually input activity indications.

[0036] In some embodiments, the activity input device 120 may alternatively or additionally include an activity prediction model 126 for application to parameters of the user (e.g. accelerometer data, activities identified by the current activity model 123, other physiological parameters, time of day, etc.) to predict one or more activities that the user is likely to engage in in the future (e.g., in the next 10 minutes, in the next 4 hours, etc.). For example, where a user has been observed to tie their shoes at 5am immediately before running, the activity prediction model 126 may predict that the user will likely go running when the current activity model 123 indicates that the user is tying their shoes and the current time is 5am. The activity prediction model 126 may be created in a manner similar to those approaches described herein with respect to the pain prediction model such as, for example, application of a machine learning approach (e.g., linear regression, neural networks, etc.). In some embodiments, the activity prediction model 126 may be trained based on data records of available parameters labeled according to manually input activity indications. Alternatively, the records may be labeled according to pain levels identified by the pain estimation model 115. An example of an approach to training the activity prediction model 126 will be described in greater detail below with respect to FIGS. 9-11.

[0037] An environment input device 130 may be any device capable of creating one or more environmental parameters descriptive of a current or forecast environmental state (e.g., weather conditions). Such a device may include, for example, a wearable device, mobile phone, tablet, computer, server, or virtual machine. In some embodiments, a sensor device 105 may provide usable environmental parameters as raw sensed data (e.g., instantaneous temperature readings); as such, the sensor device 105 may also be considered to constitute an environment input device 130. In some embodiments, the environment input device may include an interface on a device that enables the user to manually input an indication of current or future environmental conditions.

[0038] In some embodiments, the environment input device 130 includes a current environment model 133 to be applied to parameters of the environment (e.g., raw sensor data) to identify a current environment parameter (e.g., temperature, pressure, humidity, or a generalized label describing the state of the environment such as 'hot and humid', etc.). The current environment model 133 may be created in a manner similar to those approaches described herein with respect to the pain prediction model such as, for example, application of a machine learning approach (e.g., linear regression, neural networks, etc.). In some embodiments, the current environment model 133 may be trained based on data records of environmental parameters or sensor data labeled according to manually input environment indications.

[0039] In some embodiments, the environment input device 130 may include an environment forecaster 136 or other component for providing indications of predicted environmental conditions (e.g., a weather forecast). For example, the environment input device 130 may be a weather service server. Thus, as with other components of the system 100, the environment input device 130 may constitute multiple devices such as, for example, a wearable device implementing the current environment model 133 and a remote server implementing the environment forecaster 136.
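For instance, an environment forecaster 136 hosted on a remote server might be queried over HTTP roughly as in the following Python sketch; the endpoint URL, query parameters, and response fields are hypothetical placeholders and do not refer to any actual weather-service API.

    import requests

    def fetch_forecast(latitude, longitude, hours_ahead=6):
        # Placeholder endpoint; a real deployment would use the weather
        # service's documented API instead.
        response = requests.get(
            "https://weather.example.com/api/forecast",
            params={"lat": latitude, "lon": longitude, "hours": hours_ahead},
            timeout=10,
        )
        response.raise_for_status()
        data = response.json()
        # Keep only the fields used as environmental parameters by the system.
        return {"temperature_c": data["temperature_c"],
                "pressure_hpa": data["pressure_hpa"],
                "humidity_pct": data["humidity_pct"]}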

[0040] The training database constructor 140 may be a device (e.g., a wearable device, mobile phone, tablet, computer, server, or virtual machine) that constantly, periodically, on request, or at other times receives data from the pain input device 110, activity input device 120, environment input device 130, directly from the sensor devices 105, or from other devices. Using this information, the training database constructor may construct multiple training records for use by the model trainer 145 to train a pain prediction model. For example, each set of activity indications and environmental parameters that occur at (or, in some embodiments, around) the same time may be correlated with a pain indication such as a received pain indication that occurred at a later time (e.g., 1 or 6 hours in the future) or an aggregation of multiple received pain indications (e.g., an average, maximum, or 75th percentile pain level in the six hours following the activity and environmental parameter capture). As will be appreciated, by virtue of this time shifting within the training records (e.g., by correlating activity and environmental information with pain observed at a later time), techniques such as regression may more easily capture the effects of these features on later pain experience, rather than present pain experience. In other embodiments, features (in this case, activity and environmental data) may not be stored in the same record as the label (in this case, pain levels); instead, such information may be correlated by additional values such as, for example, time stamps.
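The time-shifted correlation described above might be sketched as follows in Python, where features captured at one time are labeled with the maximum pain reported in a following window; the six-hour window and the field names are assumptions made for illustration.

    from datetime import datetime, timedelta

    def build_training_records(feature_snapshots, pain_reports,
                               window=timedelta(hours=6)):
        # feature_snapshots: list of (timestamp, {feature: value}) tuples.
        # pain_reports: list of (timestamp, pain_level) tuples.
        records = []
        for t, features in feature_snapshots:
            later = [p for (tp, p) in pain_reports if t < tp <= t + window]
            if later:  # only keep snapshots for which a label was observed
                records.append({**features, "pain": max(later)})
        return records

    snapshots = [(datetime(2015, 3, 23, 8), {"activity": "running",
                                             "pressure_hpa": 998.0})]
    reports = [(datetime(2015, 3, 23, 11), 7), (datetime(2015, 3, 23, 13), 5)]
    print(build_training_records(snapshots, reports))  # labeled with pain 7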

[0041] Upon deployment of the system 100 for a new user, the training database constructor may begin with a baseline training set (or the pain prediction model may already be trained according to such a baseline training set) obtained in a lab setting or from a large test group of participants. As the new user participates in the system, the training database constructor 140 may gradually personalize the training set to the particular user by capturing new data. In some embodiments, the training database constructor may remove the baseline, non-personalized training records from the database gradually or, once the personalized database is sufficiently large, all at once. In some embodiments, the training database constructor 140 may also account for changing health conditions of a user. For example, if a training database includes training records from when a user was injured and experiencing higher levels of pain, these records may not be relevant when the user recovers from the injury. To account for this pain dynamic over time, the training database constructor 140 may also clean up older entries over time. For example, each time the training database constructor 140 adds a new record it may also delete the oldest record from the database. Alternatively, the training database constructor 140 may delete any record older than a defined age (e.g., 3 months).

[0042] The model trainer 145 may be a device (e.g., a wearable device, mobile phone, tablet, computer, server, or virtual machine) that periodically or upon instruction (e.g., by the training database constructor 140) retrains the pain prediction model using the training database constructed by the training database constructor 140. Various training approaches such as, for example, gradient descent may be appropriate depending on the type of model used for implementing the pain prediction model.
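A minimal sketch of the age-based cleanup described above for the training database constructor 140 follows; the 90-day limit stands in for the "defined age" (e.g., 3 months) mentioned in the description, and the record layout is an assumption.

    from datetime import datetime, timedelta

    MAX_RECORD_AGE = timedelta(days=90)  # stand-in for the defined age

    def add_record(database, record, now=None):
        # database: list of dicts, each carrying a 'timestamp' field.
        now = now or datetime.now()
        database.append(record)
        # Drop stale entries so the model tracks the user's current condition.
        database[:] = [r for r in database
                       if now - r["timestamp"] <= MAX_RECORD_AGE]
        return database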

[0043] The pain predictor 150 may be a device (e.g., a wearable device, mobile phone, tablet, computer, server, or virtual machine) that applies the trained pain prediction model to environmental parameters received from the environment input device 130 and activity indications received from the activity input device 120. For example, upon receiving an indication that the weather forecast calls for low pressure and that the user plans to go running in the near future, the pain predictor 150 may apply the pain prediction model to predict that at some point, the user will experience a pain level of 8. It will be apparent that in some alternative embodiments, the pain prediction model may take other features into account, may not utilize activity indications as features, or may not use environmental parameters as features.

[0044] The warning generator 155 may be a device (e.g., a wearable device, mobile phone, tablet, computer, server, or virtual machine) that determines, based on the output of the pain predictor 150, whether to warn the user of the predicted pain. For example, the warning generator may compare the predicted pain level to a threshold pain level input by the user via a user interface of the configuration device 160 (which also may constitute e.g., a wearable device, mobile phone, tablet, computer, or other device accessible to the user) to determine whether an alert should be displayed via the output device 165. For example, the warning may be an audible sound, an image, or text indicating a predicted pain level. Additionally, the warning may display the activity or environmental parameters that influenced the prediction.

[0045] The output device 165 may be any device for delivering information to the user such as, for example, a wearable device, mobile phone, tablet, computer, or other device accessible to the user. Such information may be communicated in virtually any form such as visually (e.g., text, image, or video), audibly (e.g., sound, voice, or music), or haptically (e.g., vibration or tapping). The output device 165 may receive instructions to output such information from other devices (e.g., the warning generator 155 or activity recommendation engine 170) via virtually any channel such as, for example, (where these devices are physically separate) application-specific TCP/IP communications between the devices, email, SMS, websites, etc.

[0046] The activity recommendation engine 170 may be a device (e.g., a wearable device, mobile phone, tablet, computer, server, or virtual machine) that identifies possible alternative activities for recommendation to the user via the output device 165. For example, the activity recommendation engine 170 may select one or more candidate activities (e.g., from a group of activities in which the user has previously been observed to engage or from a group of activities that are similar to the current or planned activity) and query the pain predictor 150 for a predicted pain level given current or forecast environmental parameters and the candidate activity in place of the current or predicted activity. The activity recommendation engine 170 may then use the predicted pain levels to potentially identify one or more activities to recommend to the user via the output device 165 as alternatives leading to a lower pain level. For example, the activity recommendation engine 170 may identify any activities with a predicted pain level that is lower than the pain level that triggered a warning in the warning generator 155, or may identify the activity that leads to the lowest predicted pain level.
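The candidate-scoring loop of the activity recommendation engine 170 might look roughly like the following Python sketch, in which predict_pain stands in for the trained pain prediction model; the candidate activities and pain values are illustrative assumptions.

    def recommend_activities(predict_pain, env_params, planned_activity, candidates):
        # Score each candidate under the same environmental parameters and keep
        # those predicted to be less painful than the planned activity.
        baseline = predict_pain(env_params, planned_activity)
        scored = [(a, predict_pain(env_params, a))
                  for a in candidates if a != planned_activity]
        better = [(a, p) for (a, p) in scored if p < baseline]
        return sorted(better, key=lambda pair: pair[1])  # lowest pain first

    # Example with a stand-in model: walking and swimming are predicted to be
    # less painful than the planned run in the forecast low-pressure weather.
    fake_model = lambda env, act: {"running": 8.0, "walking": 4.0, "swimming": 5.5}[act]
    print(recommend_activities(fake_model, {"pressure_hpa": 995.0},
                               "running", ["walking", "swimming"]))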

[0047] FIG. 2 illustrates an example of a method 200 for training a pain prediction model. The method 200 may be performed by the various components of the system such as, for example, the pain input device 110, activity input device 120, environment input device 130, training database constructor 140, and model trainer 145. Thus, the method 200 may be performed entirely by one physical device or may be distributed between the operations of two or more physical devices. For the sake of simplicity, the operation of the method 200 will be described in the context of being performed by a single device.

[0048] The method begins in step 205 and proceeds to step 210 where the device receives one or more environmental parameters such as, for example, ambient temperature, pressure, and humidity. The device may receive such environmental parameters from a physically separate sensor device or environment input device, or may receive such parameters from a sensor device or environment input device that is physically integrated into the device.

[0049] In step 220, the device receives an activity indication that indicates an activity that is being performed or will be performed by the user. For example, the activity indication may be manually input by the user or may be inferred from email or schedule data. In some embodiments, the device may receive accelerometer data 223 (e.g., data from an accelerometer worn around the wrist, ankle, waist, etc. of the user) and apply an activity model 226 to the accelerometer data (e.g., by extracting one or more features from the accelerometer data such as periodicity or magnitude to be used as input into the model) to identify one or more activities in which the user is likely to be engaged. For example, the model may be a logistic regression model that has been trained on population data or user-specific data using gradient descent. Various alternative training approaches for discerning activities from accelerometer data will be apparent.
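The feature extraction mentioned for the activity model 226 (e.g., periodicity and magnitude) could be sketched as follows; the 50 Hz sample rate and the particular features chosen are assumptions made for illustration.

    import numpy as np

    def accelerometer_features(window, sample_rate_hz=50.0):
        # window: 1-D array of acceleration magnitudes over a fixed interval
        # (several samples long).
        window = np.asarray(window, dtype=float)
        centered = window - window.mean()
        spectrum = np.abs(np.fft.rfft(centered))
        freqs = np.fft.rfftfreq(len(window), d=1.0 / sample_rate_hz)
        dominant_hz = freqs[int(np.argmax(spectrum[1:]) + 1)]  # periodicity
        return {"mean_magnitude": float(window.mean()),
                "magnitude_std": float(window.std()),
                "dominant_frequency_hz": float(dominant_hz)}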

[0050] Next, in step 230, the device receives an indication of pain experienced by the user such as, for example, a pain level that has been manually identified by the user. In some embodiments, the device may be capable of automatically determining a pain level experienced by a user. For example, the device may receive physiological parameters 233 (e.g., heart rate, heart rate variance, respiration rate, galvanic skin resistance, short-term changes in complexion, etc.) from one or more sensor devices and apply a trained pain estimation model 236 to these parameters to create an estimated pain level. For example, the model may be a linear regression model that has been trained on population data or user-specific data using gradient descent. Various alternative training approaches for estimating pain levels from physiological parameters will be apparent.
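A minimal sketch of applying such a pain estimation model 236 to physiological parameters 233 is shown below; the weights, bias, and the 1-to-10 clamp are illustrative placeholders rather than trained values.

    def estimate_pain(heart_rate_bpm, respiration_rate_bpm, skin_conductance_us,
                      weights=(0.04, 0.10, 0.20), bias=-4.0):
        # Weighted sum of physiological parameters, clamped to a 1-to-10 scale.
        raw = (bias
               + weights[0] * heart_rate_bpm
               + weights[1] * respiration_rate_bpm
               + weights[2] * skin_conductance_us)
        return max(1.0, min(10.0, raw))

    print(estimate_pain(heart_rate_bpm=95, respiration_rate_bpm=18,
                        skin_conductance_us=12))  # 4.0 on the 1-to-10 scale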

[0051] It will be apparent that steps 210, 220, 230 may be performed in any order and, in some embodiments, an appreciable amount of time may elapse between performance of these steps 210, 220, 230. For example, step 230 may be performed seconds, minutes, or even hours after performance of steps 210, 220 (e.g., because pain resulting from the environment and activity performed may be correspondingly delayed).

[0052] Having gathered data in steps 210-230, the device then creates one or more training records from this data in step 240 to be added to the training database. In some embodiments, this may include a step 245 of correlating the environmental parameters and activity indication from a first time (e.g., measured or otherwise received within a few seconds or minutes of each other) with an indication of a pain level from a second time that is later than the first time (e.g., minutes or hours later). Once the training database has been updated, the device proceeds to train (or retrain) the pain prediction model (e.g., applying gradient descent using the training database) in step 250. Thereafter, the pain prediction model may be used (e.g., by method 300) to predict future pain of the user. The method then proceeds to end in step 255.
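The retraining in step 250 mentions gradient descent; a bare-bones Python sketch of gradient descent for a linear pain prediction model follows. The learning rate, iteration count, and toy data are assumptions made for illustration.

    import numpy as np

    def train_by_gradient_descent(X, y, learning_rate=0.1, iterations=20000):
        # X: (n_records, n_features) matrix with a leading bias column.
        # y: (n_records,) observed pain levels. Returns fitted coefficients.
        X, y = np.asarray(X, dtype=float), np.asarray(y, dtype=float)
        theta = np.zeros(X.shape[1])
        n = len(y)
        for _ in range(iterations):
            gradient = X.T @ (X @ theta - y) / n  # gradient of half mean squared error
            theta -= learning_rate * gradient
        return theta

    X = [[1.0, 0.2], [1.0, 0.8], [1.0, 0.5]]  # bias column plus one scaled feature
    y = [2.0, 7.0, 4.5]
    print(train_by_gradient_descent(X, y))  # approaches the least-squares fit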

[0053] FIG. 3 illustrates an example of a method 300 for predicting future pain of a user. The method 300 may be performed by the various components of the system such as, for example, the pain input device 110, activity input device 120, environment input device 130, pain predictor 150, warning generator 155, and activity recommendation engine 170. Thus, the method 300 may be performed entirely by one physical device or may be distributed between the operations of two or more physical devices. For the sake of simplicity, the operation of the method 300 will be described in the context of being performed by a single device.

[0054] The method 300 begins in step 305 and proceeds to step 310, where the device receives one or more environmental parameters. In some embodiments, these environmental parameters are the same type of environmental parameters obtained in step 210 of method 200 and may be obtained from the same or a different source. For example, while step 210 may obtain a currently-measured barometric pressure from a wearable device of the user, step 310 may obtain a predicted barometric pressure for the future (e.g., hours in the future) from a weather service server.

[0055] In step 320, the device receives one or more activity indications indicating an activity in which the user is currently or predicted to be engaged. In some embodiments, these activity indications may belong to the same taxonomy used for the activity indication received in step 220 of method 200. This activity indication may be determined based on a current activity model (e.g., in a manner similar to that described with respect to step 226 of method 200), manually entered, extracted from calendar or email data, or predicted by an activity prediction model 325. For example, an activity prediction model may be trained (e.g., using a logistic regression model) based on user habits and accelerometer data to predict one or more activities in which the user is likely to engage in the future (e.g., in the coming hours).

[0056] In step 330, the device applies the trained pain prediction model (e.g., as trained in step 250 of method 200) to the environmental parameters and activity indication to predict a future pain level and presents this pain level to the user (e.g., by transmitting an instruction to an output device or, where the output device is in the device executing method 300, directly outputting the indication) in step 360. In some embodiments, step 360 may only be executed if it is determined that the predicted pain level exceeds a threshold (not shown).

[0057] In embodiments that also make activity predictions, the device may, in step 340, select one or more alternative activities to evaluate. For example, the device may select activities in which the user has expressed interest or in which the user has previously been observed to engage. As another example, the device may select activities that are similar (e.g., similar duration or similar energy expenditure) to the activity indicated in step 320. Next, in step 350, the device applies the pain prediction model to the environmental parameters received in step 310 and the activities identified in step 340. If any of the resultant pain predictions are less than the pain predicted in step 330 (or less than an alarm threshold) as determined in step 370, the device will present these alternatives as recommendations in step 380 (again, either by instructing a separate output device or outputting the information directly). In some embodiments, steps 360, 380 may present these respective types of information as part of a single interface or other communication. The method 300 then proceeds to end in step 390.

[0058] FIG. 4 illustrates an example of an environment 400 for performing pain prediction. The environment 400 may include an example implementation of the example system 100 described above. For example, the wearable device 420 may correspond to the sensor devices 105, an environment input device 130, and an output device 165; the mobile device 430 may correspond to the pain input device 110, configuration device 160, and output device 165; the environment server 440 may correspond to an environment input device 130; the activity prediction server 450 may correspond to the activity input device 120; and the pain prediction server 460 may correspond to the training database constructor 140, model trainer 145, pain predictor 150, warning generator 155, and activity recommendation engine 170. Various alternative sets of devices and functional correspondences with the system 100 will be apparent. For example, in some embodiments, all elements of the system may be implemented in a single wearable device 420 or divided between a wearable device 420 and a mobile device 430.

[0059] While various software elements (e.g., algorithms and databases) are depicted as being part of various devices generally, it will be understood that such software elements will be embodied in physical hardware devices and that any functionality ascribed to those software elements will actually be performed by such hardware. While not illustrated, it will be apparent that the various devices 420, 430, 440, 450, 460 include hardware, such as one or more processors each, to carry out the functionalities described herein. As used herein, the term "processor" will be understood to encompass various hardware devices such as, for example, microprocessors, field-programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), and other hardware devices capable of performing the various functions described herein. Further, the devices 420, 430, 440, 450, 460 may include memory devices such as L1/L2/L3 cache, system memory, or storage devices. As used herein, the term "non-transitory machine-readable storage medium" will be understood to refer to both volatile memory (e.g., SRAM and DRAM) and non-volatile memory (e.g., flash, magnetic, and optical memory) devices, but to exclude mere transitory signals. While various embodiments may be described herein with respect to software or instructions "performing" various functions, it will be understood that such functions will actually be performed by hardware devices such as a processor executing the software or instructions in question. In some embodiments, such as embodiments utilizing one or more ASICs, various functions described herein may be hardwired into the hardware operation; in such embodiments, the software or instructions corresponding to such functionality may be omitted.

[0060] These devices 420-460 may communicate via a data network 410 such as, for example, a LAN, carrier network, data center network, or the Internet. In some embodiments, devices may communicate directly with each other via a wired or wireless connection (e.g., the mobile device 430 and wearable device 420 may communicate via NFC or Bluetooth).

[0061] The wearable device 420 may be any device that is worn on the body of a user. For example, the wearable device 420 may take the form of a wristwatch, belt, pendant necklace, adhesive patch, garment, etc. In this example embodiment, the wearable device 420 includes both physiological sensors and environment sensors, such as those examples described above. The wearable device 420 also includes an alert interface 426 for outputting alerts (e.g., as instructed by the pain prediction server 460) to the user. For example, the alert interface 426 may include a display, speaker, or haptic engine. In some embodiments, upon receiving an alert that a pain prediction has surpassed a threshold, the alert interface of a wristwatch wearable 420 may vibrate and play a notification sound, signifying the receipt of an alert and that the user should check the mobile device 430 for further information.

[0062] The mobile device 430 may be, for example, a mobile phone or tablet carried by the user and configured to operate in conjunction with the wearable device 420 to perform various functions such as, for example, serving as a fuller user interface to the overall system 400 when the wearable device 420 includes no (or limited) user interface elements. As shown, the mobile device 430 includes a configuration interface 432 for allowing the user to configure the system for use. For example, the configuration interface 432 (which may be a graphical user interface) may enable the user to input a threshold pain level beyond which pain predictions will generate alarms. The configuration interface 432 may also enable the user to identify activities in which the user is interested in engaging for use by the warning and recommendation algorithm 466. These and other configurations may be stored in the user configurations storage 467 of the pain prediction server 460.

[0063] The pain input interface 434 may be an interface (e.g., a GUI) for allowing the user to enter a manual indication of a pain level (e.g., on a scale of 1 to 100) for use by the system. In some embodiments, the user may access the pain input interface 434 when desired (e.g., when experiencing pain), or the pain input interface 434 may pop up or otherwise appear to request input at system-determined times such as periodically (e.g., hourly) or in response to a determination that a significant event has occurred (e.g., 4 hours after beginning an activity or after a significant weather change as identified by monitoring reported environmental parameters). The manually-entered pain level may be used to label training records for training the pain prediction model 461.

[0064] The recommendation interface 436 may be an interface (e.g., a GUI) for delivering activity recommendations to the user. Such activity recommendations may be delivered by the warning & recommendation algorithm 466 to the mobile device 430, for example, periodically, in response to a predicted pain level triggering an alarm, or upon manual request by the user via the recommendation interface. Upon receiving recommendations, the recommendation interface 436 may, in some embodiments, display a list of recommended activities along with the predicted pain levels respectively associated therewith.

[0065] The environment server 440 may be a server (or other device, such as a virtual machine) that provides environmental parameters to other devices in the system. For example, in some embodiments, the environment server 440 may be operated by a weather service and may provide forecast (i.e., predicted) weather parameters such as temperature and barometric pressure for various points in the future. As such, the environment server may implement an application programmer's interface (API) for allowing other devices within the system 100 to retrieve or otherwise receive such parameters.

[0066] The pain prediction server 460 includes a pain prediction model 461 for application by the pain predictor 465 to predict a future level of pain that will be experienced by a user based on features obtained from the other devices in the system 400, such as activity indications and environment parameters. In various embodiments, such as those employing a regression model, the pain prediction model 461 may store multiple coefficients and other values for instantiating a mathematical formula to generate such predictions, as will be described in greater detail below with respect to FIG. 12. A training database constructor 464 obtains similar features and pain measures from the devices in the system 400 to create and, in some embodiments, curate a training database 463 which is, in turn, used by the learning algorithm 462 to train the pain prediction model 461 (e.g., in regression embodiments, to generate the coefficients to be used in the mathematical formula). A warning & recommendation algorithm 466 (which in some embodiments may be embodied as multiple separate algorithms) determines whether a predicted pain level should generate an alarm (e.g., as defined by a threshold in the user configurations 467) and whether any alternative activities can be recommended that lead to a lower predicted pain level.

[0067] As noted above, in some embodiments, a future activity may be predicted for a user, which may serve as a feature input into the pain prediction model. By using a predicted activity rather than a current activity, the future pain level may be predicted before the user begins engaging in the activity, such that the user may be given time to change their plans to avoid pain. As shown in the illustrated activity prediction server 450 (which in some embodiments may actually be housed in the same physical hardware and as part of the same server as the pain prediction server 460), one or more activity prediction models 451 are trained by a learning algorithm 453 using an assembled training database 455. This process may be, in some respects, similar to that described above with respect to the pain prediction model.
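By way of illustration only, in a regression embodiment the coefficients stored by the pain prediction model 461 may instantiate a formula of roughly the following form, where the symbols are introduced here for explanation and do not appear in the application:

    \hat{p} = \theta_0 + \theta_1 x_1 + \theta_2 x_2 + \cdots + \theta_n x_n

Here \hat{p} is the predicted pain level, each x_j is a feature such as a forecast temperature, a barometric pressure, or an encoding of the planned activity, and the coefficients \theta_j are the values generated by the learning algorithm 462 from the training database 463.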

[0068] In some embodiments, such as those using logistic regression, the activity prediction models 451 may include a separate model for each individual activity in which the user might engage. For example, the models 451 may include a walking prediction model, running prediction model, basketball prediction model, etc. In applying each such model, the activity predictor 459 may receive multiple features as inputs (e.g., features extracted from raw accelerometer data, identifications of current activities, date and time information, weather and other environment information, email and calendar data such as extracted keywords, etc.), process these values in conjunction with the trained model, and output a probability (e.g., on a scale of 0 to 1) of whether the user is likely to engage in the specific activity in the near future (e.g., in the next few minutes or the next few hours). Where the activity prediction model 451 uses a current activity as an input (e.g., tying shoes may indicate the user will go running soon), the activity prediction server may also include a current activity identifier 457 which may apply similarly trained models (not shown) to features (e.g., features extracted from raw accelerometer data) to identify likely activities in which the user is currently engaged.
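A sketch of such per-activity logistic models is given below; each activity has its own coefficient vector (bias first) and each model outputs a probability between 0 and 1. The coefficients and feature scaling are placeholders, not trained values.

    import math

    ACTIVITY_MODELS = {           # assumed per-activity coefficients, bias first
        "running": [-6.0, 1.5, 0.8],
        "walking": [-1.0, 0.2, 0.1],
    }

    def activity_probabilities(features):
        # features: numeric inputs, e.g. scaled hour of day and step cadence.
        probabilities = {}
        for activity, coeffs in ACTIVITY_MODELS.items():
            z = coeffs[0] + sum(c * x for c, x in zip(coeffs[1:], features))
            probabilities[activity] = 1.0 / (1.0 + math.exp(-z))  # logistic output
        return probabilities

    print(activity_probabilities([2.0, 3.0]))  # one probability per activity model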

[0069] FIG. 5 illustrates an example of a hardware device 500 for performing pain prediction, such as the various functional devices of the system 100 of FIG. 1. As shown, the device 500 includes a processor 520, cache/system memory 530, user interface 540, communication interface 550, and storage 560 and, for some devices, sensors 545, interconnected via one or more system buses 510. It will be understood that FIG. 5 constitutes, in some respects, an abstraction and that the actual organization of the components of the device 500 may be more complex than illustrated.

[0070] The processor 520 may be any hardware device capable of executing instructions stored in memory 530 or storage 560 or otherwise processing data. As such, the processor may include a microprocessor, field programmable gate array (FPGA), application-specific integrated circuit (ASIC), or other similar devices. In some embodiments, such as those relying on one or more ASICs, the functionality described as being provided in part via software may instead be hardwired into the operation of the ASICs and, as such, the associated software may be omitted.

[0071] The cache/system memory 530 may include various memories such as, for example, L1, L2, or L3 cache or system memory. As such, the memory 530 may include static random access memory (SRAM), dynamic RAM (DRAM), flash memory, read only memory (ROM), or other similar memory devices.

[0072] The user interface 540 may include one or more devices for enabling communication with a user such as an administrator. For example, the user interface 540 may include a display, a mouse, a keyboard, a touchscreen, buttons, camera, microphone, vibrator, haptic engine, etc. In some embodiments, the user interface 540 may include a command line interface or graphical user interface that may be presented to a remote terminal via the communication interface 550.

[0073] The sensors 545 may be virtually any device capable of sensing parameters from the user, the environment, etc. as described herein such as, for example, accelerometers, gyroscopes, conductance sensors, optical sensors, temperature sensors, microphones, cameras, etc. In some embodiments, the sensors 545 may share hardware in common with the user interface 540.

[0074] The communication interface 550 may include one or more devices for enabling communication with other hardware devices. For example, the communication interface 550 may include a network interface card (NIC) configured to communicate according to the Ethernet protocol. Additionally, the communication interface 550 may implement a TCP/IP stack for communication according to the TCP/IP protocols. In some embodiments, the communication interface 550 may include an NFC, Bluetooth, or other short-range wireless interface. Various alternative or additional hardware or configurations for the communication interface 550 will be apparent.

[0075] The storage 560 may include one or more machine-readable storage media such as read-only memory (ROM), random-access memory (RAM), magnetic disk storage media, optical storage media, flash-memory devices, or similar storage media. In various embodiments, the storage 560 may store instructions for execution by the processor 520 or data upon which the processor 520 may operate. For example, the storage 560 may store an operating system 561 for controlling various basic operations of the hardware 500. As shown, the storage may include various additional sets of instructions for performing the various functions described above; while the storage is shown as including all such instructions (e.g., in a single-device embodiment of the full system 100 or alternatives thereto), it will be apparent that various instructions may be omitted in other embodiments wherein the device 500 implements fewer than all of the devices of the system 100 of FIG. 1.

[0076] Environment determination instructions 562 may interpret sensor data to determine one or more environmental parameters currently descriptive of the environment, while the environment forecast instructions 563 may predict one or more future values for the same or otherwise comparable environmental parameters. The pain prediction instructions 564 may, based on inputs such as activity types and environmental parameters, predict a future pain level to be experienced by the user. For example, the pain prediction instructions 564 may implement the training database constructor 140, model trainer 145, and pain predictor of FIG. 1. The warning generation instructions 565 may determine whether to issue an alert based on a predicted pain level (e.g., by comparing the prediction to one or more configured thresholds 570), while the activity recommendation instructions 566 may invoke the pain prediction instructions 564 with respect to alternative activities to identify lower-pain alternatives to recommend to the user, e.g., when a warning is generated or when the user requests such recommendations. Activity identification instructions 567 may determine (e.g., based on user accelerometer data) one or more activities in which the user is currently engaged. As such, the activity identification instructions 567 may include instructions for constructing an appropriate training database, training one or more models (e.g., logistic regression models), or applying such models to received data. In a similar manner, the activity prediction instructions 568 may receive input features and predict future activities in which the user intends to engage. As such, the activity prediction instructions 568 may correspond to the learning algorithm 453, activity prediction models 451, and activity predictor 459 of FIG. 4. Pain estimation instructions 569 may receive input features such as physiological parameters and estimate a pain level (e.g., on a scale of 1 to 10) currently experienced by a user. As such, the pain estimation instructions 569 may include instructions for constructing a pain estimation training database, for training a model (e.g., a linear regression model), or for applying such a model to received features to output a pain level indication.
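
The interplay between the warning generation instructions 565 and the activity recommendation instructions 566 might look roughly like the following sketch, in which the pain predictor, threshold value, and activity taxonomy are all stand-in assumptions:

# Minimal sketch (assumptions throughout): warning generation and activity
# recommendation built on top of a pain predictor. The predict_pain function,
# threshold value, and activity taxonomy below are illustrative, not taken
# from the specification.

def predict_pain(activity, environment):
    # Stand-in for the trained pain prediction model.
    base = {"running": 6.5, "walking": 3.0, "swimming": 2.5}.get(activity, 4.0)
    return base + 0.05 * (1013.0 - environment["pressure_hpa"])

def warn_and_recommend(predicted_activity, environment, threshold=5.0,
                       taxonomy=("running", "walking", "swimming")):
    predicted = predict_pain(predicted_activity, environment)
    if predicted < threshold:
        return None  # no warning needed
    # Try the other activities in the taxonomy and keep the lower-pain options.
    alternatives = sorted(
        (predict_pain(a, environment), a) for a in taxonomy if a != predicted_activity)
    better = [(a, round(p, 1)) for p, a in alternatives if p < predicted]
    return {"warning": round(predicted, 1), "recommendations": better}

print(warn_and_recommend("running", {"pressure_hpa": 1003.0}))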

[0077] It will be apparent that various information described as stored in the storage 560 may be additionally or alternatively stored in the memory 530. In this respect, the memory 530 may also be considered to constitute a "storage device" and the storage 560 may be considered a "memory." Various other arrangements will be apparent.

Further, the memory 530 and storage 560 may both be considered to be "non-transitory machine-readable media." As used herein, the term "non-transitory" will be understood to exclude transitory signals but to include all forms of storage, including both volatile and non-volatile memories.

[0078] While the device 500 is shown as including one of each described component, the various components may be duplicated in various embodiments. For example, the processor 520 may include multiple microprocessors that are configured to independently execute the methods described herein or are configured to perform steps or subroutines of the methods described herein such that the multiple processors cooperate to achieve the functionality described herein. Further, where the device 500 is implemented in a cloud computing system, the various hardware components may belong to separate physical systems. For example, the processor 520 may include a first processor in a first server and a second processor in a second server.

[0079] FIG. 6 illustrates an example of a training set 600 for training a pain prediction model. In various embodiments, the training set may be an example of the contents of the training database 463 of the pain prediction server 460 or a similar training database for training a pain prediction model. As shown, each record includes a timestamp field 610 for indicating a time at which one or more of the associated values was captured (e.g., the environment or activity parameters). A pressure field 620 and a temperature field 630 store a set of environmental parameters: a barometric pressure and an ambient temperature, respectively. These values may reflect actual measured parameters (or, in some embodiments, previously forecast parameters) at or near the time of the timestamp. An activity field 640 stores an indication of one or more activities in which the user was engaged at or around the time of the timestamp. The value(s) in the activity field 640 may, in some embodiments, belong to an activity taxonomy that provides a finite list of activities that may be identified by the system. Finally, a pain index field 650 stores a label such as an identification of a pain level at a point in the future with respect to the timestamp (e.g., 6 hours in the future), a maximum pain level in a time window after the timestamp (e.g., 4 to 8 hours after), or a composite of multiple reported pain levels. It will be apparent that the training set may include additional or alternative fields, depending on the model to be trained. According to some embodiments, a field may be provided for each feature the pain prediction model accepts as an input.
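
Purely as an illustration of the record layout described above (field names, units, and values are assumptions), one such training record might be represented as:

# Minimal sketch (illustrative field names and values): one record of the
# pain prediction training set 600, represented as a Python dict keyed by
# the fields described above.
from datetime import datetime

training_record = {
    "timestamp": datetime(2016, 3, 23, 9, 30),   # timestamp field 610
    "pressure_hpa": 1002.0,                      # pressure field 620
    "temperature_c": 8.0,                        # temperature field 630
    "activities": ["walking"],                   # activity field (taxonomy value)
    "pain_index": None,                          # pain index label 650, filled in later
}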

[0080] FIG. 7 illustrates an example method 700 for creating an unlabeled training record for training a pain prediction model. For example, the method 700 may be performed by the training database constructor 140 or pain prediction instructions 564 to create an unlabeled training set for later labeling and use to train a pain prediction model. The method 700 may be performed periodically (e.g., on a schedule) or in response to receiving new features such as environmental parameters or activity indications.

[0081] The method 700 begins in step 705 and proceeds to step 710 where the training database constructor creates a new timestamped record. Next, in step 715, the training database constructor gathers all available environment parameters currently descriptive of the user's environment (e.g., by polling a wearable device or extracting them from a message already received from a wearable device). The training database constructor then determines, in step 720, whether the user is engaged in an activity by, for example, applying an activity determination model to accelerometer data and determining whether the output indicates a sufficient degree of confidence (e.g., a value over 0.5) that the user is engaged in one of the activities existing in the activity taxonomy. If so, the activity is added to the new record as well in step 725. Finally, the training database constructor stores the unlabeled training record with the rest of the training database (or, in some embodiments, with other unlabeled records prior to being merged into the larger training database) in step 730, and the method proceeds to end in step 735.
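
A minimal sketch of method 700, assuming environment parameters and activity identifications are available through simple callbacks (the helper names and the 0.5 cutoff are illustrative):

# Minimal sketch of method 700 (function names, the wearable-polling helpers,
# and the 0.5 confidence cutoff mirror the example values above but are
# otherwise assumptions, not the patented implementation).
from datetime import datetime

def create_unlabeled_record(poll_environment, identify_activity, database,
                            confidence_cutoff=0.5):
    record = {"timestamp": datetime.now()}            # step 710
    record.update(poll_environment())                 # step 715: gather env. params
    activity, confidence = identify_activity()        # step 720: apply activity model
    if activity is not None and confidence > confidence_cutoff:
        record["activities"] = [activity]             # step 725
    database.append(record)                           # step 730: store unlabeled record
    return record

# Hypothetical callbacks standing in for the wearable device interface.
db = []
create_unlabeled_record(lambda: {"pressure_hpa": 1002.0, "temperature_c": 8.0},
                        lambda: ("walking", 0.8), db)
print(db)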

[0082] FIG. 8 illustrates an example of a method 800 for labeling a training record for training a pain prediction model. For example, the method 800 may be performed by the training database constructor 140 or pain prediction instructions 564 to label a training set for use in training a pain prediction model. The method 800 may be performed periodically (e.g., on a schedule) or in response to receiving a report of a pain level (e.g., a pain level that was manually entered or determined by application of a pain estimation model). As described, the method 800 creates a composite by averaging pain levels over a time window following an activity and environment instance; alternative methods for alternative pain level labels will be apparent.

[0083] The method 800 begins in step 805 (e.g., upon receiving a pain level report) and proceeds to step 810 where the training database constructor stores the timestamped pain report (e.g., a simple correlation of the reported pain level with a timestamp of the receipt or pain level determination) with other previously received pain reports. In step 815, the training database constructor obtains a training record (e.g., an unlabeled training record created by application of the method 700 or a labeled training record most recently modified by a previous application of the method 800) having a timestamp falling within a leading window of the current pain report. For example, the training database constructor may define a window between 4 and 8 hours prior to the pain report and locate a training record having a timestamp falling within that window. Next, in step 820, the training database constructor obtains all stored pain reports that follow the current training record (or that fall within a separate window, e.g., those newer than the current training record by at least 4 hours) and, in step 825, computes an average pain level using the obtained pain reports. In step 830, the average pain level is added to the training record as the pain index label (in some cases, overwriting the previous label). The training database constructor then determines whether this is the last training record in the window. If not and additional training records remain to be processed, the method 800 loops back to step 815 to process the next training record. Otherwise, the method proceeds to end in step 840. Having labeled (or relabeled) some records of the training set, the model trainer may then proceed to retrain the pain prediction model using the new information.
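
A minimal sketch of the labeling pass of method 800, assuming the 4-to-8-hour leading window from the example above and simple dictionary-shaped records and reports (both assumptions):

# Minimal sketch of the labeling pass in method 800 (window bounds, record and
# report shapes are assumptions; only the average-over-a-window idea comes from
# the description above).
from datetime import datetime, timedelta

def label_records(records, pain_reports, report_time,
                  window=(timedelta(hours=4), timedelta(hours=8))):
    earliest, latest = report_time - window[1], report_time - window[0]
    for record in records:                                    # step 815 loop
        if not (earliest <= record["timestamp"] <= latest):
            continue
        later = [r["pain"] for r in pain_reports              # step 820
                 if r["timestamp"] >= record["timestamp"] + window[0]]
        if later:
            record["pain_index"] = sum(later) / len(later)    # steps 825-830
    return records

now = datetime(2016, 3, 23, 18, 0)
records = [{"timestamp": now - timedelta(hours=6), "pain_index": None}]
reports = [{"timestamp": now, "pain": 4.0},
           {"timestamp": now - timedelta(hours=1), "pain": 6.0}]
print(label_records(records, reports, now))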

[0084] FIG. 9 illustrates an example of a training set 900 for training an activity prediction model. In various embodiments, the training set may be an example of the contents of the training database 455 of the activity prediction server 450 or a similar training database for training an activity prediction model. As shown, the training set includes a timestamp field for indicating a time at which one or more of the associated values was captured. A physiological data field 920 (which may actually constitute multiple subfields) stores physiological data for use as features to be used as inputs into an activity prediction model. As shown, two features are included: raw accelerometer data and heart rate. It will be appreciated that various additional or alternative physiological data or other features may be included such as, for example, features extracted from the raw accelerometer data, blood pressure, heart rate variance, respiratory rate, time of day, day of week, or keywords extracted from the user's calendar or email. An identified future activities field 930 stores one or more labels such as activities that were subsequently identified as being performed (e.g., by application of an activity determination model or manual input). In various embodiments, the activities may be identified according to the same taxonomy described above.

[0085] FIG. 10 illustrates an example method 1000 for creating an unlabeled training record for training an activity prediction model. For example, the method 1000 may be performed by the activity prediction instructions 568 to create an unlabeled training set for later labeling and use to train an activity prediction model. The method 1000 may be performed periodically (e.g., on a schedule) or in response to receiving new features such as physiological parameters or activity indications.

[0086] The method begins in step 1010 and proceeds to step 1020 where the device creates a new timestamped record. Next, in step 1030, the device gathers any current physiological parameters (or other features to be used as input to the activity prediction model) (e.g., by polling a wearable device or extracting them from a message already received from a wearable device) and adds them to the timestamped record, which is then stored for future use in step 1040. The method 1000 then proceeds to end in step 1050.

[0087] FIG. 11 illustrates an example of a method 1100 for labeling a training record for training an activity prediction model. For example, the method 1100 may be performed by the activity prediction instructions 568 to label a training set for use in training an activity prediction model. The method 1100 may be performed periodically (e.g., on a schedule) or in response to receiving an activity identification (e.g., an activity indication that was manually entered or determined by application of an activity determination model). As described, the method 1100 creates a list of activities to serve as potentially multiple labels for each record; alternative methods for alternative activity labels will be apparent.

[0088] The method 1100 begins in step 1110 (e.g., upon receiving an activity identification) and proceeds to step 1120 where the device obtains a training record (which may be an unlabeled record created by method 1000 or a labeled record most recently processed by a previous execution of the method 1100) that falls within a window defined based on the current time or the time of the activity indication (e.g., 10 minutes before the current time). In step 1130, the device adds the activity to the training record as a new label. The device then determines, in step 1140, whether this is the last training record in the window. If not and additional training records remain to be processed, the method 1100 loops back to step 1120 to process the next training record. Otherwise, the method proceeds to end in step 1150. Having labeled (or relabeled) some records of the training set, the device may then proceed to retrain the activity prediction model(s) using the new information.
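
A minimal sketch of the labeling pass of method 1100, assuming the 10-minute window from the example above and dictionary-shaped training records (both assumptions):

# Minimal sketch of the activity-label pass of method 1100 (the 10-minute
# window and record shapes follow the example above; everything else is an
# assumption rather than the patented implementation).
from datetime import datetime, timedelta

def label_activity_records(records, activity, indication_time,
                           lookback=timedelta(minutes=10)):
    for record in records:                                    # step 1120 loop
        if indication_time - lookback <= record["timestamp"] <= indication_time:
            record.setdefault("future_activities", []).append(activity)  # step 1130
    return records

now = datetime(2016, 3, 23, 7, 5)
records = [{"timestamp": now - timedelta(minutes=4)},
           {"timestamp": now - timedelta(minutes=30)}]
print(label_activity_records(records, "running", now))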

[0089] FIG. 12 illustrates an example of a method 1200 for training a model. The method 1200 may correspond to various instructions stored and executed by a device for training the various prediction models described herein (e.g., pain prediction model, pain estimation model, activity prediction model, activity determination model, etc.). Various alternative approaches to model training will be apparent such as, for example, programmer-defined algorithms, neural networks, Bayesian networks, etc.

[0090] The method begins in step 1202 and proceeds to step 1204 where the device obtains a labeled data set for a given parameter for which a model is to be created. Various alternative approaches for training a model from an unlabeled training set will be apparent. In various embodiments, the training set may include a number of records of training examples that specify one or more features (e.g., user demographics, available sensor devices, retrieved parameters, etc.) and the appropriate conclusion to be drawn from that feature set. In various embodiments, the training set may be created from real-world data gathering activities such as data gathered during various users' interactions with the system. For example, each time a user views, accepts, or positively rates an offering (or service associated therewith), that user's current feature set may be captured as a training example in association with a label indicating that the offering is relevant to the user (such as a 1 or a normalized rating provided by the user). Conversely, each time an offering is displayed to a user but the user does not accept it, manually sets an ignore indication for the offering, or negatively rates it, that user's current feature set may be captured as a training example in association with a label indicating that the offering is not relevant to the user (such as a 0 or a normalized rating provided by the user). Various additional methods for constructing training sets for various predictive models for predicting relevance of an offering (or class of offerings) will be apparent.

[0091] In step 1206, the device identifies the number of features in the data set and, in step 1208, initializes a set of coefficients to be used in the resulting model. According to various embodiments, a coefficient is created for each feature along with one additional coefficient to serve as a constant. Where the model is being trained to output a numerical value, a linear regression approach may be utilized, wherein the final model function may take the form of

$h(X) = \theta_0 + \theta_1 x_1 + \theta_2 x_2 + \cdots$

where X is the set of features {x_1, x_2, ...} and the coefficients {θ_0, θ_1, θ_2, ...} are to be tuned by the method 1200 to provide as output an appropriate estimation, consistent with the trends learned from the training data set. In some embodiments, the final model function may incorporate a sigmoid function as follows:

$h(X) = \frac{1}{1 + e^{-(\theta_0 + \theta_1 x_1 + \theta_2 x_2 + \cdots)}}$

where tuning of the coefficients results in the function h(X) outputting a value between 0 and 1 that serves as a probability estimate. According to various embodiments, the coefficients are all initialized to values of zero. It will be apparent that in some embodiments, additional features for inclusion in h(X) (and associated coefficients) may be constructed from the features in the training set such as, for example, products or powers of the existing features.
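
A minimal sketch of the two hypothesis forms above (the helper names and the example coefficient and feature values are assumptions):

# Minimal sketch (assumed helper names) of the two hypothesis forms: a linear
# form for numeric outputs and a sigmoid-wrapped form for 0-to-1 probabilities.
import math

def h_linear(theta, x):
    """theta[0] + theta[1]*x[0] + theta[2]*x[1] + ..."""
    return theta[0] + sum(t * xi for t, xi in zip(theta[1:], x))

def h_logistic(theta, x):
    """Sigmoid of the linear form, yielding a value between 0 and 1."""
    return 1.0 / (1.0 + math.exp(-h_linear(theta, x)))

theta = [0.5, 1.2, -0.7]     # illustrative coefficients: theta_0, theta_1, theta_2
x = [2.0, 1.0]               # illustrative feature vector: x_1, x_2
print(h_linear(theta, x), round(h_logistic(theta, x), 3))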

[0092] The method 1200 begins to train the coefficients by initializing two loop variables, i and p, to 0 in steps 1210, 1212 respectively. Then, in step 1214, the device obtains a partial derivative of the cost function, J(θ), on the current coefficient, θ_p, where the cost function may be defined in some embodiments as

$J(\theta) = \frac{1}{2} \sum_{j=1}^{m} \left( h_\theta(x^{(j)}) - y^{(j)} \right)^2$

where m is the number of training examples in the training data set, h_θ(x) is the trained function using the current coefficient set θ, x^(j) is the set of features for the jth training example, and y^(j) is the desired output (i.e., the label) for the jth training example. Thus, following a batch gradient descent approach, the partial derivative on coefficient p (θ_p) may be

$\sum_{j=1}^{m} \left( y^{(j)} - h_\theta(x^{(j)}) \right) x_p^{(j)}$

where x_p^(j) is the pth feature in the jth training example (or, when p = 0, x_p^(j) = 1).
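
A minimal sketch of the cost function and of the per-coefficient summation used in the batch update (names and example data are assumptions):

# Minimal sketch (assumed names) of the cost J(theta) and the per-coefficient
# term used in the batch update, matching the expressions above.
def cost(theta, X, y, h):
    """J(theta) = 1/2 * sum over training examples of (h(x) - y)^2."""
    return 0.5 * sum((h(theta, x_j) - y_j) ** 2 for x_j, y_j in zip(X, y))

def gradient_term(theta, X, y, h, p):
    """sum over j of (y^(j) - h(x^(j))) * x_p^(j), with x_0^(j) = 1."""
    return sum((y_j - h(theta, x_j)) * (1.0 if p == 0 else x_j[p - 1])
               for x_j, y_j in zip(X, y))

def h_linear(theta, x):
    return theta[0] + sum(t * xi for t, xi in zip(theta[1:], x))

X = [[1.0, 2.0], [2.0, 0.0]]   # illustrative feature vectors x^(1), x^(2)
y = [3.0, 2.0]                 # illustrative labels
theta = [0.0, 0.0, 0.0]
print(cost(theta, X, y, h_linear), gradient_term(theta, X, y, h_linear, 1))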

[0093] In step 1216, the device increments p and, in step 1218, the device determines whether all coefficients have been addressed in the current loop by determining whether p now exceeds the total number of features to be included in h(X). If not, the method loops back around to step 1214 to find the next partial derivative term.

[0094] After all partial derivatives are found for the current iteration, the method 1200 proceeds to reset the loop variable p to zero in step 1220. Then, in step 1222, the device updates the pth coefficient, θ_p, based on the corresponding partial derivative found in step 1214 and based on a preset learning rate. For example, the device may apply the following update rule:

$\theta_p := \theta_p + \alpha \sum_{j=1}^{m} \left( y^{(j)} - h_\theta(x^{(j)}) \right) x_p^{(j)}$

where α is a learning rate such as, for example, 0.1, 0.3, 1, or any other value appropriately selected for the desired rate of change on each iteration.

[0095] In step 1224, the device increments p and, in step 1226, the device determines whether all coefficients have been addressed in the current loop by determining whether p now exceeds the total number of features to be included in h(X). If not, the method loops back around to step 1222 to update the next coefficient. Note that according to the method 1200, all partial derivatives are found in a first loop prior to actually modifying the coefficients in a second loop so that the partial derivatives are not taken based on the partially updated values. Other embodiments may not implement such a "simultaneous" update of the coefficients.
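
A minimal sketch of the two-pass, simultaneous-update iteration described in paragraphs [0092] through [0095], including the iteration cap and convergence check discussed in the following paragraph (the learning rate, tolerance, and example data are assumptions):

# Minimal sketch (assumptions throughout) of the training loop of method 1200:
# all partial-derivative terms are computed first, and only then are the
# coefficients updated, so the update is "simultaneous" as described above.
def h_linear(theta, x):
    return theta[0] + sum(t * xi for t, xi in zip(theta[1:], x))

def train(X, y, alpha=0.05, max_iters=1000, tolerance=1e-6):
    theta = [0.0] * (len(X[0]) + 1)         # one coefficient per feature plus a constant
    previous_cost = float("inf")
    for _ in range(max_iters):              # bounded by a maximum iteration count
        # First loop (steps 1214-1218): gather every partial-derivative term.
        terms = [sum((y_j - h_linear(theta, x_j)) * (1.0 if p == 0 else x_j[p - 1])
                     for x_j, y_j in zip(X, y))
                 for p in range(len(theta))]
        # Second loop (steps 1222-1226): apply all updates at once.
        theta = [t + alpha * g for t, g in zip(theta, terms)]
        current_cost = 0.5 * sum((h_linear(theta, x_j) - y_j) ** 2
                                 for x_j, y_j in zip(X, y))
        if abs(previous_cost - current_cost) < tolerance:   # convergence check
            break
        previous_cost = current_cost
    return theta

print(train([[1.0], [2.0], [3.0]], [2.0, 4.0, 6.0]))   # should approach [0, 2]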

[0096] After all coefficients are updated, the method proceeds to step 1228 where the variable i is incremented. In step 1230, the device determines whether i now exceeds a predefined maximum number of iterations to ensure that the method 1200 does not loop indefinitely. A sufficiently high maximum number of iterations may be chosen such as 1000, 5000, 100000, etc. If the maximum number of iterations has not been reached, the method 1200 proceeds to step 1232 where the device computes the current cost, using the cost function J(θ), based on the training set. In step 1234, the device determines whether the function h(X) has converged to an acceptable solution by determining whether the change in the cost from the last iteration to the present iteration fails to meet a minimum threshold. If the change surpasses the threshold, the method loops back to step 1212 to perform another coefficient update loop. If, on the other hand, the maximum number of iterations has been reached or the cost change is below the minimum threshold, the method 1200 proceeds to step 1236, where the device stores the coefficients as part of the new model for the parameter and the method 1200 proceeds to end in step 1238.

[0097] It will be apparent that, in addition to following approaches other than regression, other embodiments may tune the coefficients of a regression approach using methods other than batch gradient descent. For example, some embodiments may use stochastic gradient descent, wherein each coefficient update is performed based on a single training example (thereby removing the summation from the partial derivative), and the method additionally iterates through each such example. In other embodiments, the normal equations for regression may be used to find appropriate coefficients, using a matrix-based, non-iterative approach where the set of coefficients is computed as

$\theta = (X^T X)^{-1} X^T y$

where X is a matrix of features from all training examples and y is the associated vector of labels.
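
A minimal sketch of the normal-equation alternative, assuming the use of numpy for the matrix algebra (the example data is illustrative):

# Minimal sketch (assumed use of numpy) of the closed-form normal-equation
# solution theta = (X^T X)^(-1) X^T y, with a leading column of ones for the
# constant coefficient theta_0.
import numpy as np

X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])   # first column is the constant term
y = np.array([2.0, 4.0, 6.0])

theta = np.linalg.inv(X.T @ X) @ X.T @ y
print(theta)   # approximately [0, 2] for this illustrative data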

[0098] It should be apparent from the foregoing description that various example embodiments of the invention may be implemented in hardware or firmware. Furthermore, various exemplary embodiments may be implemented as instructions stored on a machine-readable storage medium, which may be read and executed by at least one processor to perform the operations described in detail herein. A machine-readable storage medium may include any mechanism for storing information in a form readable by a machine, such as a personal or laptop computer, a server, or other computing device. Thus, a machine-readable storage medium may include read-only memory (ROM), random-access memory (RAM), magnetic disk storage media, optical storage media, flash-memory devices, and similar storage media.

[0099] It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative circuitry embodying the principles of the invention. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes which may be substantially represented in machine readable media and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.

[00100] Although the various exemplary embodiments have been described in detail with particular reference to certain exemplary aspects thereof, it should be understood that the invention is capable of other embodiments and its details are capable of modifications in various obvious respects. As is readily apparent to those skilled in the art, variations and modifications can be effected while remaining within the spirit and scope of the invention. Accordingly, the foregoing disclosure, description, and figures are for illustrative purposes only and do not in any way limit the invention, which is defined only by the claims.

Claims

CLAIMS

What is claimed is:
1. A method performed by a processor for predicting pain in a user of a wearable device, the method comprising:
receiving a first environmental parameter descriptive of an environment of the user, wherein the first environmental parameter is based on data gathered by a sensor of the wearable device;
receiving a first activity indication that identifies an activity performed by the user;
receiving an indication of a pain level of the user;
correlating the first environmental parameter, the first activity indication, and the indication of the pain level as a training record;
training a pain prediction model using the training record;
receiving a second environmental parameter;
receiving a second activity indication;
applying the pain prediction model to the second environmental parameter and the second activity indication to generate a predicted pain level; and
presenting the predicted pain level to the user.
2. The method of claim 1, wherein the second environmental parameter is a predicted environmental parameter received from a forecasting server.
3. The method of claim 1, wherein:
the first environmental parameter is descriptive of the environment of the user at a first time, and
the indication of the pain level is descriptive of the pain level of the user at a second time that is later than the first time,
whereby the correlating step selects the indication for correlation with the first environmental parameter based on a time difference between the first time and the second time.
4. The method of claim 1, further comprising:
identifying a recommended activity based on application of the pain prediction model to the second environmental parameter and the recommended activity indicating an alternative pain level that is lower than the predicted pain level; and
presenting the recommended activity to the user.
5. The method of claim 1, wherein the second activity indication is a predicted activity generated by application of an activity prediction model.
6. The method of claim 1, further comprising:
receiving accelerometer data from an accelerometer of the wearable device;
applying an activity model to the accelerometer data to generate the first activity indication.
7. The method of claim 1, wherein receiving an indication of a pain level of the user comprises:
receiving a physiological parameter descriptive of the user, wherein the physiological parameter is based on additional data gathered by an additional sensor of the wearable device; and
applying a pain estimation model to the physiological parameter to generate the indication of the pain level.
8. A non-transitory machine-readable medium encoded with instructions for execution by a processor for predicting pain in a user of a wearable device, the medium comprising:
instructions for receiving a first environmental parameter descriptive of an environment of the user, wherein the first environmental parameter is based on data gathered by a sensor of the wearable device;
instructions for receiving a first activity indication that identifies an activity performed by the user;
instructions for receiving an indication of a pain level of the user;
instructions for correlating the first environmental parameter, the first activity indication, and the indication of the pain level as a training record;
instructions for training a pain prediction model using the training record;
instructions for receiving a second environmental parameter;
instructions for receiving a second activity indication;
instructions for applying the pain prediction model to the second environmental parameter and the second activity indication to generate a predicted pain level; and
instructions for presenting the predicted pain level to the user.
9. The non-transitory machine-readable medium of claim 8, wherein the second environmental parameter is a predicted environmental parameter received from a forecasting server.
10. The non-transitory machine-readable medium of claim 8, wherein:
the first environmental parameter is descriptive of the environment of the user at a first time, and
the indication of the pain level is descriptive of the pain level of the user at a second time that is later than the first time,
whereby the instructions for correlating comprise instructions for selecting the indication for correlation with the first environmental parameter based on a time difference between the first time and the second time.
11. The non-transitory machine-readable medium of claim 8, further comprising:
instructions for identifying a recommended activity based on application of the pain prediction model to the second environmental parameter and the recommended activity indicating an alternative pain level that is lower than the predicted pain level; and
instructions for presenting the recommended activity to the user.
12. The non-transitory machine-readable medium of claim 8, wherein the second activity indication is a predicted activity generated by application of an activity prediction model.
13. The non-transitory machine-readable medium of claim 8, further comprising:
instructions for receiving accelerometer data from an accelerometer of the wearable device;
instructions for applying an activity model to the accelerometer data to generate the first activity indication.
14. The non-transitory machine-readable medium of claim 8, wherein the instructions for receiving an indication of a pain level of the user comprise:
instructions for receiving a physiological parameter descriptive of the user, wherein the physiological parameter is based on additional data gathered by an additional sensor of the wearable device; and
instructions for applying a pain estimation model to the physiological parameter to generate the indication of the pain level.
15. A wearable device for predicting pain in a user of a wearable device, the wearable device comprising:
an environmental sensor configured to sense environment data;
an accelerometer configured to sense motion data;
a memory; and
a processor configured to:
determine a first environmental parameter descriptive of an environment of the user based on the environment data,
determine a first activity indication that identifies an activity performed by the user based on the motion data;
receive an indication of a pain level of the user;
correlate the first environmental parameter, the first activity indication, and the indication of the pain level as a training record;
train a pain prediction model using the training record;
receive a second environmental parameter, comprising at least one of: determining the second environmental parameter based on additional environment data from the environmental sensor, and
receiving the second environmental parameter as a forecasted parameter from a remote server;
determine a second activity indication;
apply the pain prediction model to the second environmental parameter and the second activity indication to generate a predicted pain level; and
present the predicted pain level to the user.
PCT/IB2016/051624 2015-03-23 2016-03-23 Environment-based pain prediction wearable WO2016151494A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US201562137186 true 2015-03-23 2015-03-23
US62/137,186 2015-03-23

Publications (1)

Publication Number Publication Date
WO2016151494A1 true true WO2016151494A1 (en) 2016-09-29

Family

ID=55702034

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2016/051624 WO2016151494A1 (en) 2015-03-23 2016-03-23 Environment-based pain prediction wearable

Country Status (1)

Country Link
WO (1) WO2016151494A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030144829A1 (en) * 2002-01-25 2003-07-31 Geatz Michael W. System and method for sensing and evaluating physiological parameters and modeling an adaptable predictive analysis for symptoms management
US20100318424A1 (en) * 2009-06-12 2010-12-16 L2La, Llc System for Correlating Physiological and Environmental Conditions
US20130036080A1 (en) * 2011-08-02 2013-02-07 Alcatel-Lucent Usa Inc. Method And Apparatus For A Predictive Tracking Device
US20140125481A1 (en) * 2012-11-06 2014-05-08 Aliphcom General health and wellness management method and apparatus for a wellness application using data associated with a data-capable band

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None

Similar Documents

Publication Publication Date Title
Lester et al. A practical approach to recognizing physical activities
US20140240122A1 (en) Notifications on a User Device Based on Activity Detected By an Activity Monitoring Device
US7327245B2 (en) Sensing and analysis of ambient contextual signals for discriminating between indoor and outdoor locations
US20100234693A1 (en) Activity monitoring device and method
US20120326873A1 (en) Activity attainment method and apparatus for a wellness application using data from a data-capable band
US20120313776A1 (en) General health and wellness management method and apparatus for a wellness application using data from a data-capable band
US20140085077A1 (en) Sedentary activity management method and apparatus using data from a data-capable band for managing health and wellness
Schwickert et al. Fall detection with body-worn sensors
US8952818B1 (en) Fall detection apparatus with floor and surface elevation learning capabilites
US20140363797A1 (en) Method for providing wellness-related directives to a user
US20080276186A1 (en) Method and system for adapting a user interface of a device
US20080183049A1 (en) Remote management of captured image sequence
US20130234853A1 (en) Comprehensive system and method of universal real-time linking of real objects to a mchine, network, internet, or software service
US20140099614A1 (en) Method for delivering behavior change directives to a user
US20130141235A1 (en) General health and wellness management method and apparatus for a wellness application using data associated with data-capable band
US20150334772A1 (en) Contextual information usage in systems that include accessory devices
US20130002435A1 (en) Sleep management method and apparatus for a wellness application using data from a data-capable band
US20100179452A1 (en) Activity Monitoring Device and Method
US20140280125A1 (en) Method and system to build a time-sensitive profile
Hwang et al. Landmark detection from mobile life log using a modular Bayesian network model
JP2008003655A (en) Information processor, information processing method, and program
US20120246102A1 (en) Adaptive analytical behavioral and health assistant system and related method of use
US20110234406A1 (en) Signature analysis systems and methods
US20090212957A1 (en) Portable monitoring apparatus with over the air programming and sampling volume collection cavity
US20150294595A1 (en) Method for providing wellness-related communications to a user

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16715619

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16715619

Country of ref document: EP

Kind code of ref document: A1